Senior Data Engineer

  • Full-time
  • Department: Technology

Company Description

Who are we?
Become a Zider and join this amazing company that is on top of the e-commerce game! Join a company that is not only growing but having fun while doing it. We are a human-centric organization with huge growth plans and a clear purpose: to help more and more people with little or no digital experience start their online business (e-commerce), move their offline business online, or grow their offline business even further through an online presence.

Where are we coming from and where are we going?

Zid has seen tremendous growth over the last 5 years, going from 5 people to 300+ at present and counting. Our revenue has been increasing more than threefold year on year, and we have had great success. We are an e-commerce SaaS platform, a fintech startup, and a logistics and shipping consolidator, and we understand and believe that technology and business go hand in hand.

Job Description

Overview
We are seeking an experienced Senior Data Engineer to join our growing team and help design and implement scalable and highly available analytics solutions for our Software as a Service (SaaS) e-commerce platform. The successful candidate will play a critical role in driving the development and maintenance of our data infrastructure, enabling data-driven decision-making, and powering analytics across sales, customer behavior, and inventory management.

Responsibilities:

  • Design, develop, and maintain data architecture that supports scalable and highly available analytics requirements for the e-commerce platform.
  • Implement efficient data ingestion and processing pipelines from multiple sources, such as user interactions, transactional data, and inventory updates.
  • Leverage cloud infrastructure to ensure scalability, high availability, and cost-effectiveness of the data platform.
  • Design and implement data models that accommodate the necessary entities, relationships, and attributes while considering scalability and high availability.
  • Utilize data transformation tools such as dbt to build and maintain data transformation pipelines that prepare ingested data for analytics.
  • Ensure data quality and consistency by implementing robust data validation and monitoring processes.
  • Work with data warehouse and data lake technologies to create scalable and highly available data storage solutions.
  • Design and implement APIs that provide secure, efficient, scalable, and highly available access to analytics data for external applications or services.
  • Work closely with data analysts, data scientists, and other stakeholders to understand and address their data requirements and challenges.
  • Stay current with industry best practices and emerging technologies in data engineering, analytics, and infrastructure.

Qualifications


  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, with a focus on developing scalable and highly available data solutions.
  • Strong expertise in SQL and database design.
  • Hands-on experience with cloud infrastructure, such as AWS, Azure, or GCP.
  • Proficiency in ETL/ELT design and implementation, using tools like Airbyte, Airflow, Prefect, Spark, Kafka, etc.
  • Fluency in at least one interpreted language, such as Python, and one compiled language, such as Scala, C++, or Rust, as well as familiarity with version control systems like Git.
  • Comfortable working in Unix environments.
  • Proficiency in data transformation tools, such as dbt, and data validation frameworks like Great Expectations or Monte Carlo.
  • Familiarity with containerization (Docker) and container orchestration (Kubernetes) technologies.
  • Experience with data warehouse and data lake technologies, such as Snowflake, ClickHouse, Redshift, BigQuery, or the Hadoop ecosystem.
  • Strong understanding of API design and implementation, with a focus on security, efficiency, and high availability.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
  • Knowledge of message brokers like Redis, ZeroMQ, RabbitMQ, etc. is a plus.
  • Knowledge of data management and data governance frameworks, such as DAMA, is a plus.
  • Knowledge of data privacy best practices and related regulations, such as GDPR, CCPA, or the new Saudi PDPL, is a plus.

Additional Information

What are we offering?

  • Competitive Salary
  • 21 days of holiday + additional days given regularly
  • Training (we kicked off learning internally and recently completed an AWS Architecting course)
  • A clearly defined career path!
  • ZEA – Zid Entertainment – Fun Thursdays!