2026-7857 Data Engineer

  • Full-time

Company Description

AGSI was incorporated in April 2016. We are committed to supporting the goals of Arch divisions through exceptional service delivery. We pride ourselves on maintaining flexibility and responsiveness to adapt to business unit and industry demands while focusing on sound project management. We are dedicated to growing and developing our employees as we build strong teams with strategic leadership.

Job Description

Schedule: Mid Shift

The Position:

This position develops, implements, and maintains software solutions that enable business operations to realize company goals and objectives. The incumbent performs analysis, design, coding, debugging, testing, and support of software application systems, and may be assigned to develop new applications, enhance existing applications, and/or provide production support. The incumbent works independently on projects of moderate scope or complexity and receives detailed instructions on new and/or more complex assignments.

 

Job Responsibilities:

  • Design and develop data pipelines using Apache Airflow to orchestrate complex workflows and ensure reliable data delivery (a minimal orchestration sketch follows this list)
  • Build and maintain transformation logic using dbt Core, support the infrastructure it requires, and implement best practices for modular, tested, and documented analytics code
  • Develop and optimize data models in Snowflake, leveraging cloud data warehouse capabilities for performance and cost efficiency
  • Write complex SQL for data transformation, quality validation, and business logic implementation
  • Collaborate closely with the infrastructure team to ensure the data platform remains modern, well-monitored, and fully optimized, with industry best practices consistently applied
  • Collaborate with analytics and business teams to understand requirements and translate them into scalable data solutions
  • Implement data quality checks, monitoring, and alerting to ensure data reliability
  • Document data pipelines, models, and processes for knowledge sharing
  • Optimize query performance and manage Snowflake resource utilization
  • Participate in code reviews and contribute to data engineering best practices
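
To give a concrete flavor of the orchestration work described above, the following is a minimal sketch of an Airflow DAG that runs a dbt Core project and then its test suite as a quality gate. The DAG id, schedule, and project path are illustrative assumptions, not details of this role's actual platform.

    # Minimal sketch (assumptions noted): an Airflow DAG that runs a
    # hypothetical dbt Core project, then its tests as a quality gate.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    DBT_DIR = "/opt/dbt/analytics"  # assumed dbt project location

    with DAG(
        dag_id="daily_analytics_refresh",   # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",               # daily at 06:00
        catchup=False,
    ) as dag:
        # dbt resolves model dependency order itself; Airflow supplies
        # scheduling, retries, and alerting around it.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
        )

        # dbt's schema/data tests act as a quality gate: a non-zero exit
        # code fails the task and blocks downstream consumers.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
        )

        dbt_run >> dbt_test

In a production setup the dbt invocation might instead go through a container operator or a dedicated integration; the BashOperator form above is simply the smallest runnable illustration.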

 

Qualifications

Required Skills:

  • 3+ years of experience in data engineering or related role
  • Strong proficiency in SQL with experience writing complex queries, CTEs, and window functions (a short example follows this list)
  • Proficiency in Python for data engineering tasks, scripting, and automation
  • Hands-on experience with dbt (Core or Cloud) for data transformation and modeling
  • Experience orchestrating workflows with Apache Airflow or similar tools
  • Working knowledge of Snowflake or similar cloud data warehouses (e.g., Amazon Redshift, Google BigQuery)
  • Understanding of infrastructure requirements for data engineering, including deployment strategies, environment configuration, and resource management
  • Understanding of dimensional modeling and data warehouse design patterns
  • Experience with version control (Git) and CI/CD practices
  • Strong problem-solving skills and attention to data quality
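
As a rough benchmark for the SQL proficiency called for above, here is a hedged sketch of a CTE-plus-window-function query executed against Snowflake from Python. The table and column names (raw.orders, customer_id, order_ts, amount) and the connection handling are hypothetical.

    # Sketch: a CTE with a window function, run against Snowflake from
    # Python. Table/column names and the connection are hypothetical.
    import snowflake.connector

    QUERY = """
    WITH ranked_orders AS (
        SELECT
            customer_id,
            order_ts,
            amount,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY order_ts DESC
            ) AS recency_rank
        FROM raw.orders
    )
    SELECT customer_id, order_ts, amount
    FROM ranked_orders
    WHERE recency_rank = 1
    """

    def latest_order_per_customer(conn):
        # conn is expected from snowflake.connector.connect(...); the
        # query keeps only each customer's most recent order.
        cur = conn.cursor()
        try:
            cur.execute(QUERY)
            return cur.fetchall()
        finally:
            cur.close()

In practice a transformation like this would more likely live in a dbt model than in application code; the Python wrapper here just keeps the example self-contained.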

 

Desired Skills:

  • Experience with data replication tools such as Qlik Replicate, Fivetran, AWS DMS, or similar CDC solutions
  • Experience setting up and managing infrastructure for dbt Core, including deployment automation, testing frameworks, and orchestration integration
  • Knowledge of real-time data streaming and event-driven architectures
  • Knowledge of containerization (Docker) and infrastructure as code (Terraform, CloudFormation)
  • Experience with cloud platforms (AWS, Azure, GCP)
  • Knowledge of data governance and security best practices
  • Familiarity with DataOps practices and testing frameworks
  • Understanding of software engineering principles and agile methodologies

Additional Information

  • Required knowledge and skills would typically be acquired through a bachelor's degree in computer science, business, or a related field