Data Engineer

  • Full-time

Company Description

BETSOL is a cloud-first digital transformation and data management company offering products and IT services to enterprises in over 40 countries. The BETSOL team holds several engineering patents, has been recognized with industry awards, and maintains a net promoter score twice the industry average.

BETSOL’s open source backup and recovery product line, Zmanda (Zmanda.com), delivers up to 50% savings in total cost of ownership (TCO) and best-in-class performance.

BETSOL Global IT Services (BETSOL.com) builds and supports end-to-end enterprise solutions, reducing time-to-market for its customers.

BETSOL offices are set against the vibrant backdrops of Broomfield, Colorado and Bangalore, India.

We take pride in being an employee-centric organization, offering comprehensive health insurance, competitive salaries, volunteer programs, and scholarship opportunities. Office amenities include a fitness center, cafe, and recreational facilities.

Learn more at betsol.com

Job Description

We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and ingestion frameworks that power analytics and business intelligence across the organization. This role will focus on delivering high-value proofs of concept (POCs) that stabilize and build a strong foundation for an enterprise data platform, including developing custom ingestion, optimizing data workflows, and ensuring reliable data delivery into Snowflake or other cloud-based platforms. The Data Engineer will collaborate with analytics, product, and engineering teams to enable data-driven decision-making through robust, efficient, and secure data infrastructure.

Qualifications

• 5+ years of experience in data engineering or related roles.

• Successfully delivered multiple high-value POC projects.

• Proficiency in Python for building data pipelines and automation scripts.

• Hands-on experience with dbt for data transformation and modeling.

• Strong expertise in Snowflake or similar cloud data warehouse platforms.

• Strong expertise building federated data lakes on AWS, Azure, or GCP cloud storage.

• Experience coding custom ingestion pipelines to load data from diverse sources.

• Familiarity with modern ETL/ELT tools and best practices.

• Understanding of cloud-based architecture and data security principles.

• Experience working in Agile development environments.

• Strong communication skills for cross-functional collaboration.

• Nice-to-have: Experience with modern data observability and data catalog tools.

Additional Information

The Competencies You Bring

• Technical Leadership: Ability to guide best practices for data engineering and pipeline design.

• Problem Solving: Skilled at diagnosing and resolving complex data workflow issues.

• Collaboration: Strong partnership skills across Analytics, Product, and Engineering teams.

• Innovation: Ability to identify and implement modern tools and techniques for data reliability and scalability.

• Attention to Detail: Ensuring data accuracy and integrity throughout ingestion and transformation processes.
