Apache Airflow Data Engineer

  • Full-time

Company Description

Merkle is a leading data-driven, technology-enabled, global performance marketing agency that specializes in the delivery of unique, personalized customer experiences across platforms and devices. For more than 30 years, Fortune 1000 companies and leading nonprofit organizations have partnered with Merkle to maximize the value of their customer portfolios. The agency's heritage in data, technology, and analytics forms the foundation for its unmatched skills in understanding consumer insights that drive people-based marketing strategies. Its combined strengths in performance media, customer experience, customer relationship management, loyalty, and enterprise marketing technology drive improved marketing results and competitive advantage. With 9,600 employees, Merkle is headquartered in Columbia, Maryland, with 24 additional offices in the US and 29 offices in Europe and APAC. In 2016, the agency joined the Dentsu Aegis Network.

Job Description

The Data Engineer is responsible for designing, implementing, deploying, and supporting various data management technologies and architectures. In partnership with business leaders, key stakeholders, and cross-functional project teams, the Data Engineer will be an active contributor in a collaborative team structure and will have the opportunity to accelerate delivery and improve the quality of HelloWorld products, supporting greater operational excellence, an improved client experience, and other strategic objectives.

Responsibilities:

  • Perform data loads and optimize data for extraction and reporting use
  • Design and implement ETL jobs and transformations to populate a data warehouse
  • Maintain complex databases by performing appropriate database management functions (e.g., maintaining space availability, rebuilding indexes, cleaning up files, and running utilities to check database integrity) to ensure optimum capacity and application performance
  • Monitor, report, and analyze usage trends and statistical output to maintain quality control and high-performance data retrieval from databases and other data stores

Qualifications

  • 2–5+ years of experience working in technology
  • Apache Airflow experience is a must
  • Experience with database management systems, schema design, query optimization
  • Experience using Python to build and refine data pipelines is required
  • Experience designing tables and writing queries in a SQL environment
  • BS degree in Computer Science or Information Systems, or equivalent experience
  • Excellent debugging, problem-solving, and testing skills
  • Experience with data warehouse concepts and ETL tools such as Informatica, Pentaho, and Apache Airflow
  • Experience using reporting tools such as Power BI and Qlik
  • Good interpersonal and management skills; capable of working both individually and as part of a team
  • Understanding of cloud technologies such as AWS
  • Awareness of Apache Hadoop, HDFS, Hive

Additional Information

Merkle fosters a diverse environment that encourages original thinking about our business and empowers us to communicate with a global world of customers. We embrace differences of opinion and diversity of thought as they help us challenge and refine our solutions. Merkle, as a best-in-class marketing agency, welcomes big ideas, and believes they can come from anywhere.

All your information will be kept confidential according to EEO guidelines.

FLSA Status: Exempt