Data Engineer 1

  • Full-time

Company Description

When you’re one of us, you get to run with the best. For decades, we’ve been helping marketers from the world’s top brands personalize experiences for millions of people with our cutting-edge technology, solutions and services. Epsilon’s best-in-class identity gives brands a clear, privacy-safe view of their customers, which they can use across our suite of digital media, messaging and loyalty solutions. We process 400+ billion consumer actions each day and hold many patents on proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon India is now Great Place to Work-Certified™. Epsilon has also been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Positioned at the core of Publicis Groupe, Epsilon is a global company with more than 8,000 employees around the world. For more information, visit epsilon.com/apac or our LinkedIn page.

Job Description

Organization Objective/Purpose:

This position is in the Engineering team under the Digital Experience organization. We drive the first mile of the customer experience through personalization of offers and content. We are currently on the lookout for a smart, highly driven software engineer.

You will be part of a team focused on building solutions and pipelines using the latest software engineering design principles and tech stacks. You will also be expected to identify, design, and implement improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating continuous integration and deployment processes/pipelines.

The incumbent is also expected to partner with various stakeholders and bring scientific rigor to the design and development of high-quality software.

She/he must also have excellent verbal and written communication skills and be comfortable working in an entrepreneurial, ‘startup’ environment within a larger company.

Brief Description of Role:

  • Develop solutions for Epsilon that deliver high-quality, personalized recommendations to our customers across different channels
  • Work with the Data Science team to ensure seamless integration and support of machine learning models
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
  • Develop end-to-end (Data/Dev/MLOps) pipelines based on an in-depth understanding of cloud platforms, the AI/ML lifecycle, and business problems to ensure solutions are delivered efficiently and sustainably
  • Collaborate with other members of the team to ensure high-quality deliverables
  • Learn and implement the latest design patterns in software engineering

Data Management 

  • Experience with both structured and unstructured data, and with Hadoop, Apache Spark, or similar technologies
  • Good understanding of Data Modeling, Data Warehouse, and Data Catalog concepts and tools
  • Experience with Data Lake architectures, and with combining structured and unstructured data into unified representations
  • Able to identify, join, explore, and examine data from multiple disparate sources and formats
  • Ability to distill large quantities of unstructured or formless data into a form that can be analyzed
  • Ability to deal with data imperfections such as missing values, outliers, inconsistent formatting, etc.

Software Development 

  • Ability to write code in languages such as Python, Node.js, and PySpark, and to write shell scripts on Linux
  • Familiarity with software development methodology such as Agile/Scrum
  • Love to learn new technologies, keep abreast of the latest developments in cloud architecture, and drive your organization to adopt emerging best practices


Qualifications

  • Bachelor’s degree in Engineering or a related field
  • Tech stack: Python, PySpark, microservices, Docker, serverless frameworks
  • Knowledge of building ETL workflows/pipelines
  • Experience with relational and non-relational databases and SQL (NoSQL is a plus)
  • Knowledge of cloud technologies (AWS or Azure)
  • Familiarity with tools such as Airflow and MLflow
  • Familiarity with automated unit/integration test frameworks
  • Knowledge of machine learning algorithms, concepts, and implementation is a plus
  • Good written and spoken communication skills; a team player
  • Strong analytical thinking and the ability to interpret findings

In addition, the candidate should have strong business acumen and interpersonal and communication skills, yet also be able to work independently. He/she should be able to communicate findings, and how the underlying techniques work, in a manner that all stakeholders, both technical and non-technical, will understand.

Additional Information

Conditions of Employment

All job offers are contingent upon successful completion of background checks.

