Data Engineer (AWS)

  • Full-time

Company Description

We Dream. We Do. We Deliver.

As a full-service, data-driven customer experience transformation company, we partner with the Top 500 companies in Europe. Merkle Bulgaria was created out of a merger between LiveArea and Isobar - two leading full-service digital agencies.

Our 300+ digital enthusiasts are innovating the way brands are built by providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, which shares with us a network of over 66,000 passionate individuals in 146 countries.

Job Description

You will report to the Data Engineering Manager and will work with clients from multiple sectors, understand their specific requirements, collaborate with the full range of services provided by the wider Merkle network, and design technical solutions. You will typically work with a few clients at different stages of maturity, giving you variety in your work and the opportunity to pick up new skills.

  • Use CI/CD tools to facilitate deployment of code to staging and production environments.
  • Contribute to the architecture of end-to-end solutions for our customers on AWS, Azure and other cloud platforms.
  • Maintain Git repositories using the Gitflow workflow.
  • Collaborate on feature deliverables to meet milestones and quality expectations.
  • Communicate with partners, vendors and technology subject matter experts.
  • Document implemented logic in a structured manner using Confluence; plan your activities using Agile methodology in Jira.
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs, such as optimizing existing data delivery or re-designing infrastructure for greater scalability.

Qualifications

Essential:

  • Implementation of Data Lake patterns in AWS
  • Understanding of Data Lake partitioning policies and role-based access controls
  • An understanding of data modelling, data structures, databases, and data ingestion and transformation
  • Cloud engineering and data engineering skills
  • Experience with Python, Spark, Scala or similar technologies
  • DevOps practices such as Continuous Integration, automation and Infrastructure as Code, including Terraform deployments
  • CI/CD experience within an ETL / data transformation environment
  • Lambda architecture
  • S3 data processing (an illustrative sketch follows this list)
  • GitHub, including GitHub Actions for deployment across three environments integrated with the client website
  • CloudWatch monitoring skills
  • Docker for local development
  • SQL database skills
  • Teamwork, both internally and with external clients
  • Experience writing clean, organized code
  • Proficiency working in an Agile environment
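
To give a flavour of the day-to-day work, below is a minimal, illustrative PySpark sketch of reading partition-pruned data from an S3-backed Data Lake and writing it back partitioned for downstream consumers. The bucket, paths and column names are hypothetical, and the exact runtime (Glue, EMR or similar) will vary by client.

```python
# Illustrative sketch only - bucket names, paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("datalake-example").getOrCreate()

# Read raw events from the landing zone, relying on Hive-style partitions
# (year=/month=/day=) so Spark only scans the partitions it needs.
events = (
    spark.read.parquet("s3://example-datalake/raw/events/")
    .where((F.col("year") == 2024) & (F.col("month") == 1))
)

# Light transformation: keep valid records and derive a date column.
cleaned = (
    events
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_timestamp"))
)

# Write to the curated zone, partitioned by event_date for downstream queries.
(
    cleaned.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-datalake/curated/events/")
)
```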

 

Beneficial:

  • SNS -> SQS -> Lambda transformations (see the sketch after this list)
  • Event-driven message processing

  • AWS certification, or a demonstrable path towards certification
  • 2+ years of experience with Amazon Redshift / Amazon Aurora
  • 2+ years of experience with ElastiCache / Redis
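
As an illustration of the SNS -> SQS -> Lambda pattern and event-driven message processing mentioned above, here is a minimal Python sketch of an SQS-triggered Lambda handler that unwraps SNS envelopes. The queue wiring and payload shape are assumptions; with SQS raw message delivery enabled, the unwrapping step would look slightly different.

```python
import json

# Minimal sketch of an SQS-triggered Lambda consuming messages published via SNS.
# Assumes the SQS queue is subscribed to an SNS topic without raw message delivery,
# so each SQS record body contains an SNS envelope with a "Message" field.
def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        envelope = json.loads(record["body"])      # SQS body = SNS envelope
        message = json.loads(envelope["Message"])  # the original published payload (assumed JSON)

        # Hypothetical transformation step: route by event type.
        event_type = message.get("type", "unknown")
        print(f"Processing {event_type} event: {message}")

    # Returning normally lets Lambda delete the batch from the queue.
    return {"processed": len(records)}
```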

Additional Information

 

Benefits:

⛺ 5 weeks of vacation + 3 wellness days

❤️ 2 Volunteering days to share the kindness of your heart with others

⏰ Flexible working hours and home office

🎯 Covered certifications in Salesforce, Adobe and Microsoft

🎓 Full access to Dentsu Academy and on-site learning sessions

🍹 Team events: company parties, monthly breakfasts, and pub quizzes

🥪 Snacks and drinks at the office

💸 Referral bonus programme

💻 Laptop and equipment

 

#LI-Hybrid

Merkle is an equal opportunity employer. We do not discriminate based on sex, gender identity, race, colour, national origin, religion, sexual orientation, disability or any other protected basis, because we believe the best people and ideas come from all walks of life. We aspire to foster a community in which diversity is valued in both our employees and our ideas.

 
