DATA ENGINEER

  • Full-time

Company Description

🚀 Join Inetum – We're Hiring a DATA ENGINEER! 🚀

At Inetum, a leading international digital consultancy, we empower 27,000 professionals across 27 countries to shape their careers, foster innovation, and achieve work-life balance. Proudly certified as a Top Employer Europe 2024, we’re passionate about creating positive and impactful digital solutions.

Job Description

Hybrid work mode in Madrid. This role may require occasional travel within Italy.

The main responsibilities are:

  • Migrate the existing Hadoop infrastructure to cloud infrastructure on Kubernetes Engine, COS, Spark as a service, and Airflow as a service.
  • Implement data transformation and quality to ensure data consistency and accuracy. Utilize programming languages such as Scala and SQL and tools like Spark for data transformation and enrichment operations.
  • Set up CI/CD pipelines to automate deployments, unit testing and development management.
  • Write and conduct unit and validation tests to ensure the accuracy and integrity of the code developed.
  • Automate data pipelines and streamline data ingestion through the implementation of different orchestrators and scheduling processes (Airflow as a Service mainly).
  • Write technical documentation (specifications, operational documents) to ensure knowledge retention.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Foster a culture of continuous learning and improvement within the team.
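To illustrate the data transformation and quality responsibilities above, here is a minimal sketch of the kind of consistency and deduplication checks involved. In production this logic would live in Spark on Scala as the posting describes; plain Python dicts stand in for Spark rows here, and the field names ("id", "amount") and validation rules are hypothetical examples.

```python
# Minimal data-quality sketch. Field names and rules are
# illustrative stand-ins for a real Spark/Scala pipeline.

def clean_records(records):
    """Drop malformed rows and deduplicate by id (keeping the first seen)."""
    seen_ids = set()
    cleaned = []
    for record in records:
        # Consistency check: required fields must be present and correctly typed.
        if record.get("id") is None:
            continue
        if not isinstance(record.get("amount"), (int, float)):
            continue
        # Accuracy check: reject duplicate ids.
        if record["id"] in seen_ids:
            continue
        seen_ids.add(record["id"])
        cleaned.append(record)
    return cleaned

raw = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 10.0},   # duplicate -> dropped
    {"id": 2, "amount": "bad"},  # wrong type -> dropped
    {"id": 3, "amount": 5},
]
print(clean_records(raw))  # → [{'id': 1, 'amount': 10.0}, {'id': 3, 'amount': 5}]
```

In an actual Spark job, the same rules would typically be expressed as DataFrame filters and a `dropDuplicates` over the key column, scheduled by an Airflow DAG.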

Qualifications

Good knowledge of:

  • Spark on Scala
  • CI/CD tools (GitLab, Jenkins…)
  • HDFS and structured databases (SQL)
  • Apache Airflow
  • S3 storage/COS and parquet (and ORC) format
  • SQL and NoSQL databases
  • Hadoop

Full understanding of:

  • Oozie
  • Shell scripting
  • The Software Development Lifecycle (SDLC)
  • Agile principles and ceremonies
  • Prompt design for Gen AI tools across IT domains (e.g., development, testing, data generation, documentation) during the development stage

Some knowledge of:

  • Containerization with Kubernetes
  • Dremio as a data virtualization tool

Optional / as a plus:

  • Elasticsearch and Kibana
  • Streaming processing (Kafka, event streams…)
  • HVault

Additional Information

BUSINESS AND TRANSVERSAL SKILLS

  • Knowledge of banking industry and processes
  • Problem-solving and decision-making skills
  • Business / IT relationship (including IT OPS)
  • Ability to understand, explain and support change
  • Ability to deliver / results-driven
  • Ability to collaborate / Teamwork with data squads and business teams


LANGUAGES

  • English level B2 or higher