Data Engineer

  • Full-time

Company Description

Inetum is an agile international digital consulting group. In the post-digital transformation era, our purpose is to enable each of the more than 27,000 people on our team to continuously renew themselves and positively experience their own digital flow. With a presence in 26 countries, we promote flexible career paths, local innovation, and a healthy work-life balance. Inetum has also been recognized as a Top Employer, a certification that endorses our commitment to well-being, professional development, and excellence in talent management.

Commitment to equality

At Inetum, we promote an inclusive and equitable work environment. All candidates will be considered regardless of gender, gender identity, sexual orientation, age, ethnicity, disability, or any other personal condition. Hiring decisions are based solely on skills, competencies, and values aligned with our organizational culture.

Job Description

As part of the Data Engineering team, you will be responsible for the design, development, and operation of data systems operating at petabyte scale. Your focus will be on building real-time data pipelines, streaming analytics, distributed big data platforms, and machine learning infrastructure. You will collaborate closely with engineers, product managers, BI developers, and architects to deliver scalable and robust technical solutions.

Key Responsibilities:

  • Design, develop, and optimize distributed systems and data pipelines for large-scale data processing
  • Ensure scalability, low latency, and fault tolerance in all systems
  • Build and maintain real-time streaming analytics solutions
  • Develop data processing layers using Java and/or Python
  • Implement and manage workflows using Airflow and GitHub
  • Write and optimize complex queries across large datasets
  • Develop and maintain MapReduce jobs
  • Support the implementation and operation of data pipelines and analytical solutions
  • Perform performance tuning on systems handling massive data volumes
  • Build REST API services for data consumption
  • Collaborate in agile development environments

Requirements

  • 6–8 years of experience in Big Data development
  • Proven expertise in data engineering and complex pipeline development
  • Strong proficiency in SQL
  • Experience with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase
  • Experience with cloud platforms: GCP, Azure (AWS is a plus)
  • Familiarity with relational and distributed data stores: Oracle, Cassandra, Druid
  • Experience in RESTful API development
  • Hands-on experience with Airflow and Git
  • Conversational English proficiency

Preferred Qualifications:

  • Experience in retail data environments
  • Knowledge of microservices architecture
  • Exposure to agile methodologies (Scrum, Kanban)

Additional Information

What we offer

  • Continuing education programs and certifications
  • Access to learning and professional development platforms
  • Culture of innovation and collaboration
  • Physical and emotional wellness programs
  • Opportunities for growth in international projects
  • Recognition and rewards for performance
  • Base salary
  • Benefits above the legal minimum
  • Life insurance
  • Major medical insurance
  • Grocery vouchers
  • 100% payroll scheme
