Data Engineer - REF4568Y

  • Full-time
  • Company: Deutsche Telekom ITTC Hungary Kft.

Company Description

As Hungary’s most attractive employer in 2025 (according to Randstad’s representative survey), Deutsche Telekom IT Solutions is a subsidiary of the Deutsche Telekom Group. With more than 5,300 employees, the company provides a wide portfolio of IT and telecommunications services to hundreds of large corporate customers in Germany and other European countries.

DT-ITS received the Best in Educational Cooperation award from HIPA in 2019 and was acknowledged as the Most Ethical Multinational Company in the same year. The company continuously develops its four sites in Budapest, Debrecen, Pécs and Szeged and is looking for skilled IT professionals to join its team.


Job Description

This position offers the opportunity to contribute to innovative data solutions in an agile and highly professional environment. You will work closely with an international team and play a key role in the further development of modern Data & Analytics products. Your expertise will support the successful transformation towards cloud-based systems and efficient data processes.
To strengthen our Value Stream DFS, we are looking for a Data Engineer IT in the area of Data & Analytics. As a Data Engineer, you will develop and maintain software solutions for complex products and services across the entire development process, using modern technologies for Data Analytics in Public Clouds.

Main Tasks

  • Responsibility for the (further) development of software in the Big Data environment
  • Development and optimization of data pipelines (batch and streaming processes)
  • Design and implementation of scalable, production-ready data products
  • Optimization of PySpark code for high performance in distributed data processing
  • Ensuring code quality through clean, tested code and promoting CI/CD practices
  • Working in agile teams (Scrum, Kanban) with continuous improvement of processes and tools
  • Applying DevOps principles for seamless integration of development and operations
  • Using and further developing cloud-based solutions with a focus on Google Cloud Platform (GCP)

Qualifications

  • Bachelor’s or Master’s in Computer Science, Data Science, or related field
  • Several years of Big Data engineering experience
  • Strong SQL / T-SQL skills (MS SQL Server a plus)
  • Expertise with Hadoop ecosystem (PySpark, Airflow, HDFS, Elasticsearch)
  • Experience with ETL (SSIS), data modeling, warehousing, and BI (SSAS)
  • Proficiency in Python; C# and BigQuery are advantageous
  • Solid knowledge of CI/CD, automation, and agile/DevOps practices
  • Hands-on experience with cloud platforms, especially GCP
  • Proven track record in distributed data processing and scalable architectures
  • Strong problem-solving, adaptability, and communication skills
  • Fluent English (German is a plus)

Additional Information

  • Please note that remote working is available only within Hungary due to European taxation regulations.
