Data Engineer GCP (m/f/d)
- Full-time
Company Description
T‑Systems is part of the Deutsche Telekom Group, with around 30,000 employees worldwide. We create technology with purpose to generate a positive impact on society. We are looking for curious talent, eager to learn, take on challenges, and contribute ideas that transform our customers’ experience.
We trust people: we offer autonomy, continuous support, and a collaborative environment where you can grow without limits. We are one global team, guided by respect, integrity, and a passion for doing better every day.
Job Description
Project Description:
We are looking for a Data Engineer to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on One Data Entry (ODE), development of the relevant data products on ODE, and operations of those data products on ODE.
Activity description and concrete tasks:
- Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP) including network architectures (Shared VPC, Hub-and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing.
- Data Processing & Transformation: Utilize a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow.
- Core GCP Services Management: Work extensively with services like Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
- Application Implementation: Develop and implement Python applications for various GCP services.
- CI/CD Pipelines: Integrate and manage GitLab Magenta CI/CD pipelines for automating cloud deployment, testing, and configuration of diverse data pipelines.
- Security & Compliance: Implement comprehensive security measures: manage IAM policies, handle secrets with Secret Manager, and enforce identity-aware access policies.
- Data Integration: Handle integration of data sources from CDI, Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
- Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
- AI Solutions: Implement AI solutions using Google’s Vertex AI for building and deploying machine learning models.
- Certification: A GCP Cloud Architect or Data Engineer certification is desired.
Qualifications
Skills Required:
- Google Cloud Platform (GCP) knowledge
- Expertise in Terraform for infrastructure management
- Python for application implementation
- Experience with GitLab CI/CD for automation
- Knowledge of network architectures, security implementations, and management of core GCP services
- Proficiency with data processing tools such as Hive and PySpark, and orchestration tools such as Airflow
- Familiarity with managing and integrating diverse data sources
Additional Information
What do we offer you?
Work environment & flexibility
- International, dynamic and collaborative environment
- T-Social: social initiatives (sports, community, health, ...)
- Hybrid work model (remote/on-site)
- Flexible working hours
Growth & development
- Customized training: access to Coursera to learn whatever you want, whenever you want
- Weekly language classes (English & German)
- International Mentoring Sessions & Experience Days
Compensation & benefits
- Flexible compensation plan (health insurance, meal vouchers, childcare, transport)
- Telemedicine
- Life and accident insurance
- Social fund
Wellbeing & time off
- 26+ working days of vacation per year
- Free access to specialist services (medical, legal, wellness)
- 100% salary coverage during medical leave