Sr. PySpark Developer - Tieto Tech Consulting (m/f/d)

  • Full-time
  • Remote Type: Hybrid
  • Job Area: Application and Product Development
  • Business Unit: Tech Consulting

Company Description

We are seeking an experienced Senior PySpark Developer with 8+ years of expertise in data engineering, big data technologies, and ETL development. The ideal candidate will have strong hands-on experience with PySpark, a deep understanding of distributed data processing, and proven experience with Informatica or SSIS as well as the Cloudera Hadoop ecosystem.

Key Responsibilities

Design and architect scalable, high-performance data pipelines using PySpark.
Lead development and optimization of big data solutions within the Cloudera Hadoop ecosystem (CDH/CDP).
Analyze and understand ETL workflows using Informatica or SSIS.
Handle large-scale data ingestion, transformation, and processing from diverse data sources.
Optimize Spark jobs for performance, memory management, and cost efficiency.
Provide technical leadership, mentor junior developers, and enforce best practices.
Collaborate with cross-functional teams including data architects, analysts, and business stakeholders.
Ensure data quality, governance, and security standards are followed.
Troubleshoot complex production issues and provide long-term solutions.
Drive improvements in data engineering processes, automation, and CI/CD practices.

Job Description

Required Skills & Qualifications

8+ years of experience in data engineering, big data, or related roles.
Strong hands-on expertise in PySpark and Apache Spark.
Extensive experience with Cloudera distribution (CDH/CDP).
Solid ETL development experience using Informatica or Microsoft SSIS.
Advanced proficiency in Python and SQL.
Strong understanding of Hadoop ecosystem tools (HDFS, Hive, Impala, HBase).
Experience with workflow orchestration tools (e.g., Airflow).
Strong knowledge of data warehousing concepts and data modeling.
Experience working in Unix/Linux environments.

Preferred Skills

Experience with cloud platforms (AWS, Azure, or GCP).
Exposure to real-time data processing tools (Kafka, Spark Streaming).
Knowledge of DevOps, CI/CD pipelines, and containerization (Docker/Kubernetes).
Experience with version control tools like Git.


Additional Information

At Tieto, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity.

Diversity, equity and inclusion (tietoevry.com)

Privacy Notice