Senior Data Quality Engineer - TV Insight
- Schedule: Full-time
- Travel: None
- Organization: TV-Insight GmbH
- Job type: Permanent
Role Overview
As Senior Data Quality Engineer, you co-own TV-Insight’s data infrastructure, ensuring that all data relevant to measurement, extrapolation, ad targeting, and reporting is accurate, timely, and efficiently processed.
You’ll design, operate, and continuously improve robust and scalable data pipelines, playing a central role in the evolution of the TV-Insight technology ecosystem. Working closely with the Technology Division and data science teams, you’ll help shape a data platform that enables reliable insights and self-service analytics across the organization.
Job Description
OWNERSHIP OF DATA INFRASTRUCTURE AND PIPELINES
You’ll lead the operation, monitoring, and continuous improvement of data pipelines across all data domains, ensuring data availability and correctness in production systems and implementing robust scheduling, monitoring, and alerting for all data workflows.
AUTOMATION & PROCESS OPTIMIZATION
You’ll automate ETL workflows and manual maintenance processes, developing CI/CD-based automation for data pipelines and deployments. You’ll also introduce standardized data quality checks and reconciliation frameworks to keep data operations scalable, efficient, and reliable.
DATA QUALITY MANAGEMENT
You define and enforce data validation rules and performance metrics, proactively identifying, analyzing, and resolving data inconsistencies. You drive continuous improvements in data integrity and completeness, ensuring stakeholders can rely on data for critical decisions.
COLLABORATION & CROSS-TEAM ENABLEMENT
You partner with data scientists and analysts to ensure data reliability for modeling, experimentation, and reporting. Where needed, you contribute to internal documentation and knowledge sharing, helping teams understand and effectively use data assets and pipelines.
STRATEGIC DEVELOPMENT OF DATA ECOSYSTEM
You evolve TV-Insight’s data platform towards self-service capabilities and high scalability. You participate in architecture design for future data products and act as a subject-matter expert for pipeline automation, data observability, and best practices in data engineering.
Qualifications
University degree in Computer Science, Software Engineering, Data Science, Data Engineering, Information Systems, or a related field
5+ years of experience in data engineering, data quality, or pipeline automation
Proven experience with ETL frameworks (e.g., Airflow, Prefect, dbt)
Strong knowledge of SQL, Python, and data warehouse concepts (e.g., BigQuery, Snowflake)
Experience with CI/CD, Docker, and cloud-based deployment (e.g., GCP, AWS)
Solid understanding of data validation, monitoring, and alerting frameworks
Analytical mindset with strong attention to detail
Strong communication skills and a team-oriented attitude
Self-driven, reliable, and able to manage complex data operations independently
Additional Information
For legal reasons, we are obliged to disclose the minimum salary under the collective agreement for this position, which is EUR 2,509 gross per month. Our actual compensation package is market-oriented and therefore significantly above this minimum.
As an employer, we value diversity and support people in developing their potential and strengths, realizing their ideas, and seizing opportunities. We believe passionately that a diverse workforce is central to our success. We welcome applications from all members of society, irrespective of age, skin colour, religion, gender, sexual orientation, or origin.