Senior Data Engineer (Interior Design)

  • Full-time

Company Description

We are looking for a Senior Data Engineer to join our team and work on an innovative interior decor visualization platform.

By joining Stellar, you can become part of a team of over 300 experts in the AdTech domain. What is Stellar? It’s a community, a network, and a dedicated business unit within Sigma Software that specializes in advertising technology solutions. Over the years, we’ve had the opportunity to work with some of the largest AdTech companies in the world and have helped many AdTech startups grow from ideas into thriving businesses. Since 2008, we’ve been working with advertising technology businesses to help them envision, build, and support their technology.

PROJECT

The project is an advanced interior decor visualization platform that allows customers to preview furniture and decor items in their home environment before purchasing. Imagine being in a store and instantly seeing how a sofa, a carpet, or another item would fit into your interior design — this solution makes it possible. The system integrates high-performance data pipelines, machine learning models, and visualization tools to deliver a seamless user experience.

Job Description

  • Design, build, and optimize high-volume, high-performance ELT pipelines for centralized data warehousing
  • Collaborate with Product Managers, ML Engineers, Data Scientists, and DevOps to define and enforce a reliable, scalable, and secure data platform architecture
  • Ensure adherence to data warehousing standards, data quality best practices, and metadata management processes
  • Take ownership of key data warehouse components and drive improvements in performance and reliability
  • Conduct architecture reviews and participate in system design discussions to provide technical leadership
  • Mentor and guide junior engineers, fostering a culture of quality and continuous learning

Qualifications

  • 5+ years of professional experience in data engineering, building scalable data systems and pipelines
  • Expert proficiency in SQL
  • Strong programming skills in Python
  • Proven experience in designing, building, and managing large-scale data lakes and warehouses
  • Solid computer science fundamentals (data structures, algorithms, distributed systems)
  • Deep understanding of distributed system architecture with a focus on data availability, reliability, and performance
  • Upper-Intermediate or higher English level

WOULD BE A PLUS

  • Experience with cloud platforms such as AWS, GCP, or Azure
  • Knowledge of Spark, Kafka, Airflow, or dbt
  • Familiarity with BI tools like Tableau or Power BI
  • Understanding of CI/CD practices for data pipelines
  • Exposure to machine learning model deployment and monitoring

Additional Information

PERSONAL PROFILE

  • Analytical thinker with strong problem-solving skills
  • Detail-oriented and committed to delivering high-quality results
  • Collaborative team player with excellent communication skills
  • Adaptable to changing priorities and requirements
  • Proactive and self-motivated with a leadership mindset