Tech Lead Data – MS Fabric
- Full-time
- Job Category: Consulting
- Type of contract: CDI (permanent contract)
- Name on Job Ad: SQLI
Company Description
We Elevate. Digitally.
SQLI, a European leader in customer experience and digital transformation, helps major international brands create value through technology and digital innovation. From strategy to the deployment of our clients' technological assets, we design and build robust, high-performance architectures and engaging experiences by combining cutting-edge technologies, proven methodologies, and deep technical expertise.
To strengthen our teams (2,200 people across 12 countries), we are recruiting passionate individuals who share our values: Creative Spirit, Commitment, Forward Thinking.
Job Description
As a Data Tech Lead, you will be the technical authority for the company’s analytical data foundation built on Microsoft Fabric.
Your primary mission is to design and enforce high-quality dimensional models following Kimball methodology, ensuring analytical consistency, scalability, and performance across the platform.
You will lead the implementation of robust SQL / T-SQL transformation layers, establish strong CI/CD practices using Azure DevOps, and guarantee optimal Microsoft Fabric capacity utilization through precise workload governance and cost optimization.
You will act as the reference expert for data modeling standards, transformation engineering, and Fabric platform efficiency.
Highly hands-on, you will work on a major consolidation project involving sales and operational data to create a scalable, reliable, and modern data foundation.
You will define technical architecture, lead engineering best practices, and collaborate closely with data source owners. Strong communication and synthesis skills are essential, as you will regularly provide clear, structured updates to management.
Your key responsibilities:
Dimensional Modeling Leadership:
- Design and enforce enterprise-wide dimensional modeling standards based on Kimball methodology.
- Build and validate star schemas, fact tables, conformed dimensions, and Slowly Changing Dimensions (SCD).
- Ensure analytical consistency across domains (sales, operations, etc.).
- Guarantee semantic clarity and business-aligned modeling structures.
- Conduct systematic model reviews to ensure scalability, maintainability, and performance.
- Act as the internal authority on data warehouse best practices.
Advanced SQL / T-SQL Engineering:
- Develop high-performance SQL / T-SQL transformations on large-scale datasets.
- Optimize complex queries for computing efficiency and scalability.
- Structure transformation layers with clarity, modularity, and maintainability.
- Implement robust data quality checks and reconciliation logic.
- Establish SQL coding standards and performance benchmarking practices.
Python Development (Working Knowledge):
- Create and maintain a toolbox of Python scripts available to IT stakeholders to accelerate data collection.
- Guide and support IT stakeholders in using the toolbox.
Collaboration, Communication & Project Steering:
- Work closely with data source owners (IT, business teams, external partners) to understand systems and secure integrations.
- Clarify requirements, challenge assumptions, and structure technical discussions.
- Provide concise progress summaries, risks, and key decisions to management.
- Coordinate with Data Engineers, Data Analysts, and Business Analysts.
Transformation Framework & dbt-core Discipline:
- Design and maintain modular transformation workflows.
- Leverage dbt-core where applicable for version-controlled, testable transformations.
- Define naming conventions, dependency management, and documentation standards.
Microsoft Fabric Platform & Capacity Governance:
- Design and operate Microsoft Fabric Warehouse / Lakehouse environments.
- Define a capacity sizing strategy based on workload characteristics.
- Monitor and optimize Fabric capacity utilization.
- Manage workload distribution across capacities and workspaces.
- Ensure cost-performance balance through proactive pricing governance.
- Implement observability mechanisms to track compute and storage consumption.
CI/CD & Azure DevOps Leadership:
- Design and maintain CI/CD pipelines for data artifacts (models, SQL objects, semantic layers).
- Define branching strategies and release management processes.
- Manage artifact lifecycle across development, test, and production environments.
- Automate deployment processes and enforce quality gates.
- Ensure traceability and reproducibility of platform changes.
Technical Leadership & Engineering Governance:
- Mentor Data Engineers on modeling rigor and SQL optimization techniques.
- Lead architecture and code reviews.
- Define and maintain documentation standards.
- Promote a culture of engineering excellence and continuous improvement.
- Provide technical decision rationales to stakeholders when required.
Qualifications
Mandatory:
- 7+ years of experience in Data Engineering / Data Warehousing.
- Proven expert-level mastery of dimensional modeling using Kimball methodology (star schemas, conformed dimensions, SCD types, fact design).
- Strong advanced SQL / T-SQL skills, including performance tuning on large-scale datasets.
- Hands-on experience designing analytical warehouses on Microsoft Fabric (Warehouse / Lakehouse).
- Deep understanding of Microsoft Fabric Capacity sizing, workload management, and pricing mechanisms.
- Strong practical experience with Azure DevOps, including:
  - CI/CD pipeline design.
  - Artifact management.
  - Branching strategies.
  - Automated deployments.
- Demonstrated ability to structure and govern transformation layers with engineering rigor.
- Ability to work autonomously on complex analytical platform initiatives.
Strongly Valued:
- Experience with dbt-core for modular and test-driven transformations.
- Experience implementing cost optimization and FinOps practices on cloud data platforms.
- Experience defining enterprise data modeling standards across multiple domains.
- Exposure to performance benchmarking and capacity planning methodologies.
- Experience leading model and code reviews within data engineering teams.
Profile & Soft Skills:
- Highly analytical and detail-oriented.
- Strong engineering discipline and documentation habits.
- Ability to articulate technical trade-offs clearly.
- Ownership mindset and accountability for platform performance and cost.
- Comfortable acting as the technical reference in a specialized domain.
Additional Information
Why join our team?
- A dynamic and innovative environment, within a team passionate about data technologies.
- The opportunity to work on large-scale projects at the forefront of cloud and Big Data technologies.
- Continuous learning and career growth opportunities to develop your skills.
- A flexible work environment with attractive benefits.
- You can choose your preferred location: Casablanca or Rabat.