Technical Data Architect (DWH/Big Data)

  • Contract type: Fixed-term contract

Company Description

We are entrepreneurs in disruptive technology. At Devoteam, we deliver innovative technology consulting for business. As Digital Transformakers, we are 7,000+ professionals across EMEA dedicated to ensuring our clients win their digital battle. We improve business performance by making companies truly digital. We advise our clients and build IT infrastructure for digital, making sure people are along for the ride.

At Devoteam Middle East, we are the region's leading consulting firm, delivering innovative Business & Technology Consulting & Solutions. Our 14 years in the region are anchored in the technologies that enable our clients' businesses to flourish. Accordingly, our focus is to help our clients win the digital battle with solutions adapted to their business challenges and careful consideration of the impact on their systems and structures.

To learn more about us, please visit: www.devoteam.com

Job Description

You will be responsible for designing and optimizing big data and data warehouse architectures, as well as optimizing data flows and pipelines for cross-functional teams. You are a technical guru when it comes to selecting the right tools for implementing data ingestion, processing, and storage. Security, performance, scalability, availability, accessibility, and maintainability are your top priorities when designing data solutions. You have deep, broad, hands-on experience with technologies across the Hadoop ecosystem, NoSQL databases, RDBMSs, and data ingestion and processing.

Qualifications

  • 9-12 years of experience in data warehousing and big data projects.

  • Deep and broad experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, HBase, Impala, Kudu, Solr, etc.

  • Hands-on experience with multiple NoSQL databases such as Cassandra, MongoDB, and Neo4j, as well as Elasticsearch and the ELK Stack.

  • Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc.

  • Experience with real-time messaging platforms such as Kafka, Kinesis, etc.

  • Advanced working knowledge of SQL, including query authoring, and experience with relational databases, as well as working familiarity with a variety of databases, including distributed relational databases such as SingleStore and Vitess.

  • Experience building and optimizing big data pipelines.

  • Strong analytical skills for working with unstructured datasets.

  • Experience with object stores such as MinIO and Ceph.

  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.

  • Proven track record of building highly available, always-on data platforms.

  • Proficiency in Linux shell scripting.

  • Programming languages: Python, Java, Scala, etc.

  • Fluency in English and Arabic.

Additional Information

Certifications

  • Cloudera Certified Data Engineer (CCP)

  • IBM Certified Data Engineer – Big Data

  • IBM Certified Data Architect - Big Data

  • MCSA: SQL Server 2012/2014

  • AWS Certified Big Data - Specialty