Data Architect

  • Full-time

Company Description

About Eurofins

Eurofins Scientific is an international life sciences company, providing a unique range of analytical testing services to clients across multiple industries, to make life and our environment safer, healthier and more sustainable. From the food you eat, to the water you drink, to the medicines you rely on, Eurofins works with the biggest companies in the world to ensure the products they supply are safe, their ingredients are authentic and their labelling is accurate. Eurofins believes it is a global leader in food, environmental, pharmaceutical and cosmetic product testing and in agroscience CRO services. It is also one of the global independent market leaders in certain testing and laboratory services for genomics, discovery pharmacology, forensics, CDMO, advanced material sciences and in the support of clinical studies.

In just over 30 years, Eurofins has grown from one laboratory in Nantes, France to over 50,000 staff across a network of more than 900 independent companies in over 50 countries, operating more than 800 laboratories. Performing over 400 million tests every year, Eurofins offers a portfolio of over 200,000 analytical methods to evaluate the safety, identity, composition, authenticity, origin, traceability and purity of biological substances and products, as well as providing innovative clinical diagnostic testing services as one of the leading global emerging players in specialised clinical diagnostics testing.

Eurofins is one of the fastest-growing listed European companies. Since its IPO on the French stock exchange in 1997, Eurofins’ sales have increased at a compound average rate of 35% per year, to over EUR 4.5 billion in 2019.

About Eurofins IT Solutions India Pvt Ltd

Eurofins IT Solutions India Pvt Ltd (a CMMI Level 3 company) is a 100% fully owned subsidiary of Eurofins. This young captive centre, located in Bangalore, was established in 2012 as the largest IT Solutions group within Eurofins, catering to all internal IT business needs. The primary focus of the IT Solutions group is to develop the next-generation LIMS (Lab Information Management System), customer portals, e-commerce solutions, ERP/CRM systems, and B2B platforms for the various Eurofins laboratories and businesses.

EITS India is a young, dynamic, and growing organization with plenty of career growth prospects. We strongly believe that ‘our people are our assets’, and we ensure that all our staff are provided with a great work environment, good benefits, and challenging global projects to enable a fulfilling career. We are committed to providing an enriching experience to our employees.

Job Description

POSITION TITLE: Data Architect          

REPORTING TO: Manager                              

REPORTING LOCATION: Bangalore    

WORKING LOCATION: Bangalore                   

NUMBER OF FTEs UNDER RESPONSIBILITY: NA

SUMMARY OF POSITION AND OBJECTIVES:

We are looking for a Data Architect to review, analyze, and evaluate how data is processed, transferred, and stored in our key information management system, and to make recommendations about how the company obtains and analyzes data so as to improve the quality and efficiency of our data systems. The architect will be responsible for expanding and optimizing our data and data pipeline architecture, optimizing data flow and collection for cross-functional teams, and designing, recommending, and guiding best practices. This includes designing and guiding the implementation of effective database solutions and models for data management. The architect will identify database structural requirements by evaluating our operations and applications, prepare and maintain accurate database design and architecture documentation, and assess database implementation procedures against the applicable regulations.

POSITION & OBJECTIVES:

  • Perform a detailed analysis of data management requirements across all systems, platforms, and applications
  • Define non-functional requirements for each solution, and work with the team to meet these requirements efficiently
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Develop processes and tools to monitor and analyze model performance and data accuracy/quality
  • Create and maintain an inventory of data assets
  • Develop conceptual and logical data models
  • Develop data management standards and ensure data is properly managed through its full lifecycle and aligned with business needs
  • Develop data migration strategies and designs
  • Create and maintain optimal data pipeline architecture
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
  • Work with data and analytics experts to strive for greater functionality in our data systems

SKILLS REQUIRED:

  • Large-scale design, implementation, and operation of relational databases and NoSQL data storage technologies such as SQL Server, Azure SQL, Azure SQL DW, MongoDB, Azure Data Lake Store, and Cosmos DB
  • Creation of analytics solutions using Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, Spark, and Databricks
  • Experience with message queuing, stream processing, and building highly scalable ‘big data’ data stores, and with optimizing ‘big data’ pipelines, architectures, and data sets
  • Design and configuration of data movement, streaming, and transformation technologies such as Azure Data Factory, Event Hubs, Kafka, and Logic Apps
  • Search technologies such as Apache Solr, Elasticsearch, and Azure Search
  • Strong analytical skills for working with unstructured datasets
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark Streaming, etc.
  • Experience with data manipulation languages: Python, R, (Power)Shell, etc.
  • Experience preparing data for visualization (e.g. data profiling, data cleansing, volume assessment and partitioning, data modeling)
  • Experience working with SSRS and SSIS; SSAS is good to have
  • Experience with data visualization tools, especially Tableau and Power BI
  • Experience creating joins, including joins with custom SQL, and blending data from different data sources using Tableau Desktop/Power BI Desktop
  • Experience with advanced calculations in Tableau Desktop (Aggregate, Date, Logical, String, Table, and LOD expressions)
  • Experience in Data Migration, Data Governance, Metadata Management, and Master Data Management
  • Experience in cost/benefit analysis
  • Experience protecting data from unauthorized access and corruption throughout its lifecycle, including data encryption, hashing, tokenization, and key management practices that protect data across all applications and platforms
  • Good understanding of hybrid infrastructure offerings: consulting, transformation, consolidation, migration, server virtualization, storage consolidation, network and security, and disaster recovery processes

ADDITIONAL SKILLS REQUIRED:

  • Knowledge of cloud security controls, including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, etc.
  • Application of AI, Cognitive, and Data Science technologies such as Azure Machine Learning, Cognitive Services, Text Analytics API, Face API, Computer Vision API, Bot Service, Azure Notebooks, Anaconda, Jupyter, TensorFlow, R, Neural Networks, NLP, NLG, etc.
  • IoT technologies such as IoT Hub, IoT Edge, and Event Grid
  • Knowledge of dashboard visualization development best practices
  • Experience working in a fast-paced, dynamic, Agile development lifecycle
  • Proven experience estimating work and breaking down implementations into tangible modules

EXPERIENCE

  • 12+ years of experience in a Data Engineer role
  • Candidate should be technically hands-on, with proven experience
  • Understanding of the Product Development Lifecycle and Lean Agile Scrum methodologies
  • Excellent communication, interpersonal, and presentation skills
  • Fluent written and oral English is essential

Methodology we have in place and expect to be used:

  • Scaled Agile, Lean, Kanban, Zero Defect development method
  • Daily Stand-ups with other developers directly involved
  • Scrum of Scrums, Innovation Sprints
  • Continuous integration
  • Automatic Build and Deployments
  • Automated Unit & Functional Testing
  • Follow Development guidelines and coding style
  • SonarQube-based Static Code Analysis

Qualifications

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.