SE/SSE/TC - Azure
- Full-time
Qualifications
The role requires experience in core Azure technologies: Azure Data Lake Storage, Azure Data Lake Analytics, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight, and Azure Synapse Analytics (formerly SQL Data Warehouse).
You Have:
- Minimum 2 years of software development experience
- Bachelor's and/or Master's degree in Computer Science
- Strong consulting skills in data management, including data governance, data quality, security, data integration, processing, and provisioning
- Led and delivered data management projects on the Azure cloud
- Translated complex analytical requirements into technical designs, including data models, ETL pipelines, and dashboards/reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases, such as in-memory, distributed, and massively parallel processing (MPP)
- Successfully delivered large-scale data management initiatives covering the plan, design, build, and deploy phases, leveraging delivery methodologies including Agile
- Strong knowledge of continuous integration, static code analysis and test-driven development
- Experience delivering projects in a highly collaborative model with onsite and offshore teams
- Excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platform adoption across the enterprise
- Strong verbal and written communication skills, and the ability to work effectively across internal and external organizations
Location:
Bengaluru, Karnataka, India
Experience:
2 to 8 Years
Skills Required:
Azure Data Lake, Azure Data Factory, Azure SQL Database
Roles:
You Will:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail, and determine the core Azure services needed to fulfill the technical design
- Design, develop, and deliver data integration interfaces in Azure Data Factory (ADF) and Azure Databricks
- Design, develop, and deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform, whether in Azure Cosmos DB, Azure Synapse (SQL DW), or Azure SQL Database
- Advise clients on ML engineering and deploying MLOps at scale on AKS (Azure Kubernetes Service)
- Automate core activities to minimize delivery lead times and improve overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution cost-effectively
- Set up Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points to surface critical alerts
Desirable Skills:
Spark, Python, Kubernetes
Department:
Technology