Data Engineer: Cloud data migration using Snowflake, Python/Scala, and Spark

  • Contract

Job Description

Position: Data Engineer

Duration: Long Term

Location: Remote

No. of positions: 1

Mandatory Skills: Strong experience in cloud data migration projects using Snowflake, including stored procedures; a scripting and programming background in Python/Scala; and data ingestion using Spark.

Responsibilities & Requirements

Experience implementing data warehouses and data lakes in the cloud.

Working experience with Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
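As a hedged illustration of the kind of ELT stored-procedure work described above (all table and procedure names are hypothetical), a minimal Snowflake Scripting procedure that merges staged rows into a target table might look like:

```sql
-- Illustrative sketch only: orders / staged_orders are hypothetical tables.
CREATE OR REPLACE PROCEDURE merge_daily_orders()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    MERGE INTO orders AS tgt
    USING staged_orders AS src
        ON tgt.order_id = src.order_id
    WHEN MATCHED THEN UPDATE SET tgt.status = src.status
    WHEN NOT MATCHED THEN
        INSERT (order_id, status) VALUES (src.order_id, src.status);
    RETURN 'merge complete';
END;
$$;

CALL merge_daily_orders();
```

Real procedures in such projects typically add error handling, logging tables, and parameterized date ranges on top of this basic MERGE pattern.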

Good knowledge of Snowflake concepts such as Streams, Tasks, Snowpipe, zero-copy cloning, Time Travel, query profiling, RBAC controls, and virtual warehouse sizing, with hands-on experience using these features.
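A brief sketch of how several of these features fit together in practice (object names are hypothetical, not from this posting):

```sql
-- Illustrative sketch only: all object names are hypothetical.
-- Stream: capture change records on a source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- Task: periodically consume the stream into a curated table.
CREATE OR REPLACE TASK load_orders_task
    WAREHOUSE = etl_wh
    SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
    INSERT INTO curated_orders SELECT * FROM orders_stream;

-- Zero-copy clone: instant copy for testing, no extra storage up front.
CREATE TABLE curated_orders_test CLONE curated_orders;

-- Time Travel: query the table as it was one hour ago.
SELECT COUNT(*) FROM curated_orders AT(OFFSET => -3600);
```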

Proven track record of designing complex, scalable pipelines using cloud-supported ELT tools such as StreamSets or Informatica, including streaming workloads.

Experienced in designing and implementing highly parallelized data ingestion and transformation jobs in Spark.
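The core idea behind such jobs, transforming independent partitions in parallel, can be sketched with only the Python standard library (Spark itself is assumed in the role; this stand-in just mimics its partition-parallel model, and all function names are hypothetical):

```python
# Stand-in for Spark's partition-parallel model using only the standard
# library. transform_partition plays the role of a mapPartitions function.
from concurrent.futures import ThreadPoolExecutor


def transform_partition(partition):
    # Per-partition transformation (here: double each value).
    return [row * 2 for row in partition]


def run_job(rows, num_partitions=4):
    # Split the input into num_partitions round-robin partitions.
    partitions = [rows[i::num_partitions] for i in range(num_partitions)]
    # Transform each partition on its own worker, as executors would.
    with ThreadPoolExecutor(max_workers=num_partitions) as pool:
        results = pool.map(transform_partition, partitions)
    # Flatten the per-partition results, analogous to collect().
    return [row for part in results for row in part]


if __name__ == "__main__":
    print(sorted(run_job(list(range(10)))))
```

In Spark proper, the same shape appears as `rdd.mapPartitions(transform_partition)` or a DataFrame transformation over partitioned input.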

Good working experience in a programming language such as Python or Scala.

Understanding of AWS cloud infrastructure concepts, its services, and their integration with Snowflake.

Experience migrating data from RDBMS sources to the Snowflake cloud data warehouse.
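A hedged sketch of the extract-and-stage step of such a migration: dump a source table to CSV, the format a Snowflake stage plus `COPY INTO` would typically load. SQLite stands in for the source RDBMS here, and all table and file names are hypothetical:

```python
# Extract step of an RDBMS-to-Snowflake migration, sketched with SQLite
# as the stand-in source database and a local CSV as the staging file.
import csv
import sqlite3


def extract_to_csv(conn, table, out_path):
    # Read the full table and write it out with a header row.
    cur = conn.execute(f"SELECT * FROM {table}")
    headers = [col[0] for col in cur.description]
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(headers)
        writer.writerows(cur)
    return headers


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "Ada"), (2, "Grace")])
    print(extract_to_csv(conn, "customers", "customers.csv"))
```

In a real migration the CSV (or Parquet) files would be uploaded to a Snowflake stage and loaded with `COPY INTO`, with type mapping and validation handled along the way.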

Ability to create and implement data engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, documentation, build processes, automated testing, and operations.

Insurance domain knowledge is a plus.

Must have experience applying Agile methodologies in development projects.

Ability to conduct root cause analysis and recommend performance tuning.

Strong analytical skills to analyze data sets and identify patterns.

Additional Information

All your information will be kept confidential according to EEO guidelines.