Architect Snowflake Cloudera MapR Hortonworks

  • Contract

Company Description

PDDN is a provider of end-to-end software solutions, IT consulting services, and software development, headquartered in Fremont, California, with clients across Silicon Valley and other information technology hubs in different states. With integrated solutions, software development, technical services, training, and staffing support, we help customers achieve their technology goals, allowing them to focus on their business.


Job Description

Position: Snowflake Architect
Location: San Francisco, CA
Job Type: Full Time / Contract

Responsibilities:
The Snowflake Architect will build, create, and configure enterprise-level Snowflake environments.
The focus will be on choosing optimal solutions for Snowflake implementations, then maintaining, implementing, monitoring, and integrating them with the architecture used across our client's organization.
Design, architect, build, and implement high-volume, high-scale data analytics and machine learning Snowflake solutions in the cloud.
Bring new ideas to cloud, big data, and machine learning software development.
Design and develop features, understand customer requirements, and meet business goals.
Build high-quality and highly reliable software to meet the needs of the largest customers.
Analyze and improve the performance, scalability, and high availability of large-scale distributed systems and the query processing engine.

Required Skills:
Must have extensive experience with Snowflake.
Proficient understanding of distributed computing principles.
Management of Hadoop clusters, with all included services.
Ability to solve any ongoing issues with operating the cluster.
Proficiency with Hadoop v2, MapReduce, HDFS.
Experience with building stream-processing systems, using solutions such as Storm or Spark Streaming.
Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala.
Experience with Spark and Scala.
Experience with integration of data from multiple data sources.
Experience with NoSQL databases, such as HBase, Cassandra, MongoDB.
Knowledge of various ETL techniques and frameworks, such as Flume.
Experience with various messaging systems, such as Kafka or RabbitMQ.
Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.
Good understanding of Lambda Architecture, along with its advantages and drawbacks.
Experience with Cloudera/MapR/Hortonworks.

Additional Information

All your information will be kept confidential according to EEO guidelines.