Big Data Architect

  • Contract

Company Description

Jobsbridge

Job Description

Skill SDLC process, Hadoop, MapReduce, Pig, Hive

Location San Ramon, CA

Total Experience 10 yrs.

Max Salary DOE (per hour)

Employment Type Contract Jobs (Temp/Consulting)

Job Duration 12 months+

Domain Any

Description:


LOCAL CANDIDATE PREFERRED

Good understanding of the SDLC process and experience working in an agile environment.

Minimum of 10 years of experience working on data warehouse and integration solutions.

Experience in data analytics with Big Data.

Good working experience with data integration and warehousing concepts: dimensional and relational models, ETL tools, and reporting.

Good experience integrating multiple Big Data solutions and legacy database systems.

Experience processing large amounts of structured and unstructured data. 

Expert-level knowledge of Hadoop ecosystem components: Hadoop, MapReduce, Pig, Hive, Solr, Elasticsearch, Spark, Kafka, Storm, Falcon, Oozie, HAWQ, GemFire XD, etc.

Expert-level knowledge of one or more NoSQL databases: HBase, Cassandra, MongoDB.

Advanced skills in one or more scripting languages (e.g., Python, UNIX shell scripts).

Ability to quickly understand business problems and find patterns and insights.

Ability to quickly learn new technologies and work effectively in a very dynamic environment. 

Proven ability to build, manage, and foster a team-oriented environment.

Proven ability to work creatively and analytically in a problem-solving environment.

Excellent communication (written and oral) and interpersonal skills.

Excellent leadership and management skills.

Qualifications

SDLC process, Hadoop, MapReduce, Pig, Hive

Additional Information

Multiple Openings