Big Data Solutions Architect | Client: Stater Bros | Rate: $130/hr | Location: San Bernardino, CA | Duration: 6+ months

  • Full-time

Company Description

nFolks

Job Description

Big Data Solutions Architect

Qualifications

Bachelor's degree

Additional Information

Job Description

Local candidate preferred or ability to relocate

 

General Experience

-10+ years of experience building solution designs and architectures for enterprise Big Data Solutions

-3+ years of experience in technology consulting preferred

-Experience in the CPG/Retail domains is preferred

-Working with all organizational levels to understand requirements and provide thought leadership related to Big Data Solutions

-Ability to facilitate, guide, and influence decision makers and stakeholders towards the proper IT architecture

-Ability to create presentation materials and simplify complex ideas

-Ability to present technology architecture and solution overviews to executive audiences

-Drive innovation through hands-on proofs of concept and prototypes to help illustrate approaches to technology and business problems

  

Functional Experience

-Full Software Development Life Cycle (SDLC) of Big Data Solutions

-Experience with data integration and streaming technologies for EDW and Hadoop

-Data modeling and database design

-Data warehousing and Business Intelligence systems and tools

-Open source Hadoop stack

-Administration, configuration, monitoring, and performance tuning of Hadoop/Distributed platforms

-Big Data and real time analytics platforms

-ETL for Big Data

-Migration of legacy data warehouses to a Data Lake

-Develop guidelines, standards, and processes to ensure the highest data quality and integrity

-Understanding of CI/CD in relation to Big Data platforms

-Understanding of container technologies is a plus

-Knowledge/experience of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce, Azure)

 

Combination of technical skills

-Hadoop (HDFS, MapReduce, Hive, HBase, Pig, Spark)

-Cloudera, Hortonworks, MapR

-NoSQL (Cassandra, MongoDB, HBase)

-Git, Nexus

-Enterprise scheduler

-Kafka, Flume, Storm

-Appliances (Teradata, Netezza)

-Languages and platforms (Java, Linux, Apache, Perl/Python/PHP)

-Data Virtualization

 

Education

-Bachelor's degree in Computer Science or a related field preferred

-Master's degree in a related field preferred
