Hadoop Developer / Java Developer

  • Full-time

Company Description

ConsultAdd Inc., voted #6 on the Inc. 500 list for IT services in 2015, is a global IT training and consulting services company with one goal in mind: to develop and deliver the highest quality consultants to our clients and customers. As a consulting company, we help organizations work smarter and grow faster, putting our clients at the cutting edge of technology and miles ahead of their competition.

As part of our corporate readiness program, we offer training and placement services across multiple technical domains. Our technical team is ranked among the best in the industry and is rated highly by our previous trainees.


Job Description



Job Summary

This is an opportunity for a Hadoop Developer. The person hired will work as a Developer on our Hadoop/Big Data team on many complex and strategic projects, providing expert programming and analysis skills for specific systems assignments.
Requirements

Education, Work Experience & Knowledge

The candidate's experience and background should include the following:

Design, implement and deploy custom applications on Hadoop
Troubleshoot production issues within the Hadoop environment
Performance tuning of Hadoop processes and applications
Proven experience as a Hadoop Developer/Analyst in Business Intelligence and data management production support is needed.
Strong communication, technology awareness, and the ability to interact and work with senior technology leaders is a must
Knowledge of Hadoop and the Hadoop ecosystem is required, including proven experience with the Cloudera Hadoop ecosystem (MapReduce, HDFS, YARN, Hive, HBase, Sqoop, Pig, Hue, Spark, etc.)
Good knowledge of Agile methodology and the Scrum process
Delivery of high-quality work, on time and with little supervision
Critical Thinking/Analytic abilities


Qualifications

Bachelor's degree in Computer Science, Management Information Systems, or Computer Information Systems, or equivalent experience.
Minimum of 5 years of experience building Java applications
Minimum of 2 years of experience building and coding applications using Hadoop components (HDFS, HBase, Hive, Sqoop, Flume, etc.)
Minimum of 2 years of experience coding Java MapReduce, Python, Pig programming, Hadoop Streaming, and HiveQL
Minimum of 4 years of experience with traditional ETL tools and data warehousing architecture
Experience in Teradata and other RDBMS is a plus.
Must be proficient in SQL/HiveQL

Additional Information