Hadoop Developer
- Full-time
Company Description
IT
Job Description
Role: Hadoop Developer
Location: Riverwoods, IL
Duration: Full-time
A background verification (BGV) will be conducted for selected candidates.
Job Duties:
• The Senior/Lead Hadoop Developer is responsible for designing, developing, testing, tuning, and building a large-scale data processing system for data ingestion and data products that allow the client to improve the quality, velocity, and monetization of its data assets, for both operational applications and analytical needs.
• Design, develop, validate, and deploy ETL (Extract, Transform, Load) processes.
• Must have used Hadoop (Pig, Hive, Sqoop) on the Hortonworks distribution.
• Responsible for the documentation of all ETL processes.
• Maintain and enhance ETL code; work with the QA and DBA teams to fix performance issues.
• Collaborate with the application team to design and develop required ETL processes, and performance-tune ETL programs and scripts.
• Work with business partners to develop business rules and business rule execution.
• Perform process improvement and re-engineering with an understanding of technical problems and solutions as they relate to the current and future business environment.
• Design and develop innovative solutions for demanding business situations.
• Help drive cross-team design and development through technical leadership and mentoring; work with an offshore team of developers.
• Analyze complex distributed production deployments and make recommendations to optimize performance.
Essential Skills:
• Minimum 3 years of ETL experience with RDBMS and Big Data strongly preferred; experience with Informatica or DataStage may be considered as an alternative.
• 2+ years of experience creating reports using Tableau.
• Proficiency with Hortonworks Hadoop distribution components and custom packages.
• Proven understanding of and related experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume, and/or MapReduce.
• Excellent RDBMS (Oracle, SQL Server) knowledge for development using SQL and PL/SQL.
• 6+ years of experience with the UNIX OS and shell scripting.
• 3+ years of experience with job scheduling tools such as AutoSys.
• 3+ years of experience writing Pig and Hive queries.
• 3+ years of hands-on experience with Oozie.
• 3+ years of experience importing and exporting data with Sqoop between HDFS and relational database systems, including Teradata and mainframes.
• Must have 2+ years of experience working with Spark for data manipulation, preparation, and cleansing; a brief illustrative sketch of this kind of work follows this list.
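
For illustration, the following is a minimal PySpark sketch of the kind of data preparation and cleansing work described above. The HDFS paths, column names, and cleansing rules are hypothetical examples only, not requirements or systems specific to this role.

    # Illustrative PySpark sketch: cleanse raw records and write prepared
    # data back to HDFS. All paths and columns below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("data-cleansing-example").getOrCreate()

    # Read raw delimited data from a hypothetical HDFS location.
    raw = spark.read.option("header", "true").csv("hdfs:///data/raw/customers")

    # Basic cleansing: trim whitespace, normalize case, drop rows missing
    # the key column, and de-duplicate on the customer identifier.
    clean = (raw
        .withColumn("customer_id", F.trim(F.col("customer_id")))
        .withColumn("email", F.lower(F.trim(F.col("email"))))
        .na.drop(subset=["customer_id"])
        .dropDuplicates(["customer_id"]))

    # Write the prepared data as Parquet for downstream analytical use.
    clean.write.mode("overwrite").parquet("hdfs:///data/prepared/customers")

    spark.stop()
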
Please respond with your Word resume and the requested details:
Full Name:
Work Authorization:
Contact Number:
Email ID:
Skype ID:
Current Location:
Willing to Relocate:
Salary:
Additional Information
All your information will be kept confidential according to EEO guidelines.