Senior Consultant (Hadoop / Big Data) - System Architect
- Full-time
Company Description
Hortonworks is an industry leading innovator that creates, distributes and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, NiFi, and Spark. Along with its 1,600+ partners, Hortonworks provides the expertise, training, and services that allow customers to unlock transformational value for their organizations across any line of business.
Job Description
- Work directly with customer’s technical resources to devise and recommend solutions based on the understood requirements
- Analyze complex distributed production deployments and make recommendations to optimize performance
- Document and present complex architectures to the customer’s technical teams
- Work closely with Hortonworks’ teams at all levels to help ensure the success of project consulting engagements with customer
- Deploy, augment, upgrade and operate large Hadoop clusters
- Write and produce technical documentation, knowledge-base articles
- Keep current with Hadoop and Big Data ecosystem technologies
Qualifications
- 8+ years of overall experience
- Experience implementing data transformation and processing solutions using Apache Pig
- Experience writing queries against data in HDFS using tools such as Apache Hive
- Experience implementing MapReduce jobs
- Experience setting up multi-node Hadoop clusters
- Experience in systems administration or DevOps on one or more open-source operating systems
- Strong understanding of enterprise security practices and solutions such as LDAP and/or Kerberos
- Strong understanding of network configuration, devices, protocols, speeds and optimizations
- Experience using configuration management tools such as Ansible, Puppet or Chef
- 2+ years designing and deploying three-tier architectures or large-scale Hadoop solutions
- Familiarity with scripting tools such as Bash, Python, and/or Perl
- Experience with NiFi is desired
- Significant prior experience writing to network-based APIs, preferably REST/JSON or SOAP/XML
- Understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. jstack, jmap, jconsole), logging and monitoring tools (log4j, JMX)
- Ability to understand and translate customer requirements into technical requirements
- Excellent verbal and written communications
Additional Information
Quick facts about Hortonworks: http://hortonworks.com/about-us/quick-facts/
Products: HDP, HDF, Metron & Cloudbreak, DataPlane: https://hortonworks.com/products/
Our Manifesto: https://hortonworks.com/manifesto/
Future of Data: https://www.youtube.com/watch?v=WFDvi7xW08E
Blog: https://hortonworks.com/news-blogs/
Hortonworks Ecosystem: https://hortonworks.com/ecosystems/
Apache Software Foundation: http://www.apache.org/
Newsletter: https://hortonworks.com/newsletters/