Senior Hadoop Engineer, Data Platform

  • Full-time
  • Job Family Group: Technology and Operations

Company Description

As the world's leader in digital payments technology, Visa's mission is to connect the world through the most creative, reliable and secure payment network - enabling individuals, businesses, and economies to thrive. Our advanced global processing network, VisaNet, provides secure and reliable payments around the world, and is capable of handling more than 65,000 transaction messages a second. The company's dedication to innovation drives the rapid growth of connected commerce on any device, and fuels the dream of a cashless future for everyone, everywhere. As the world moves from analog to digital, Visa is applying our brand, products, people, network and scale to reshape the future of commerce.

At Visa, your individuality fits right in. Working here gives you an opportunity to impact the world, invest in your career growth, and be part of an inclusive and diverse workplace. We are a global team of disruptors, trailblazers, innovators and risk-takers who are helping drive economic growth in even the most remote parts of the world, creatively moving the industry forward, and doing meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers.

You're an Individual. We're the team for you. Together, let's transform the way the world pays.

Job Description

·       Develop Hadoop architecture and HDFS commands & utilities

·       Design & optimize analytical jobs and queries against data in the HDFS/Hive environments

·       Develop bash or Python scripts using Linux utilities & commands

·       Develop data models that help with platform analytics & hardening

·       Develop self-healing systems at scale

·       Help and guide L1/L2 support engineers in fixing day-to-day operational issues

·       Perform tuning and increase operational efficiency on a continuous basis

·       Build a framework that centrally gathers deep system-level metrics across platforms as part of the control center

·       Develop central dashboards for all system, data, utilization and availability metrics

·       Build data transformation and processing solutions (see the sketch following this list)

·       Build high-volume data integrations
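
Below is a minimal, hypothetical PySpark sketch of the kind of analytical and data-transformation job described in the bullets above: it reads a Hive table, filters on a partition column so partition pruning limits the scan, aggregates, and writes a partitioned result for downstream dashboards. The database, table and column names (analytics.txn_events, event_date, merchant_id, amount) and the 2024-01-15 partition value are illustrative assumptions only, not Visa's actual schema.

```python
# Hypothetical sketch of a daily rollup job against a Hive table.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily-merchant-rollup")
    .enableHiveSupport()  # required to read from and write to Hive tables
    .getOrCreate()
)

# Filtering on the partition column first lets Hive prune partitions
# instead of scanning the whole table.
daily = (
    spark.table("analytics.txn_events")          # illustrative table name
    .where(F.col("event_date") == "2024-01-15")  # illustrative partition value
    .groupBy("event_date", "merchant_id")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("txn_amount"),
    )
)

# Write the result back as a partitioned table for dashboard queries.
(
    daily.write
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.daily_merchant_rollup")
)
```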


Required skills:

·       Hadoop (preferably Cloudera or Hortonworks distribution), HDFS, Hive, Impala, Kafka, Spark, Oozie, HBase

·       Strong knowledge of SQL & HQL

·       Strong Linux knowledge and shell scripting; Python (see the sketch following this list)

·       Java, J2EE, web applications, Tomcat (or any equivalent app server), RESTful services, JSON, design patterns

·       Kerberos, TLS, Sentry, data encryption
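
As a companion to the Linux/Python scripting and Hadoop skills above, here is a small, hypothetical health-probe script of the sort an operations runbook might invoke. It shells out to the standard `hdfs dfsadmin -report` command and extracts two headline figures; the exact report format varies between distributions, so the regex patterns and the 85% usage threshold are illustrative assumptions.

```python
#!/usr/bin/env python3
# Hypothetical sketch: probe HDFS health from the standard CLI report.
import re
import subprocess
import sys

def hdfs_report() -> str:
    """Return the raw text of `hdfs dfsadmin -report` (HDFS CLI must be on PATH)."""
    return subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        check=True, capture_output=True, text=True,
    ).stdout

def main() -> int:
    report = hdfs_report()

    # Pull the first "DFS Used%" figure and the live-datanode count, if present.
    used_pct = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    live_nodes = re.search(r"Live datanodes\s*\((\d+)\)", report)

    print(f"DFS used: {used_pct.group(1) if used_pct else 'unknown'}%")
    print(f"Live datanodes: {live_nodes.group(1) if live_nodes else 'unknown'}")

    # Exit non-zero if usage crosses an illustrative 85% alerting threshold.
    if used_pct and float(used_pct.group(1)) > 85.0:
        print("WARNING: DFS usage above 85%", file=sys.stderr)
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```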

Qualifications

·       Minimum of 3 years of work experience developing, maintaining, optimizing and resolving issues on Hadoop clusters, and supporting business users

·       Minimum of a four-year technical degree in computer science or a related IT field required

·       Experience in Linux/Unix OS services, administration, and shell/awk scripting

·       Experience in building scalable Hadoop applications

·       Experience in Core Java and Hadoop (MapReduce, Hive, Pig, Spark, Kafka, HBase, HDFS, HCatalog, ZooKeeper and Oozie)

·       Experience in Hadoop security (Kerberos, Knox, TLS)

·       Hands-on experience with SQL and NoSQL databases (HBase/Cassandra/MongoDB)

·       Experience in building large-scale, real-world backend and middle-tier systems in Java

·       Experience in tool integration, automation and configuration management on Git and Jira platforms

·       Excellent oral and written communication, presentation, analytical and problem-solving skills

·       Self-driven, with the ability to work independently and as part of a team, and a proven track record of developing and launching products at scale

·       Develop and enhance platform best practices and educate Visa developers on them

Additional Information

All your information will be kept confidential according to EEO guidelines.
