Data Scientist/Engineer (R/Python a must)

  • Raleigh, NC, USA
  • Contract

Company Description

Atyeti Recognition:

• Inc. 500 & 5000 Honoree Company for 2012, 2013, 2014, 2015, 2016 and 2017

• Atyeti Ranks No. 270 on the 2012 Inc. 500 List

• 2012, 2016 and 2017 NJ 50 Fastest Growing Companies

Job Description

Overall Responsibilities:

• Develop “R” / Python libraries
• Create “R” libraries that serve as the interface between model “R” code and a generic data retrieval API (see the sketch after this list)
• Develop Python and PySpark packages
• Communicate with and direct model development teams on the implementation of statistical models
• Write specifications and documentation for “R” interfaces
• Interface with Quant users and gather feedback on delivered analytical data
• Communicate needed changes to the development team
• Work with the QA team on test plan reviews and assist in the QA testing process with requirements clarifications and questions
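
For illustration only, a minimal sketch of the kind of interface function such an “R” package might expose, wrapping a hypothetical REST data retrieval endpoint with httr and jsonlite; the URL, query parameters and return shape are assumptions, not part of this role description:

    # Hypothetical wrapper around a generic data retrieval API (illustrative sketch)
    library(httr)
    library(jsonlite)

    get_model_data <- function(base_url, dataset, as_of_date) {
      # Call the (assumed) REST endpoint for a dataset as of a given date
      resp <- GET(url = paste0(base_url, "/datasets/", dataset),
                  query = list(asOfDate = as_of_date))
      stop_for_status(resp)  # fail fast on HTTP errors
      # Parse the JSON payload into an R data structure for model code to consume
      fromJSON(content(resp, as = "text", encoding = "UTF-8"))
    }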

Functional Responsibilities:
Specific tasks include but are not limited to:

• Work on one of the initiatives within the Global Markets Data Analytics group, as well as other projects
• Liaise with the Dev team on defining the data structures used in model computations
• Liaise with the QA team to explain requirements and assist with the QA testing process
• Support the UAT process

Required Skills:
Technical / Analytical Skills:

• Experience with advanced “R” coding, including package development
• “R” code optimization and memory management
• Understanding of core R data structures
• Understanding of Python and PySpark development
• Good understanding of the testthat, dplyr and sparklyr packages (see the sketch after this list)
• Software development background and experience working in an Agile environment preferred
• Comfortable working in a UNIX/Linux environment
• Understanding of Git and/or SVN workflows
• Previous experience with RESTful API frameworks
• Good understanding of XML and JSON
• Comfortable with relational databases and SQL
• Experience creating interface specification documents, attribute mapping documents and functional specifications
• Proficiency with MS Excel, MS Word, MS Visio and MS PowerPoint
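
As a purely illustrative sketch of the testthat / dplyr style of work referenced above (the helper, column names and data are invented for the example):

    # Illustrative only: a dplyr helper and the testthat unit test that exercises it
    library(testthat)
    library(dplyr)

    # Hypothetical helper: total notional per desk
    summarise_by_desk <- function(trades) {
      trades %>%
        group_by(desk) %>%
        summarise(total_notional = sum(notional), .groups = "drop")
    }

    test_that("summarise_by_desk totals notional per desk", {
      trades <- tibble(desk = c("FX", "FX", "Rates"), notional = c(10, 20, 5))
      result <- summarise_by_desk(trades)
      expect_equal(nrow(result), 2L)
      expect_equal(result$total_notional[result$desk == "FX"], 30)
    })

The same dplyr verbs translate to Spark through sparklyr when the data lives in a Spark DataFrame, which is why fluency in all three packages is listed together.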

Nice to have:

• Experience with Docker, Jenkins, Azure and Openshift
• Experience with open source big data technologies (Hadoop, Hive, Impala, MLLIB, Oozie etc.) for large scale data analysis
• Experience with visualization tools like Tableau
• Experience with RShiny
• Knowledge of regression techniques for modeling the relationship between an output variable and several input variables
• Understanding of regulatory guidelines (CCAR, Basel) around Model Risk Management
• Working knowledge of various Fixed Income and Equities products, for example Derivatives, FX and Cash Equities
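
For reference, the regression bullet above describes the familiar multiple-regression setup, sketched here on simulated data with base R's lm(); the variable names and coefficients are invented for the example:

    # Illustrative only: model an output variable as a function of several inputs
    set.seed(42)
    df <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
    df$y <- 2 * df$x1 - 0.5 * df$x2 + rnorm(100)  # simulated relationship plus noise

    fit <- lm(y ~ x1 + x2, data = df)  # output ~ several input variables
    summary(fit)                       # estimated coefficients and fit statistics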

Essential Skills

• Strong verbal and written communication skills
• Strong analytical and problem-solving skills

 

Additional Information

All your information will be kept confidential according to EEO guidelines.
