Data Analytics Developer

  • Full-time

Company Description

Cloudnile is "Turning Data into Information" through our innovative Analytics-as-a-Service and Data-as-a-Service solutions. Our mission is to bring Data Democracy to the enterprise. Our services help customers adopt modern cloud databases and data acquisition architectures. The revolution to free data from its silos is on, and you will have a front-row seat.

Job Description

Are you passionate about Big Data and the cloud? Do you want to start on the ground floor of building a large-scale distributed system? Cloudnile is looking for Data Analytics Developers who can design, develop, and re-engineer data solutions that fully leverage the cloud.

This is an excellent opportunity to join Cloudnile's world-class technical teams, working with some of the best and brightest engineers while developing your skills and furthering your career at one of the most innovative and progressive technology companies. Our Global Delivery group engages in a wide variety of projects for customers and partners, drawing on collective experience from across our customer base, and we are obsessed with our customers' success. The team collaborates across the entire organization, bringing in product and service teams to get the right solution delivered and to drive feature innovation based on customer needs.

Qualifications

  • BA/BS degree or equivalent experience; Computer Science, Math, or Engineering background preferred 
  • Strong verbal and written communication skills, with the ability to work effectively across internal and external organizations
  • Hands-on technical big data and analytics experience
  • Understanding of and hands-on expertise with cloud computing

Preferred Qualifications

  • Master's degree in Computer Science or Engineering or similar field
  • Hands-on experience with data warehousing and analytics projects
  • Real-time streaming and time-series technologies, using tools such as AWS Kinesis, Spark, Flink, Samza, etc.
  • Hadoop/Big Data knowledge – Hive metastore, storage partitioning schemes on S3 and HDFS
  • ETL understanding and custom coding – AWS Glue, AWS Data Pipeline, Google Cloud Dataflow, and Google Cloud Dataprep
  • RDBMS skills – SQL, optimization techniques, etc.
  • Scripting/Programming skills – Python, Java, Scala, Go
  • Data warehousing platform knowledge – AWS Redshift, Snowflake, Teradata, Google BigQuery
  • Knowledge of securing data at rest and in transit

Additional Information

All your information will be kept confidential according to EEO guidelines.