DevOps Engineer

  • Full-time

Company Description

Cloudnile is "Turning Data into Information" through our innovative Analytics-as-a-Service and Data-as-a-Service solutions. Our mission is to bring Data Democracy to the enterprise. Our services help customers adopt modern cloud databases and data acquisition architecture. The revolution for freeing data from its silos is on, and you will have the front seat.

Job Description

Are you passionate about Big Data and the cloud? Do you want to get in on the ground floor of building a large-scale distributed system? Cloudnile is looking for DevOps Engineers who can design, develop, and re-engineer data solutions that fully leverage the cloud.

This is an excellent opportunity to join Cloudnile's world-class technical teams, working with some of the best and brightest engineers while developing your skills and furthering your career at one of the most innovative and progressive technology companies. Our Global Delivery group engages in a wide variety of projects for customers and partners, drawing on collective experience from across our customer base, and we are obsessed with strong success for the customer. Our team collaborates across the entire organization to give product and service teams access, deliver the right solution, and drive feature innovation based on customer needs.

Responsibilities:
  • Influence and create new designs, software, architecture, and methods for deploying large-scale distributed systems 
  • Rethink and implement distributed configuration management in the cloud
  • Automate deployments in a Windows/Linux data center environment
  • Set up and maintain Continuous Delivery pipelines for application and infrastructure delivery
  • Build resilient services that use telemetry and metrics to drive operational excellence
  • Maintain the existing code/configuration deployment system
  • Troubleshoot production issues related to software deployment and configuration

Qualifications

  • BS/BCS degree or equivalent experience; Computer Science or Engineering background preferred 
  • Strong verbal and written communication skills, with the ability to work effectively across internal and external organizations
  • Hands-on experience in designing and implementing automation systems for configuration management and code deployment
  • Intimate knowledge of one of Ansible, Puppet, Salt, or Chef for automating deployments and configuration
  • Experience with Continuous Integration (CI) and Continuous Delivery (CD)
  • Experience working with both private and public clouds, specifically AWS (EC2, VPC, S3, ELB, Route 53, CloudFormation)
  • Experience with software virtualization and containerization using Docker, Mesos, Kubernetes, Vagrant, and the like
  • Experience with scripting languages such as Python, Groovy, shell, and PowerShell
  • Excellent troubleshooting skills 
  • Understanding of Java and object-oriented programming


PREFERRED QUALIFICATIONS

  • Master's degree in Computer Science or Engineering or similar field
  • Real-time streaming and time-series technologies, with tools such as AWS Kinesis, Spark, Flink, Samza, etc.
  • Hadoop Big Data knowledge – Hive metastore; storage partitioning schemes on S3 and HDFS
  • ETL understanding and custom coding – AWS Glue and Data Pipeline, Google Cloud Dataflow and Cloud Dataprep
  • RDBMS skills – SQL, optimization techniques, etc.
  • Data warehousing platforms knowledge – AWS Redshift, Snowflake, Teradata, Google BigQuery
  • Knowledge of data security at rest and in transit

Additional Information

All your information will be kept confidential according to EEO guidelines.