DevOps Big Data Engineer (Data Lake team)
- Bucharest, Romania
Are You Looking For a Fun Place to Work?
Join The Game!
As part of the Data Lake team, you will work on Big Data projects that process large volumes of data, in batch or in real time, both on-premises and in the cloud.
We are looking for an analytical person with a strong technical background and at least 2 years of experience in Linux/Unix administration. Previous experience on an Infrastructure or DevOps team, with operational tasks in addition to development and implementation work, is highly desirable.
Quality comes first: the perfect place for you to make a difference!
Daily challenges include:
- Work with development teams on the core infrastructure needs, standards and deployment scenarios.
- Maintain the existing core infrastructure and help integrate new technologies while ensuring system stability.
- Collaborate with fellow administrators on setting up the infrastructure and CI/CD pipelines.
- Debug performance or deployment issues.
- Work with data scientists and other data analytics teams.
- Ensure proper documentation and traceability by improving deployment policies and procedures.
- Help maintain the core set of Ansible playbooks required to keep the systems properly configured and always up.
- Participate in the on-call rotation.
What you'll need to succeed:
- A working understanding of coding and scripting (Bash, Python).
- Experience with automation/configuration management systems (Puppet, Ansible, Salt, Chef).
- Knowledge of container technologies (Docker).
- Ability to use a wide variety of open source technologies and cloud services.
- Working experience with RDBMS (e.g. MySQL, PostgreSQL).
- Knowledge of best practices and IT operations for an always-up, always-available service.
- Ability to understand abstract requirements, define implementation steps and deliver results while keeping the client's needs in focus.
- High curiosity, a self-driven nature and attention to detail.
- A good team player with strong communication skills.
- A fast learner, willing to improve every day.
- Self-motivated and resourceful, able to work both independently and as part of a team, and to manage priorities.
- Fluent in English.
Nice to have:
- Knowledge of the ELK Stack (Elasticsearch, Logstash, Kibana).
- Experience working with NoSQL databases.
- Experience with Amazon Web Services or other cloud providers.
- Familiarity with industry-standard Big Data technologies from the Apache Hadoop ecosystem.
- An interest in data-science-related infrastructure.
Why join Gameloft?
- You want to work with talented people who are industry pioneers.
- You want to join a global company and meet great people around the world.
- Casual & friendly working environment with opportunities to impact the company with your ideas and involvement.
- Or, just because you're looking for a fun place to work!
Want to know more? Please visit our Website: www.gameloft.com