Senior Data Engineer
- Full-time
Company Description
Devexperts Global
Devexperts works with respected financial institutions, delivering products and tailor-made solutions for retail and brokerage houses, exchanges, and buy-side firms. The company focuses on trading platforms and brokerage automation, complex software development projects, market data products, and IT consulting services.
Job Description
We are looking for a Senior Data Engineer with a Java / Scala / Python background to join the project for a top-5 US retail broker (by number of users). The current project focuses on trading experience, financial reports, and risk management.
You will join a cross-functional team that excels at taking features from zero to production.
We expect the Senior Data Engineer to:
1. Develop Data Pipeline:
- Design, develop, and maintain robust data pipelines using Java within AWS infrastructure,
- Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark,
- Utilise Airflow for efficient workflow orchestration in complex data processing tasks,
- Ensure fast and interactive querying capabilities through the use of Presto.
2. Manage Infrastructure:
- Containerise applications using Docker for streamlined deployment and scaling,
- Orchestrate and manage containers effectively with Kubernetes in production environments,
- Implement infrastructure as code using Terraform for provisioning and managing AWS resources.
3. Collaborate and Communicate:
- Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals,
- Ensure data quality and reliability through robust testing methodologies and monitoring solutions,
- Stay up to date with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.
Qualifications
Must-have skills:
1. Education and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field,
- Minimum 5 years of hands-on experience in Java / Scala / Python development, emphasising object-oriented principles.
2. Technical Proficiency:
- Proficiency in Apache Spark or PySpark for large-scale data processing,
- Experience with Airflow for workflow orchestration in production environments,
- Familiarity with Docker for containerisation and Kubernetes for container orchestration,
- Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift,
- Strong background in SQL and relational databases.
3. Communication Skills:
- Excellent English language communication skills, both verbal and written,
- Ability to collaborate effectively with technical and non-technical stakeholders.
Nice-to-have skills:
- Experience with streaming platforms such as Kafka for real-time data processing,
- Knowledge of Terraform for infrastructure as code implementation in AWS environments.
Additional Information
- Paid vacation: 20 + 5 days
- Free MultiSport card
- Medical insurance – premium package
- Modern office space
- Panoramic view of Vitosha mountain
- Gym & billiards in the office
- Parking spot or public transport card
- Mentorship program
- Training, courses, workshops
- Paid professional certifications
- Subscriptions to professional resources
- Participation in conferences
- English courses
- Trading contest within the company
- Tech meetup dxTechTalk
- Speaker's club
- Opportunity to develop your personal brand as a speaker
- Internal referral program
- Remote work / Hybrid mode
- Flexible schedule
- Work & Travel program
- Relocation opportunities