We help high-growth healthcare and life science companies leverage AI to solve today’s hardest challenges through strategic consulting and custom application development — improving care, enhancing discovery, and enabling data-driven decisions.
Our team is composed of creative, results-focused individuals who excel at solving real-world problems. We bring technology and expertise from diverse disciplines, including neuroscience, physics, engineering, computational biology, genomics, mathematics, and computer science.
Our culture is vibrant, connected, and rooted in our core values: Our work matters. Our clients are partners. Our work is our reputation. We own our choices. We are always learning. We support and challenge each other.
This role is eligible for flexible hours and remote work.
This role is ideal for results-focused individuals interested in developing solutions that support and augment data science initiatives across healthcare and life science companies using a wide variety of tools and technologies.
MDS Engineers collaborate with Data Scientists and other developers to build and deploy internal products and client solutions across multiple domains and projects. Our team is pragmatic, curious, and works to consistently build clean solutions in an environment where strong communication and collaboration are cornerstones.
We generally require applicants to have 2+ years of prior engineering experience on data-intensive projects to qualify for this role.
Tasks & responsibilities:
- Developing software to support data science initiatives
- Routinely refactoring codebases to mitigate technical debt
- Deploying container jobs onto Kubernetes clusters
- Implementing efficient data pipelines
- Designing and implementing data warehousing solutions
- Deploying cloud infrastructure using Infrastructure as Code
Required Experience
- Experience deploying scalable applications onto cloud infrastructure (GCP, AWS, Azure)
- Experience with containerization and orchestration tools (Docker, Kubernetes, Argo)
- Experience designing, building, and managing databases and data warehouses in the cloud (Postgres, Snowflake, Redshift, etc)
Nice to Have
- Project management & consulting experience
- Experience managing a cross-functional team in delivering data- and AI-driven applications to a customer
- Experience productionizing and deploying machine learning systems on cloud infrastructure
- Understanding of Information Security best practices for applications and company requirements
- Experience deploying containers and orchestrating them at scale (Docker, Kubernetes) in a cloud environment (GKE, EKS, AKS)
- Exposure to biological/clinical data (omics, imaging, biosensors, etc.)
- Exposure to PII and HIPAA requirements and compliance for data storage and access patterns
- Ability to design cloud, on-premises, or hybrid solutions based on client requirements, gap analysis, and in-depth systems analysis
- Relevant certifications for GCP, AWS, and/or Linux Foundation
Tools We Love
- Workflows/Pipelines: Kubernetes, Argo Workflows/Events, MLflow
- Data Stores / Databases: Snowflake, ArangoDB, BigQuery, Cloud SQL, DynamoDB, S3
Role Levels
- Engineer - 1+ years of experience / relevant skills
- Senior Engineer - 3+ years of experience / relevant skills