Senior Database Administrator

  • Full-time
  • Business Unit: Product & Technology

Company Description

IntegriChain is the data and application backbone for market access departments of Life Sciences manufacturers. We deliver the data, the applications, and the business process infrastructure for patient access and therapy commercialization. More than 250 manufacturers rely on our ICyte Platform to orchestrate their commercial and government payer contracting, patient services, and distribution channels. ICyte is the first and only platform that unites the financial, operational, and commercial data sets required to support therapy access in the era of specialty and precision medicine. With ICyte, Life Sciences innovators are digitalizing labor-intensive market access operations, freeing their best talent to identify and resolve coverage and availability hurdles, manage pricing and forecasting complexity, and focus on data-driven decision support.

We are headquartered in Philadelphia, PA (USA), with offices in Ambler, PA (USA); Pune, India; and Medellín, Colombia. For more information, visit www.integrichain.com, or follow us on Twitter @IntegriChain and LinkedIn.

Job Description

Join our DevOps Engineering team as a Senior Database Administrator (DBA) responsible for managing, optimizing, and securing our cloud-based database platforms. This hands-on role focuses on performance, reliability, and automation across AWS RDS (Oracle and PostgreSQL) environments. You’ll collaborate closely with DevOps and Product Engineering to ensure scalable, compliant, and resilient data operations supporting business-critical applications.

Key Responsibilities: 

Modern Data Architecture & Platform Engineering

  • Design, build, and optimize database solutions using Snowflake, PostgreSQL, and Oracle RDS.
  • Design and evolve cloud-native data lakehouse architectures using Snowflake, AWS, and open data formats where appropriate.
  • Implement and manage Medallion Architecture (Bronze / Silver / Gold) patterns to support raw ingestion, curated analytics, and business-ready datasets (a minimal sketch follows this list).
  • Build and optimize hybrid data platforms spanning operational databases (PostgreSQL / RDS) and analytical systems (Snowflake).
  • Develop and maintain semantic layers and analytics models to enable consistent, reusable metrics across BI, analytics, and AI use cases.
  • Engineer efficient data models and ETL/ELT pipelines, and tune query performance for analytical and transactional workloads.
  • Implement replication, partitioning, and data lifecycle management to enhance scalability and resilience.
  • Manage schema evolution, data versioning, and change management in multi-environment deployments.
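
By way of illustration, here is a minimal sketch of what a Bronze → Silver → Gold refresh might look like when driven from Python with the Snowflake connector. Every account, schema, and table name is a hypothetical placeholder, and real credentials would come from a secrets manager rather than inline values.

    # Minimal Medallion-style refresh: each layer reads only from the
    # layer directly below it. All names are illustrative placeholders.
    import snowflake.connector

    STEPS = [
        # Silver: deduplicate and type-cast raw Bronze records.
        """CREATE OR REPLACE TABLE silver.orders AS
           SELECT DISTINCT order_id,
                  TRY_TO_DATE(order_date_raw) AS order_date,
                  TRY_TO_NUMBER(amount_raw)   AS amount
           FROM bronze.orders_raw
           WHERE order_id IS NOT NULL""",
        # Gold: business-ready aggregate consumed by BI and analytics.
        """CREATE OR REPLACE TABLE gold.daily_order_metrics AS
           SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
           FROM silver.orders
           GROUP BY order_date""",
    ]

    def refresh() -> None:
        conn = snowflake.connector.connect(
            account="example_account", user="etl_service",
            warehouse="TRANSFORM_WH", database="ANALYTICS",
            authenticator="externalbrowser",  # placeholder auth method
        )
        cur = conn.cursor()
        try:
            for statement in STEPS:
                cur.execute(statement)
        finally:
            cur.close()
            conn.close()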

Advanced Data Pipelines & Orchestration

  • Engineer highly reliable ELT pipelines using modern tooling (e.g., dbt, cloud-native services, event-driven ingestion).
  • Design pipelines that support batch, micro-batch, and near–real-time processing.
  • Implement data quality checks, schema enforcement, lineage, and observability across pipelines (see the sketch after this list).
  • Optimize performance, cost, and scalability across ingestion, transformation, and consumption layers.
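
As one illustration of pipeline-level quality gates, the sketch below runs declarative row-count and null checks after a load, assuming a DB-API connection (for example, psycopg2 against PostgreSQL); the table and check definitions are hypothetical.

    # Post-load data quality gates: each check is (name, probe SQL,
    # pass predicate). Any failed predicate is reported by name.
    CHECKS = [
        ("rows_loaded",
         "SELECT COUNT(*) FROM silver.orders",
         lambda n: n > 0),
        ("null_order_ids",
         "SELECT COUNT(*) FROM silver.orders WHERE order_id IS NULL",
         lambda n: n == 0),
    ]

    def run_checks(conn) -> list[str]:
        failures = []
        cur = conn.cursor()
        for name, probe, passes in CHECKS:
            cur.execute(probe)
            value = cur.fetchone()[0]
            if not passes(value):
                failures.append(f"{name}: unexpected value {value}")
        cur.close()
        return failures

In an orchestrated pipeline, a non-empty return value would fail the task so that bad data never reaches the consumption layer.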

AI-Enabled Data Engineering

  • Apply AI and ML techniques to data architecture and operations, including:
      • Intelligent data quality validation and anomaly detection
      • Automated schema drift detection and impact analysis (see the sketch after this list)
      • Query optimization and workload pattern analysis
  • Design data foundations that support ML feature stores, training datasets, and inference pipelines.
  • Collaborate with Data Science teams to ensure data platforms are AI-ready, reproducible, and governed.
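
For the schema drift item above, one minimal approach is to snapshot column metadata from information_schema and diff it against a stored baseline, as sketched below; a psycopg2-style connection is assumed and the table names are hypothetical.

    # Automated schema drift detection: compare a live snapshot of
    # column names and types against a previously stored baseline.
    def snapshot_schema(conn, table: str) -> dict[str, str]:
        cur = conn.cursor()
        cur.execute(
            "SELECT column_name, data_type "
            "FROM information_schema.columns WHERE table_name = %s",
            (table,),
        )
        columns = dict(cur.fetchall())
        cur.close()
        return columns

    def detect_drift(baseline: dict[str, str],
                     current: dict[str, str]) -> dict[str, list[str]]:
        return {
            "added":   sorted(set(current) - set(baseline)),
            "removed": sorted(set(baseline) - set(current)),
            "retyped": sorted(c for c in baseline.keys() & current.keys()
                              if baseline[c] != current[c]),
        }

Any non-empty bucket can then feed impact analysis, flagging downstream models that reference an added, removed, or retyped column before changes are deployed.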

Automation, DevOps & Infrastructure as Code

  • Build and manage data infrastructure as code using Terraform and cloud-native services.
  • Integrate data platforms into CI/CD pipelines, enabling automated testing, deployment, and rollback of data changes (illustrated after this list).
  • Develop tooling and automation (Python, SQL, APIs) to streamline provisioning, monitoring, and operational workflows.
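
As a small example of this kind of automation, the sketch below takes a safety snapshot of an RDS instance before a schema deployment so the change can be rolled back; it uses boto3, and the instance identifier is a hypothetical placeholder.

    # Pre-deployment safety snapshot for an AWS RDS instance, enabling
    # rollback if the subsequent schema change misbehaves.
    import datetime

    import boto3

    def snapshot_before_deploy(instance_id: str) -> str:
        rds = boto3.client("rds")
        stamp = datetime.datetime.now(datetime.timezone.utc)
        snap_id = f"{instance_id}-predeploy-{stamp:%Y%m%d-%H%M%S}"
        rds.create_db_snapshot(
            DBInstanceIdentifier=instance_id,
            DBSnapshotIdentifier=snap_id,
        )
        # Block until the snapshot is usable before applying changes.
        rds.get_waiter("db_snapshot_available").wait(
            DBSnapshotIdentifier=snap_id,
        )
        return snap_id

A CI/CD job would call this before running migrations and restore from the returned snapshot if the deployment fails.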

Security, Governance & Compliance

  • Implement enterprise-grade data governance, including role-based access control, encryption, masking, and auditing (see the sketch after this list).
  • Enforce data contracts, ownership, and lifecycle management across the lakehouse.
  • Partner with Security and Compliance teams to ensure audit readiness and regulatory alignment.
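
To make the governance items concrete, the sketch below applies a Snowflake column masking policy and a least-privilege grant from Python, reusing a connection like the one in the earlier Medallion sketch; the policy, role, and table names are hypothetical.

    # Column-level governance in Snowflake: mask an identifier for all
    # but a privileged role, and grant read-only access to BI consumers.
    GOVERNANCE_SQL = [
        """CREATE MASKING POLICY IF NOT EXISTS pii_mask AS (val STRING)
           RETURNS STRING ->
           CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
                ELSE '***MASKED***' END""",
        """ALTER TABLE gold.patient_access MODIFY COLUMN patient_id
           SET MASKING POLICY pii_mask""",
        "GRANT SELECT ON ALL TABLES IN SCHEMA gold TO ROLE BI_READER",
    ]

    def apply_governance(conn) -> None:
        cur = conn.cursor()
        try:
            for statement in GOVERNANCE_SQL:
                cur.execute(statement)
        finally:
            cur.close()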

Qualifications

  • 5+ years of experience in data engineering, database engineering, or data platform development in production environments.
  • Strong hands-on experience with Snowflake, including performance tuning, security, and cost optimization.
  • Deep expertise with PostgreSQL and AWS RDS in cloud-native architectures.
  • Proven experience designing lakehouse or modern data warehouse architectures.
  • Strong understanding of Medallion Architecture, semantic layers, and analytics engineering best practices.
  • Experience building and operating advanced ELT pipelines using modern tooling (e.g., dbt, orchestration frameworks).
  • Proficiency with SQL and Python for data transformation, automation, and tooling.
  • Experience with Terraform and infrastructure-as-code for data platforms.
  • Solid understanding of data governance, observability, and reliability engineering.               

What Success Looks Like Within the First 90 Days:

  • Fully onboarded and delivering enhancements to Snowflake and RDS environments.
  • Partnering with DevOps and Product Engineering on data infrastructure improvements.
  • Delivering optimized queries, schemas, and automation for key systems.

Ongoing Outcomes:

  • Consistent improvement in data performance, scalability, and reliability.
  • Effective automation of database provisioning and change management.
  • Continuous collaboration across teams to enhance data availability and governance.

Bonus Experience (Nice to Have)

  • Experience with dbt, AWS Glue, Airflow, or similar orchestration tools.
  • Familiarity with feature stores, ML pipelines, or MLOps workflows.
  • Exposure to data observability platforms and cost optimization strategies.
  • Relevant certifications (Snowflake SnowPro, AWS Database Specialty, etc.).

Additional Information

What does IntegriChain have to offer?

  • Mission driven: Work with the purpose of helping to improve patients' lives! 
  • Excellent and affordable medical benefits + non-medical perks including Flexible Paid Time Off (PTO) and much more!
  • Robust Learning & Development opportunities, including 700+ development courses free to all employees.


IntegriChain is committed to equal treatment and opportunity in all aspects of recruitment, selection, and employment without regard to race, color, religion, national origin, ethnicity, age, sex, marital status, physical or mental disability, gender identity, sexual orientation, veteran or military status, or any other category protected under the law. IntegriChain is an equal opportunity employer committed to creating a community of inclusion and an environment free from discrimination, harassment, and retaliation.
