Lead Data Engineer – Palantir & PySpark
- Contract
Company Description
Arthur Grand Technologies (www.arthurgrand.com) is in the business of providing staffing and technology consulting services. We have doubled our revenue year over year for the past 5 years, which speaks to the long-lasting relationships and customer satisfaction we have built in this short span of time. Our company is managed by a team of professionals who have worked for Big 5 consulting firms for 20+ years.
We are a minority-owned staff augmentation and technology consulting company.
To retain our valued employees, we keep them engaged in challenging, interesting work, offer market-relevant benefits, and provide continued opportunities for professional growth.
Job Description
Job Title: Lead Data Engineer – Palantir & PySpark
Location: Remote
Job Summary:
We are seeking a highly skilled Data Engineer with hands-on experience in Palantir (Foundry preferred), PySpark, and exposure to reinsurance or insurance data environments. The ideal candidate will play a key role in building scalable data pipelines, optimizing ETL workflows, and enabling advanced analytics and reporting capabilities. This role requires a strong technical foundation in data engineering combined with an understanding of the reinsurance business domain.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL workflows using PySpark, SQL, and Palantir Foundry.
- Collaborate with data architects, business analysts, and actuarial teams to understand reinsurance data models and transform complex datasets into usable formats.
- Build and optimize data ingestion, transformation, and validation processes to support analytical and reporting use cases.
- Work within the Palantir Foundry platform to design robust workflows, manage datasets, and ensure efficient data lineage and governance.
- Ensure data security, compliance, and governance in line with industry and client standards.
- Identify opportunities for automation and process improvement across data systems and integrations.
Required Skills & Qualifications:
- 6–10 years of overall experience in data engineering roles.
- Strong hands-on expertise in PySpark (DataFrames, RDDs, performance optimization).
- Proven experience working with Palantir Foundry or similar data integration platforms.
- Good understanding of reinsurance, including exposure, claims, and policy data structures.
- Proficiency in SQL, Python, and working with large datasets in distributed environments.
- Experience with cloud platforms (AWS, Azure, or GCP) and related data services (e.g., S3, Snowflake, Databricks).
- Knowledge of data modeling, metadata management, and data governance frameworks.
- Familiarity with CI/CD pipelines, version control (Git), and Agile delivery methodologies.
Preferred Skills:
- Experience with data warehousing and reporting modernization projects in the reinsurance domain.
- Exposure to Palantir ontology design and data operationalization.
- Working knowledge of APIs, REST services, and event-driven architecture.
- Understanding of actuarial data flows, submission processes, and underwriting analytics is a plus.
Thanks,
Afrah Faiza
Arthur Grand Technologies Inc
Arthur Grand Technologies is an Equal Opportunity Employer (including disability/vets)
Additional Information
All your information will be kept confidential according to EEO guidelines.