Job Title: Databricks Engineer (Insurance Domain)
Location: North Carolina (Remote)
Work Authorization: Green Card (GC) and U.S. Citizen (USC) only
Job Summary:
We are seeking an experienced Databricks Engineer with a strong background in the insurance industry to join our team. The ideal candidate will have a proven track record of building, managing, and optimizing scalable data pipelines using Databricks, along with an in-depth understanding of insurance data and processes. This role will play a critical part in delivering data solutions that support our clients' insurance operations.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Databricks and Apache Spark to support analytics, reporting, and insurance-related business processes.
- Collaborate with insurance industry stakeholders to gather requirements and translate them into technical solutions.
- Build and optimize large-scale batch and real-time data processing workflows, ensuring high performance and reliability.
- Integrate Databricks with cloud-based data services (Azure, AWS, or GCP) to build efficient solutions.
- Implement best practices for data modeling, transformation, and architecture for complex insurance use cases (e.g., policy lifecycle, claims processing, underwriting).
- Perform in-depth data analysis and reporting using Databricks to generate actionable insights for insurance teams.
- Work with data scientists to enable machine learning models for predictive analytics in insurance (e.g., fraud detection, risk assessment).
- Ensure compliance with regulations such as GDPR, HIPAA, and PCI DSS when handling sensitive insurance data.
- Monitor and troubleshoot performance, data quality, and system reliability across data pipelines and applications.
Required Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
- 3+ years of experience working with Databricks and Apache Spark in a production environment.
- Strong understanding of the insurance industry, including familiarity with data from claims, policies, underwriting, and customer behavior.
- Proficiency in Python, SQL, and Scala for data manipulation and processing.
- Experience with cloud platforms (Azure, AWS, GCP) and their integration with Databricks.
- Knowledge of data warehousing, ETL pipelines, and big data technologies.
- Familiarity with insurance regulations and data compliance requirements.
- Strong analytical and problem-solving skills.
Preferred Qualifications:
- Experience with Delta Lake and other advanced Databricks features.
- Understanding of machine learning frameworks applied in the insurance domain (e.g., risk models, actuarial models).
- Certifications in Databricks or relevant cloud platforms.
- Hands-on experience with data visualization tools such as Power BI, Tableau, or Databricks visualizations.
Soft Skills:
- Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders in the insurance industry.
- Team-oriented, with strong problem-solving abilities.
- High attention to detail, ensuring data accuracy and security.