Job Title: Lead Data/Cloud Engineer
Experience: 7 years of experience in data engineering, cloud platforms, and API development.
Location: Remote
What's in it for you?
As a Lead Data/Cloud Engineer, you will be part of an Agile team working on healthcare applications, developing and implementing new features while adhering to the best coding and development standards. This role is central to building scalable, robust, and secure data infrastructure for a healthcare interoperability platform.
Mandatory Skills:
- Expertise in managing multi-cloud infrastructures (50% GCP, 25% Azure, 25% on-premises).
- Strong knowledge of container orchestration (Kubernetes, Docker).
- Experience with StreamSets, Azure Data Factory, and Databricks for ETL processes.
- Proficiency in API development with .NET and familiarity with Python.
- API gateway management (e.g., Apigee, DataPower).
- Hands-on experience with Kafka and StreamSets for event streaming.
- Expertise in databases such as Snowflake, MongoDB, and FHIR servers.
- Understanding of Firely and Azure FHIR services.
Good-to-Have Skills:
- Familiarity with healthcare payer systems and regulatory standards like HIPAA.
- Knowledge of programming languages such as .NET and Python, along with healthcare standards like HL7 v2/v3, ADT, FHIR, and C-CDA.
- Experience with other GCP and Azure cloud services.
Responsibilities: We are seeking an experienced Lead Data/Cloud Engineer to join our healthcare interoperability team. The ideal candidate will have a strong background in data engineering, cloud infrastructure (primarily GCP and Azure), ETL processes, and API development. Your role will focus on managing and optimizing data pipelines across multi-cloud environments and on-premises infrastructure.
Key responsibilities include:
- Design, build, and maintain scalable data pipelines using StreamSets, Azure Data Factory, and GCP-related services.
- Manage multi-cloud environments with 50% on GCP, 25% on Azure, and 25% on on-premises infrastructure.
- Lead API development within the .NET framework, with some work in Python.
- Develop and maintain event streaming services using Kafka and StreamSets.
- Administer databases including Snowflake, MongoDB, and FHIR servers (Firely, Azure FHIR).
- Ensure compliance with healthcare standards such as UDAP, CDEX, FHIR, HL7, and ADT.
- Work with internal and external application teams to ensure seamless integration using healthcare protocols such as FHIR, C-CDA, and HL7.
- Implement and manage API gateways using tools like Apigee and DataPower.
- Provide technical guidance and mentorship to junior engineers.
- Monitor and optimize the performance and scalability of data solutions.
- Stay updated with the latest healthcare interoperability trends and cloud technologies.
Educational Qualifications:
- Bachelor's or Master's degree in Engineering, Computer Science, or a related field.
- Technical certifications in multiple technologies are a plus.