Sr Interop Data Engineer
Location: Remote
Experience Required: 9 Years
Education: Engineering Degree (BE/ME/BTech/MTech/BSc/MSc)
Technical certification in multiple technologies is desirable.
What's in it for you
As a Sr Interop Data Engineer, you'll join an Agile team focused on building healthcare applications and implementing new features. You'll play a key role in developing interoperable solutions, ensuring high performance and scalability while adhering to industry best practices.
Key Responsibilities:
- Design, develop, and maintain APIs, with a focus on .NET, to ensure high performance and scalability.
- Implement and manage interoperability standards such as HL7 V2/V3 (ADT), FHIR, and CCDA.
- Utilize Firely and Azure FHIR services for effective data integration and interoperability.
- Collaborate with cross-functional teams to gather requirements and deliver business and technical solutions.
- Ensure data quality, integrity, and security across all interoperability processes.
- Provide technical leadership and mentorship to junior engineers.
- Stay current with industry trends and leverage new technologies to enhance interoperability practices.
Mandatory Skills:
- Strong knowledge of HL7 V2/V3 (ADT), FHIR, and CCDA standards.
- Experience with integration engines like Mirth Connect.
- Proficiency in developing and managing APIs (primarily Python; minimal Node/React.js).
- Experience with Firely and Azure FHIR services.
Good to Have:
- Expertise in .NET development.
- Experience with API gateways such as Apigee Hybrid and IBM DataPower.
- Knowledge of GitHub and Azure Repos across Azure Cloud and GCP environments.
- Experience building and managing pipelines using StreamSets Scheduler, CA Automic Dollar Universe 6, cron jobs, and Databricks Scheduler.
- Familiarity with container orchestration using Kubernetes (k8s).
- Proficiency in ETL functions using StreamSets or similar tools (e.g., Databricks).
- Hands-on experience with databases such as MongoDB, Snowflake, and PostgreSQL.
- Experience with event streaming tools like Confluent Kafka.