Interop Data Engineer
Job Location: Alexander City, USA

Salary: Not Disclosed

Job Description

Sr Interop Data Engineer
Location: Remote
Experience Required: 9 Years
Education: Engineering Degree (BE/ME/BTech/MTech/BSc/MSc)
Technical certification in multiple technologies is desirable.

What's in it for you

As a Sr Interop Data Engineer, you'll join an Agile team focused on building healthcare applications and implementing new features. You'll play a key role in developing interoperable solutions, ensuring high performance and scalability while adhering to industry best practices.

Key Responsibilities:

  • Design, develop, and maintain APIs with a focus on .NET to ensure high performance and scalability.
  • Implement and manage interoperability standards such as HL7 V2/V3, ADT, FHIR, and CCDA.
  • Utilize Firely and Azure FHIR services for effective data integration and interoperability.
  • Collaborate with cross-functional teams to gather requirements and deliver business and technical solutions.
  • Ensure data quality, integrity, and security across all interoperability processes.
  • Provide technical leadership and mentorship to junior engineers.
  • Stay updated with industry trends and leverage new technologies to enhance interoperability practices.
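To give a flavor of the HL7 V2/ADT work described above, here is a minimal, self-contained sketch that splits an ADT message into segments and fields using only the Python standard library. The sample message, field positions, and helper names are hypothetical illustrations; real projects would typically route such messages through an integration engine like Mirth Connect or a dedicated HL7 library.

```python
# Hypothetical HL7 V2 ADT^A01 message; segments are separated by carriage returns.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|20240101120000||ADT^A01|MSG0001|P|2.5",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F",
    "PV1|1|I|ICU^101^A",
])

def parse_hl7(message: str) -> dict:
    """Split an HL7 V2 message into {segment_id: [field lists]}."""
    segments: dict = {}
    for raw in message.split("\r"):
        fields = raw.split("|")
        # fields[0] is the segment ID (MSH, PID, ...); the rest are its fields.
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

def patient_name(segments: dict) -> str:
    """Extract 'GIVEN FAMILY' from the first PID segment (PID-5, FAMILY^GIVEN)."""
    pid = segments["PID"][0]
    family, given = pid[4].split("^")[:2]
    return f"{given} {family}"

segments = parse_hl7(SAMPLE_ADT)
print(segments["MSH"][0][7])   # message type, MSH-9 -> ADT^A01
print(patient_name(segments))  # -> JANE DOE
```

A sketch like this only covers the happy path; production interoperability code must also handle escape sequences, repetitions, and component/sub-component separators defined by the HL7 V2 encoding characters.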

Mandatory Skills:

  • Strong knowledge of HL7 V2/V3, ADT, FHIR, and CCDA standards.
  • Experience with integration engines such as Mirth Connect.
  • Proficiency in developing and managing APIs (primarily Python, with minimal Node/React.js).
  • Experience with Firely and Azure FHIR services.
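As a companion to the FHIR skills listed above, the sketch below assembles a minimal FHIR R4 Patient resource with the standard library. The identifier system URI and function name are hypothetical; in practice this payload would be POSTed to a FHIR server endpoint (for example an Azure FHIR service) or built with an SDK such as Firely.

```python
import json

def make_patient(mrn: str, family: str, given: str, birth_date: str) -> dict:
    """Return a minimal FHIR R4 Patient resource as a plain dict."""
    return {
        "resourceType": "Patient",
        # "urn:example:mrn" is a placeholder identifier system, not a real URI.
        "identifier": [{"system": "urn:example:mrn", "value": mrn}],
        "name": [{"family": family, "given": [given]}],
        "birthDate": birth_date,  # FHIR dates use YYYY-MM-DD
    }

patient = make_patient("12345", "Doe", "Jane", "1980-01-01")
body = json.dumps(patient)  # serialized request body for a POST /Patient call
```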

Good to Have:

  • Expertise in .NET development.
  • Experience with API gateways such as Apigee Hybrid and DataPower.
  • Knowledge of GitHub in Azure Cloud and Azure Repos in GCP.
  • Experience building and managing pipelines using StreamSets Scheduler, CA Automic Dollar Universe 6, cron jobs, and Databricks Scheduler.
  • Familiarity with container orchestration using Kubernetes (k8s).
  • Proficiency in ETL functions using StreamSets or similar tools (e.g., Databricks).
  • Hands-on experience with databases such as MongoDB, Snowflake, and PostgreSQL.
  • Experience with event streaming tools such as Confluent Kafka.

Employment Type

Full Time

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. Our clients never endorse requests for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please reach out via the contact-us page.