100% TELECOMMUTE
**Work Hours: 9 am to 5 pm Hawaii Standard Time**
Interview Process: 3 rounds via video.
Description: Optum is currently seeking a hands-on Senior Data Engineer to support our HI EDW within the Enterprise Data Warehouse and Analytics group.
- The Data Engineer will work with large healthcare datasets and will translate clients' business requirements into enterprise systems, applications, or process designs for large, complex health data solutions.
- The role will drive and support initiatives for the HI EDW and participate in the wider EDW group's areas of data usage and governance, information management, privacy and security, SOA, data analytics and visualization, and information modeling.
Ideal Background: Strong technical experience in ADF and Snowflake
TOP REQUIREMENTS:
- Snowflake and Azure Data Factory are the key skills for this role.
- 5 years of Data engineering experience with a focus on Data Warehousing
- 2 years of experience creating pipelines in Azure Data Factory (ADF)
- 5 years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
Required:
- 5 years of Data engineering experience with a focus on Data Warehousing
- 2 years of experience creating pipelines in Azure Data Factory (ADF)
- 5 years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
- 5 years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.
- 3 years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
- 2 years of experience with GitHub, SVN, or similar source control systems
- 2 years of experience processing structured and unstructured data.
- Experience with HL7 and FHIR standards and processing files in these formats (a brief FHIR-processing sketch follows this list).
- 3 years of experience analyzing project requirements and developing detailed ETL specifications.
- Excellent problem-solving and analytical skills with the ability to troubleshoot and optimize data pipelines.
- Ability to adapt to evolving technologies and changing business requirements.
- Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business
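As a rough illustration of the FHIR file processing mentioned in the list above, the sketch below flattens a FHIR Bundle (JSON) into rows that could be staged in a warehouse table. Only the standard Bundle layout (entry → resource → resourceType/id/meta) is assumed; the file name and the chosen fields are placeholders, not anything specified in this posting.

```python
import json


def flatten_fhir_bundle(path):
    """Read a FHIR Bundle (JSON) and yield one flat dict per resource.

    Assumes the standard Bundle layout: a top-level "entry" list whose
    items each carry a "resource" object with "resourceType" and "id".
    """
    with open(path, encoding="utf-8") as handle:
        bundle = json.load(handle)

    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        yield {
            "resource_type": resource.get("resourceType"),
            "resource_id": resource.get("id"),
            # lastUpdated lives in the resource's meta block when present
            "last_updated": resource.get("meta", {}).get("lastUpdated"),
        }


if __name__ == "__main__":
    # "patients_bundle.json" is a placeholder file name for illustration.
    for row in flatten_fhir_bundle("patients_bundle.json"):
        print(row)
```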
Preferred:
- 2 years of batch or PowerShell scripting
- 2 years of experience with Python scripting (a short Snowflake scripting sketch follows this list).
- 3 years of data modeling experience in a data warehouse environment
- Experience or familiarity with Informatica Intelligent Cloud Services (specifically Data Integration)
- Experience designing and building APIs in Snowflake and ADF (e.g., REST, RPC)
- Experience with State Medicaid / Medicare / Healthcare applications
- Azure certifications related to data engineering or data analytics.
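As noted in the Python bullet above, here is a minimal sketch of the kind of Python-to-Snowflake scripting this role touches, using the snowflake-connector-python package. The environment-variable names and the STG_FHIR_RESOURCE staging table are assumptions for illustration only, not details from this posting.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Connection values are read from environment variables; the variable names
# and the target table (STG_FHIR_RESOURCE) are hypothetical placeholders.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "COMPUTE_WH"),
    database=os.environ.get("SNOWFLAKE_DATABASE", "EDW_DEV"),
    schema=os.environ.get("SNOWFLAKE_SCHEMA", "STAGING"),
)

try:
    cur = conn.cursor()
    # Parameterized insert into a hypothetical staging table.
    cur.execute(
        "INSERT INTO STG_FHIR_RESOURCE (RESOURCE_TYPE, RESOURCE_ID, LAST_UPDATED) "
        "VALUES (%s, %s, %s)",
        ("Patient", "example-123", "2024-01-01T00:00:00Z"),
    )
    conn.commit()
finally:
    conn.close()
```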
Required Skills: ETL, Data Warehouse
Basic Qualification:
Additional Skills: Data Engineer
Background Check: No
Drug Screen: No