Work Location Options:
- 100% Remote (EST time zone)
- Phoenix, Arizona (AZ time)
Interview Process: 2 rounds via video. A technical architect from the team will conduct the interviews; the questions will be technical, as we are looking for someone who can join and get started quickly.
Project: Value Based Reimbursement
- Critical claims datasets need to be ingested into the cloud in real time; their role in Value-Based Reimbursement adds another layer of importance.
- Improving encounter accuracy metrics and the encounter submission process through claims dataset ingestion into OnePaaS (Cloud) for all lines of business is essential for aligning with UHC's goals for delegated providers, and directly impacts provider RAF (Risk Adjustment Factor) scores.
- Additionally, the 15-day lag from the clearinghouse, which delays provider outreach for rejected encounters, is a critical issue that this new solution can effectively address.
Team: 4 members (2 onshore, 2 offshore) plus an architect to manage the delivery
Responsibilities:
- Accountable for data engineering lifecycle including research, proof of concepts, design, development, test, deployment and maintenance
- Design, develop, implement, and run cross-domain data solutions that are modular, optimized, flexible, scalable, secure, and reliable, transforming data for meaningful analysis and analytics while ensuring operability
- Design, develop, implement, and run data solutions and data pipelines at petabyte scale
- Layer instrumentation into the development process so that data pipelines can be monitored. Measurements are used to detect internal problems before they result in user-visible outages or data quality issues
- Build processes and diagnostics tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues
- Embrace continuous learning of engineering practices to ensure adoption of industry best practices and technology, including DevOps, cloud, and Agile thinking
- Contribute to our industry community and strive to reuse and share components across the organization wherever possible
- Maintain high quality documentation of data definitions, transformations, and processes to ensure data governance and security
- Identify solutions to non-standard requests and problems
- Solve moderately complex problems and/or conduct moderately complex analyses
- Ingest data from Hadoop sources to the cloud
- Design and develop ETL workflows for migrating data from on-premises systems to the Azure cloud
- Develop CI/CD workflows on Azure for continuous deployment of projects
- Use Databricks and Azure Data Factory to migrate SSIS workflows to the cloud
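The instrumentation responsibility above (measuring pipelines so internal problems surface before they become user-visible outages) can be sketched with a simple Python decorator. This is a minimal illustration, not part of the actual stack: the step name `drop_rejected`, the `min_rows` threshold, and the record shape are all hypothetical.

```python
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def instrumented(min_rows=1):
    """Wrap a pipeline step to record duration and row counts,
    warning when output falls below an expected threshold."""
    def decorator(step):
        @wraps(step)
        def wrapper(rows):
            start = time.monotonic()
            result = step(rows)
            elapsed = time.monotonic() - start
            log.info("%s: %d rows in, %d rows out, %.3fs",
                     step.__name__, len(rows), len(result), elapsed)
            if len(result) < min_rows:
                # Anomaly detected internally, before downstream
                # consumers see missing or incomplete data.
                log.warning("%s: output below threshold (%d < %d)",
                            step.__name__, len(result), min_rows)
            return result
        return wrapper
    return decorator

@instrumented(min_rows=1)
def drop_rejected(rows):
    # Hypothetical step: filter out rejected encounter records.
    return [r for r in rows if r.get("status") != "rejected"]
```

In a real deployment the same measurements would feed a monitoring system rather than a logger, but the pattern (wrap each step, emit counts and timings, alert on thresholds) is the same.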
Required:
- Expert in Python and the React framework
- Expert in Terraform and Azure infrastructure
- At least 8 years of active development experience with Python and .NET
- At least 8 years of active development experience with UI technologies such as React
Preferred:
- At least 5 years of cloud experience
- Python or Scala experience
- Databricks and ADF experience
- UI or API build experience
- Real time integration (Kafka)
Required Skills : React, Python
Additional Skills : Software Developer