Job Title: Principal Data Engineer (Technical Architect)
Location/Schedule: Remote
Start Date: ASAP
Bonus/Benefits: NO
Type of contract: C2C / W2, Contract to Hire
Duration: 6 months to start
Visa: USC / GC
How many interviews: 2
Interview Process: Video
Important Skills: See JD
Must Haves: See JD
Certifications (must have or nice to have): Nice to have.
Job Description
**Primary Responsibilities:**
Partner in the design, development, and communication of technology patterns and standards needed to manage healthcare data
Provide technical leadership, guidance, and training to a team of data engineers at the individual and group levels
Collaborate with Product partners to assist in the prioritization and delivery of cloud-based data solutions
Support Agile development practices, including refining user stories and collaborating with Product
Implement data processes and pipelines, incorporating product requirements along with data storage, lifecycle, accessibility, performance, and security standards
Solve complex data development issues within data development approaches
Work with peers across the system to ensure alignment on data capabilities and the software development lifecycle (SDLC)
**Required Qualifications:**
Undergraduate degree or equivalent experience
5 years of PySpark and Python data development and testing experience
5 years of solid SQL or NoSQL experience
3 years of experience designing and implementing efficient, end-to-end, flexible data pipelines
3 years of experience with both batch and near-real-time data architectures
Experience developing and implementing parallel processing software strategies
Experience with cloud-native data development
**Preferred Qualifications:**
Degree in engineering, mathematics, computer science, software development, or another technical field
Experience working in the Agile Development Framework
Experience working with healthcare data
Experience developing and operating Databricks jobs as data pipelines
Experience working with Data Lake or Delta Lake architecture
Knowledge of secondary tooling used for/with data pipelines (e.g., Kafka, Terraform, Jenkins, Docker)
Superior attention to detail
Solid personal drive to continually evolve the current state to improve development efficiency, data quality, and overall pipeline processes
Unique balance of disciplined analytical thinking, creative passion, and a willingness to be hands-on
Excellent problem-solving skills
Excellent software documentation skills
Excellent teamwork and communication skills, and the ability to interact with and influence diverse groups of technical and non-technical people at all levels
Good to have: ETL, Cloud, Microservices, Kafka, DevOps, Open Source, Spring Boot