Design, develop, and maintain the data architecture, data models, and standards for various Data Integration & Data Warehousing projects in Snowflake, combined with other technologies.
Develop data solutions to enable new product & BI development for the enterprise and its multiple lines of business.
Develop and maintain documentation of the data architecture, data flows and data models.
Aid in the implementation of data governance.
Mentor and collaborate with other bright engineers (Data, Software, & DevOps).

Do you have what it takes?

Required Experience:
A deep understanding of data architecture principles and data warehouse methodologies, specifically Kimball or Data Vault.
Adept at ETL/ELT development and optimization.
Skilled in a cloud database technology.
5+ years of data architecture and data modeling experience.
Highly skilled in SQL.
Proficiency in an object-oriented, high-level programming language.
Bachelor's degree in Computer Science or a related field.

Preferred Experience:
3+ years of hands-on architecting and building cloud data solutions in Snowflake.
3+ years of developing data solutions with Python.
Proficiency working with large, structured data sets (1+ billion rows / 5+ terabytes).
Demonstrated use and knowledge of Kafka.
Familiarity with Google Cloud Platform.
Strong understanding of Data Governance.
Preferred Certifications:
Snowflake
StreamSets
GCP
Additional Skills:
MDM
Cloud technologies: GCP, Snowflake, BigQuery, GCP Data Fusion, Dataflow
SQL and any relational database (e.g., PostgreSQL)
FHIR knowledge
HVR (optional)
Data transformation and setting up data pipelines using GCP
Strong understanding of GCP Infrastructure, GCP Security, and GCP data tools and applications.
Healthcare and FHIR experience is a huge plus.
BigQuery, and how it compares to Snowflake, is a plus.
Full Time