Required Skills
4 years of enterprise data warehouse development with SQL, Python, Azure Data Factory (ADF), Azure Databricks, Kafka, and ideally Snowflake experience
Job Description
Looking for a Data Engineer with SQL, Python, ADF, Databricks, Kafka, and ideally Snowflake experience to work 100% remote.
- Data Engineer to develop CI/CD data pipelines and ETL processes to curate and transform pharmacy data from different sources (both on-premises and cloud).
- Build robust, secure, large-scale databases to be stored in a data lake or data warehouse.
- Automate data workflows and processes, including ingestion, cleaning, structuring, and formatting of data.
- Build engineering solutions that support ML/data science projects.
- Collaborate with Data Scientists and business partners to deploy machine learning models in production (preferred).
- Work extensively with different kinds of datasets, including text, voice, and image data, both structured and unstructured.
Requirements:
- 4-6 years of enterprise data warehouse development, preferably on SQL or Snowflake
- 4-6 years of experience creating, enhancing, and maintaining ETL frameworks using ETL tools.
- Hands-on experience with SQL, PL/SQL, Python, and/or shell scripting is a must.
- Experience migrating from RDBMS to Snowflake is a plus.
- End-to-end dataflow design and development experience is a plus.
- Hands-on experience with Azure Cloud, including Azure Data Factory and Databricks.
- Experience setting up Kafka and connecting it to Databricks.