Data Engineer - Databricks
Location: Remote
Preference: W2 or 1099 candidates
Required Qualifications:
8 years of hands-on experience in data engineering/ETL using Databricks on AWS or Azure cloud infrastructure and functions.
Deep understanding of data warehousing concepts (dimensional/star schema, SCD2, Data Vault, denormalized OBT) and experience implementing highly performant data ingestion pipelines from multiple sources.
Deep skills in Python/PySpark and SQL.
Experience with CI/CD on Databricks using tools such as Jenkins, GitHub Actions, and the Databricks CLI.
Experience integrating end-to-end Databricks pipelines that take data from source systems to target data repositories, ensuring data quality and consistency are always maintained.
Experience evaluating the performance and applicability of multiple tools against customer requirements.
Experience working within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
Experience with Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT).
Hands-on experience developing batch and streaming data pipelines.
Nice to have: Databricks certifications
Nice to have: experience with AWS Redshift, Snowflake, or Azure Synapse.
Please share resumes.
Email:
Phone: