Job Title: Data Engineer
Location: Seattle, WA (Onsite)
Duration: 6 Months
Implementation Partner: Blue.cloud
End Client: To be disclosed
Key Responsibilities:
- Develop, optimize, and maintain data pipelines using Azure Data Factory (ADF), DBT Labs, Snowflake, and Databricks.
- Create reusable jobs and a configuration-based integration framework to optimize development and scalability.
- Manage data ingestion for structured and unstructured data:
  - Landing/Lakehouse: ADLS.
  - Sources: ADLS, Salesforce, SharePoint Document Libraries.
  - Partner Data: DHS, IHME, WASDE, etc.
- Implement and optimize ELT processes, source-to-target mapping, and transformation logic in tools such as DBT Labs, Azure Data Factory, Databricks Notebooks, and SnowSQL.
- Collaborate with data scientists, analysts, data engineers, report developers, and infrastructure engineers to provide end-to-end support.
- Co-develop CI/CD best practices, automation, and pipelines with infrastructure engineers for code deployments using GitHub Actions.
- Automate source-to-target mappings, data pipelines, and data lineage in Collibra.
Required Experience:
- Hands-on experience building pipelines with ADF, Snowflake, Databricks, and DBT Labs.
- Expertise in Azure Cloud, with integration experience involving Databricks, Snowflake, and ADLS Gen2.
- Proficient in data warehousing and lakehouse concepts, including ELT processes, Delta Tables, and External Tables for structured/unstructured data.
- Experience with Databricks Unity Catalog and data-sharing technologies.
- Strong skills in CI/CD tools (Azure DevOps, GitHub Actions) and version control systems (GitHub).
- Proven cross-functional collaboration and technical support experience for data scientists, report developers, and analysts.