Technical Lead / Data Engineering Specialist (Databricks, Azure Data Lake, Python)
Greenfield, IN 46140 (Onsite)
12-month Contract
Pay rate: Market/Flexible
We are hiring a highly skilled Technical Lead/Data Engineering Specialist with extensive experience in cloud technologies, DevOps development practices, and data engineering to support and enhance RDAP initiatives.
Key Responsibilities:
- Design, develop, and maintain Databricks Lakehouse solutions sourcing from cloud platforms such as Azure Synapse and GCP
- Implement and manage DevOps and CI/CD workflows using tools like GitHub
- Apply best practices in test-driven development, code review, branching strategies, and deployment processes
- Build, manage, and optimize Python packages using tools like setuptools, Poetry, wheels, and artifact registries
- Develop and optimize data pipelines and workflows in Databricks utilizing PySpark and Databricks Asset Bundles
- Manage and query SQL databases (Unity Catalog, SQL Server, Hive, Postgres)
- Implement orchestration solutions using Databricks Workflows, Airflow, and Dagster
- Work with event-driven architectures using Kafka, Azure Event Hubs, and Google Cloud Pub/Sub
- Develop and maintain Change Data Capture (CDC) solutions using tools like Debezium
- Bring extensive experience in the design and implementation of data migration projects, specifically involving Azure Synapse and Databricks Lakehouse
- Manage cloud storage solutions, including Azure Data Lake Storage and Google Cloud Storage
- Configure and manage identity and access solutions using Azure Active Directory, including AD Groups, Service Principals, and Managed Identities
- Interact effectively with the customer to understand requirements, participate in design discussions, and translate requirements into deliverables by working with the offshore development team; collaborate effectively with cross-functional teams across development, operations, and business units; strong interpersonal skills to build and maintain productive relationships with team members
- Problem-solving and analytical thinking: capability to troubleshoot and resolve issues efficiently; analytical mindset for optimizing workflows and improving system performance
- Ability to convey complex technical concepts clearly and concisely to both technical and non-technical stakeholders; strong documentation skills for creating process guidelines, technical workflows, and reports
Technologies, Skills & Experience:
- Databricks (PySpark, Databricks Asset Bundles)
- Python package builds (setuptools, Poetry, wheels, artifact registries)
- Open file formats (Delta, Parquet, Iceberg, etc.)
- SQL databases (Unity Catalog, SQL Server, Hive, Postgres)
- Orchestration tools (Databricks Workflows, Airflow, Dagster)
- Azure Data Lake Storage; Azure Active Directory (AD Groups, Service Principals, Managed Identities)
- Secondary/nice-to-have skills: Kafka, Azure Event Hubs, Cloud Pub/Sub; Change Data Capture (Debezium); Google Cloud Storage
- Bachelor's Degree in Computer Science, Information Technology, or a related field, with 12 years of experience
Best regards,
Asra Mohammad
Talent Acquisition Specialist