Job Description:
Responsibilities:
Overall 7-8 years of experience, with a minimum of 5 years of relevant professional work experience.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Collaborate with data analysts to understand data requirements and create effective data workflows.
- Create and maintain data storage solutions, including Azure SQL Database and Azure Data Lake.
- Build and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory (ADF) pipelines.
- Hands-on experience with Databricks for implementing transformations and Delta Lake.
- Hands-on experience with serverless SQL pools and dedicated SQL pools.
- Use ADF pipelines to orchestrate end-to-end data transformation, including the execution of Databricks notebooks.
- Experience working with the Medallion Architecture.
- Experience building CI/CD pipelines using Azure DevOps.
- Integrating both ADF and Azure Databricks (ADB) with Azure DevOps.
- Creating and managing Azure infrastructure across the landscape using Bicep.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and reliability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline problems to guarantee data consistency and availability.
- Good to have: exposure to Power BI and Power Automate.
- Good to have: knowledge of Azure fundamentals.
Desirable skills:
- Strong ability to anticipate issues and formulate remedial actions
- Sound interpersonal and teamworking skills
- Sound knowledge of unit testing methodologies and frameworks
- Should be able to work independently
- Good communication skills
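The unit-testing expectation above can be illustrated with a minimal sketch (the transformation and test names here are hypothetical; the actual framework, e.g. pytest or unittest, would depend on the team's stack):

```python
def normalize_country_code(raw: str) -> str:
    """Hypothetical pipeline transformation: trim and upper-case a country code."""
    return raw.strip().upper()

# pytest-style unit tests: any function named test_* is collected and run.
def test_trims_and_uppercases():
    assert normalize_country_code(" us ") == "US"

def test_already_normalized_is_unchanged():
    assert normalize_country_code("DE") == "DE"
```

Keeping transformations as small pure functions like this makes them testable outside the pipeline runtime.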
Remote Work :
No