Description:
- The candidates will perform Data Science Developer work by creating, enhancing, maintaining, supporting, and sustaining existing pipelines, analytics models, and data products for the Ministry of Transportation and the Ministry of Labour, Immigration, Training and Skills Development.
Specific Deliverables:
- Expected deliverables may include:
- Creating, enhancing, maintaining, and supporting structures for storing data in formats suitable for consumption by analytics solutions.
- Automating data pipelines used to ingest, prepare, transform, and model data for use in analytics products.
- Creating, enhancing, maintaining, and supporting dashboards and reports.
- Creating, enhancing, maintaining, and supporting analytics environments, and implementing new technology to improve performance, simplify architecture patterns, and reduce cloud hosting costs.
- Knowledge transfer sessions and documentation for technical staff related to architecting, designing, and implementing continuous-improvement enhancements to analytics solutions. These sessions will be held as needed, on a case-by-case basis, and will involve walkthroughs of documentation, code, and environment setups.
Requirements
Experience and Skill Set Requirements:
Must Haves:
- 3-5+ years of experience with Azure Databricks Lakehouse
- 3-5+ years of experience with Azure Data Factory
- 3-5+ years of experience with Python
- 3-5+ years of experience with SQL
Skill Set Requirements:
Data Storage and Preparation:
- The candidate must demonstrate their experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse structures in real-world implementations
Data Pipelines:
- The candidate must demonstrate their experience with automating data pipelines using appropriate Microsoft Azure platform technologies (Python, Databricks, and Azure Data Factory)
Data Analytics:
- The candidate must demonstrate their experience with Power BI reports and dashboards
Knowledge Transfer:
- The candidate must demonstrate experience in conducting knowledge transfer sessions and building documentation for technical staff related to architecting, designing, and implementing end-to-end analytics solutions