Specific Deliverables
Expected deliverables may include:
- Creating, enhancing, maintaining, and supporting data storage structures in formats suitable for consumption by analytics solutions.
- Automating data pipelines that ingest, prepare, transform, and model data for use in analytics products.
- Creating, enhancing, maintaining, and supporting dashboards and reports.
- Creating, enhancing, maintaining, and supporting analytics environments, and implementing new technology to improve performance, simplify architecture patterns, and reduce cloud hosting costs.
- Knowledge transfer sessions and documentation for technical staff related to architecting, designing, and implementing continuous-improvement enhancements to analytics solutions. These sessions will be held as needed, on a case-by-case basis, and will involve walkthroughs of documentation, code, and environment setups.
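To illustrate the ingest/prepare/transform work described above, the following is a minimal sketch of one pipeline step in plain Python. The data, column names, and cleaning rules are hypothetical; in a real implementation this logic would run as a Databricks notebook or an Azure Data Factory activity reading from Azure Data Lake storage rather than from an inline string.

```python
import csv
from datetime import date
from io import StringIO

# Hypothetical raw extract; a real pipeline would ingest this from
# Azure Data Lake storage via Data Factory or Databricks.
RAW_CSV = """order_id,amount,order_date
1,100.50,2024-01-05
2,,2024-01-06
3,75.25,2024-01-07
"""

def prepare(rows):
    """Prepare/transform step: type the fields and drop incomplete rows."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # drop rows missing a required measure
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return cleaned

records = prepare(csv.DictReader(StringIO(RAW_CSV)))
print(len(records))  # prints 2: one incomplete row was dropped
```

In production, the cleaned records would then be written to a columnar format (e.g., Parquet or Delta tables in the lakehouse) so downstream analytics products such as Power BI can consume them efficiently.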
Requirements
Data Storage and Preparation
- The candidate must demonstrate their experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse structures in real-world implementations.
Data Pipelines
- The candidate must demonstrate their experience with automating data pipelines using appropriate Microsoft Azure platform technologies (Python, Databricks, and Azure Data Factory).
Data Analytics
- The candidate must demonstrate their experience building Power BI reports and dashboards.
Knowledge Transfer
- The candidate must demonstrate experience in conducting knowledge transfer sessions and building documentation for technical staff related to architecting, designing, and implementing end-to-end analytics solutions.
Must Have:
- Experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse structures.
- Experience with Python, Databricks, and Azure Data Factory.
- Experience building Power BI reports and dashboards.
Nice to Have:
- Prior experience with high-volume / big data projects.