- Initial contract duration of 4 months, with a further 12-month extension
- Number of extensions: 1
- Location: ACT
- Must have baseline clearance
- Working Arrangement: Hybrid
- This role is Canberra-based and candidates must be in the office a minimum of 3 days per week
- Maximum hours: 40 hours per week
About the Role:
SoftLabs is seeking a Senior Data Engineer for ICT labour hire at their technology consulting practice based in the ACT.
Job details
The Department of Agriculture, Fisheries and Forestry (DAFF) is looking for a Data Engineer to join the Digital Transformation Program in the Australian Bureau of Agricultural and Resource Economics and Sciences (ABARES), working across several data and analytics platforms. The candidate will develop and optimise data pipelines in Azure Databricks, with a strong focus on Python and SQL. The candidate will have expertise in Azure Data Factory, Azure DevOps CI/CD and Git version control, as well as a deep understanding of Kimball dimensional modelling and the Medallion architecture. This role requires strong collaboration skills to translate business requirements into effective technical solutions.
Key duties and responsibilities
- Develop, optimise and maintain data pipelines using Python and SQL within Azure Databricks Notebooks.
- Design and implement ETL/ELT workflows in Azure Data Factory, ensuring efficient data transformation and loading.
- Apply Kimball dimensional modelling and Medallion architecture best practices for scalable and structured data solutions.
- Collaborate with team members and business stakeholders to understand data requirements and translate them into technical solutions.
- Implement and maintain CI/CD pipelines using Azure DevOps, ensuring automated deployments and version control with Git.
- Monitor, troubleshoot and optimise Databricks jobs and queries for performance and efficiency.
- Work closely with data analysts and business intelligence teams to provide well-structured, high-quality datasets for reporting and analytics.
- Ensure compliance with data governance, security and privacy best practices.
- Contribute to code quality improvement through peer reviews, best practices and knowledge sharing.
Preferred Skills & Experience:
- Strong proficiency in Python for data transformation, automation and pipeline development.
- Advanced SQL skills for query optimisation and performance tuning in Databricks Notebooks.
- Hands-on experience with Azure Databricks for large-scale data processing.
- Expertise in Azure Data Factory for orchestrating and automating data workflows.
- Experience with Azure DevOps including setting up CI/CD pipelines and managing code repositories with Git.
- Strong understanding of Kimball dimensional modelling (fact and dimension tables, star/snowflake schemas) for enterprise data warehousing.
- Knowledge of Medallion architecture for structuring data lakes with bronze, silver and gold layers.
- Familiarity with data modelling best practices for analytics and business intelligence.
- Strong analytical and problem-solving skills with a proactive approach to identifying and resolving issues.
- Excellent collaboration and communication skills, with the ability to engage both technical and business stakeholders effectively.
Essential Criteria
1. Demonstrated experience in .NET and Azure application development, including Function Apps, Logic Apps and Web Apps.
2. Proven ability to design and implement data pipelines using Azure Data Factory, with a preference for candidates experienced in integrating data from Unity Catalog.
3. Experience working collaboratively within agile development teams and with DevOps practices.
4. Strong problem-solving skills with the ability to analyse and resolve complex integration challenges.
Application Deadline: Monday 17 February 2025
Expected Start Date: Monday 03 March 2025
Job Type: Contract
Rate: As per Australian Market Standards
If you are interested in this position, please click Apply with your resume in Word format and send your details for review. If you wish to have a confidential discussion, please contact us for more information.