ETL Developer 8279-1214

Job Location

Toronto - Canada

Monthly Salary

Not Disclosed

Job Description

HM Note: This hybrid contract role requires three (3) days per week in office.


General Responsibilities

Responsibilities:
Works in partnership with clients, advising them on how to use information technology to meet their business objectives or overcome problems, and works to improve the structure and efficiency of an organization's I&IT systems. The I&IT Consultant may be used to provide strategic guidance to organizations with regard to Information Management & IT, IT infrastructures, and the enablement of major business processes through enhancements to IT. Provides subject matter expertise in their field and highly expert technical assistance.

General Skills:
Acts as the technical advisor/expert on all aspects of a specific deliverable
Provides quality assurance/quality control of specific deliverables
Anticipates and resolves problems to ensure that deliverables are completed within budget and to the highest quality, meeting or exceeding expectations
Develops processes and procedures for implementing deliverables
Prepares reports and presentations, including options, recommendations, implementation plans, etc.
Works with clients to define the scope of a project and to determine requirements
Defines software, hardware, and network requirements
Analyzes I&IT requirements, giving independent and objective advice on the use of I&IT
Designs, tests, installs, and monitors new systems; develops solutions for and implements new systems
Familiar with change management principles and methodology
Knowledge and understanding of Information Management principles, concepts, policies, and practices

Additional Responsibilities:
This role will focus on data architecture, data warehousing, data lakes, and analytics. The individual will design, develop, maintain, and optimize ETL (Extract, Transform, Load) processes in Databricks for data warehousing, data lakes, and analytics, working closely with data architects and business teams to ensure the efficient transformation and movement of data to meet business needs, including handling Change Data Capture (CDC) and streaming data.

Review business requirements; familiarize with and understand business rules and the transactional data model
Define the conceptual, logical, and physical models, mapping from data source to curated model and data mart
Analyze requirements and recommend changes to the physical model
Develop scripts for the physical model; create the database and/or delta lake file structure
Access Oracle DB environments; set up the necessary tools for developing the solution
Implement data design methodologies, historical and dimensional models
Perform data profiling; assess data accuracy; design and document data quality and master data management rules
Functionality review, data load review, performance review, and data consistency checks
Help troubleshoot data mart design issues
Review ETL performance with developers and suggest improvements
Participate in end-to-end integrated testing for Full Load and Incremental Load and advise on issues

Tools used are:
Azure Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into the curated zone in the data lake (a minimal sketch follows below).
Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into FHIR.
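
As an illustration of the raw-to-curated transformation work described above, here is a minimal PySpark sketch; the paths, table layout, and column names are hypothetical placeholders, not details from this posting:

```python
# Minimal raw-to-curated sketch in Azure Databricks.
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically on Databricks

# Read the raw Delta data landed from the source system.
raw = spark.read.format("delta").load("/mnt/datalake/raw/visits")

# Apply basic curation rules: de-duplicate, reject rows missing the key,
# and stamp an audit column.
curated = (
    raw.dropDuplicates(["visit_id"])
       .filter(F.col("visit_id").isNotNull())
       .withColumn("load_ts", F.current_timestamp())
)

# Write to the curated zone of the data lake.
curated.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/visits")
```
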
Data design
o Understand the requirements; recommend changes to models to support ETL design.
o Define primary keys, indexing strategies, and relationships that enhance data integrity and performance across layers.
o Define the initial schemas for each data layer.
o Assist with data modelling and updates of source-to-target mapping documentation.
o Document and implement schema validation rules to ensure incoming data conforms to expected formats and standards.
o Design data quality checks within the pipeline to catch inconsistencies, missing values, or errors early in the process (a minimal sketch follows this list).
o Proactively communicate with business and IT experts on any changes required to conceptual, logical, and physical models; communicate and review timelines, dependencies, and risks.
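
A minimal sketch of what the schema validation and early data-quality checks above might look like in PySpark; the expected schema, paths, and quarantine convention are assumptions for illustration only:

```python
# Hypothetical schema-validation and data-quality check for an incoming feed.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DateType

spark = SparkSession.builder.getOrCreate()

# The expected contract for the incoming data (assumed for illustration).
expected = StructType([
    StructField("visit_id", StringType(), nullable=False),
    StructField("patient_id", StringType(), nullable=False),
    StructField("visit_date", DateType(), nullable=True),
])

df = spark.read.format("delta").load("/mnt/datalake/raw/visits")

# Schema validation: fail fast if incoming columns drift from the contract.
missing = {f.name for f in expected.fields} - set(df.columns)
if missing:
    raise ValueError(f"Incoming feed is missing expected columns: {missing}")

# Data-quality check: route rows with a null business key to a quarantine
# table instead of letting them flow downstream.
bad = df.filter("visit_id IS NULL OR patient_id IS NULL")
good = df.filter("visit_id IS NOT NULL AND patient_id IS NOT NULL")
bad.write.format("delta").mode("append").save("/mnt/datalake/quarantine/visits")
```
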
Development of ETL strategy and solution for different sets of data modules
o Understand the tables and relationships in the data model.
o Create low-level design documents and test cases for ETL development.
o Implement error catching, logging, retry mechanisms, and handling of data anomalies (an illustrative pattern follows this list).
o Create the workflow and pipeline design.
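
For the error-catching, logging, and retry item above, one common pattern is a small wrapper around each pipeline step. This is an illustrative sketch, not the Ministry's actual framework; `load_visits` is a hypothetical step:

```python
# Illustrative retry-with-logging wrapper for an ETL step.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_with_retry(step, retries=3, backoff_seconds=30):
    """Run a pipeline step, retrying transient failures with linear backoff."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("Step failed (attempt %d/%d): %s", attempt, retries, exc)
            if attempt == retries:
                raise  # surface the error after the final attempt
            time.sleep(backoff_seconds * attempt)

# Usage (load_visits is a hypothetical step function):
# run_with_retry(lambda: load_visits())
```
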
Development and testing of data pipelines with Incremental and Full Load (an incremental-load sketch follows this list)
o Develop high-quality ETL mappings/scripts/notebooks.
o Develop and maintain the pipeline from the Oracle data source to Azure Delta Lakes and FHIR.
o Perform unit testing.
o Ensure performance monitoring and improvement.
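
One way an incremental load is commonly implemented on Delta Lake is a MERGE that applies change rows (for example, CDC records landed by GoldenGate) to the target table. The paths, join key, and operation-flag values below are assumptions for illustration:

```python
# Hedged sketch: applying CDC change rows to a Delta target with MERGE.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Change rows landed from the source; 'op' is an assumed I/U/D flag column.
changes = spark.read.format("delta").load("/mnt/datalake/raw/visits_cdc")
target = DeltaTable.forPath(spark, "/mnt/datalake/curated/visits")

(target.alias("t")
    .merge(changes.alias("s"), "t.visit_id = s.visit_id")
    .whenMatchedDelete(condition="s.op = 'D'")        # source deletes
    .whenMatchedUpdateAll(condition="s.op = 'U'")     # source updates
    .whenNotMatchedInsertAll(condition="s.op = 'I'")  # new rows
    .execute())
```
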
Performance review and data consistency checks
o Troubleshoot performance issues and ETL issues; log activity for each pipeline and transformation.
o Review and optimize overall ETL performance.
End-to-end integrated testing for Full Load and Incremental Load
Plan for Go Live and Production Deployment
o Create production deployment steps.
o Configure parameters and scripts for go-live; test and review the instructions.
o Create release documents and help build and deploy code across servers.
Go Live Support and Review after Go Live
o Review the existing ETL process and tools and provide recommendations on improving performance and reducing ETL timelines.
o Review infrastructure and remediate issues for overall process improvement.
Knowledge Transfer to Ministry staff and development of documentation on the work completed
o Document work and share the ETL end-to-end design, troubleshooting steps, configuration, and script reviews.
o Transfer documents and scripts, and review the documents with Ministry staff.

Skills
Experience and Skill Set Requirements

Must Have Skills
7 years using ETL tools such as Microsoft SSIS, stored procedures, and T-SQL
2 years with Delta Lake, Databricks, and Azure Databricks pipelines
o Strong knowledge of Delta Lake for data management and optimization.
o Familiarity with Databricks Workflows for scheduling and orchestrating tasks.
2 years with Python and PySpark
Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments
Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data
SQL Server and Oracle
Experience:
7 years of experience working with SQL Server T-SQL, Oracle PL/SQL development, or similar relational databases
2 years of experience working with Azure Data Factory, Databricks, and Python development
Experience building data ingestion and change data capture using Oracle GoldenGate
Experience in designing, developing, and implementing ETL pipelines using Databricks and related tools to ingest, transform, and store large-scale datasets
Experience leveraging Databricks Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data
Experience building databases and data warehouses and working with delta and full loads
Experience with data modeling and tools, e.g., SAP PowerDesigner, Visio, or similar
Experience working with SQL Server SSIS or other ETL tools; solid knowledge of and experience with SQL scripting
Experience developing in an Agile environment
Understanding of data warehouse architecture with a delta lake
Ability to analyze, design, develop, test, and document ETL pipelines from detailed and high-level specifications and assist in troubleshooting
Ability to utilize SQL to perform DDL tasks and complex queries (a brief sketch follows this list)
Good knowledge of database performance optimization techniques
Ability to assist in requirements analysis and subsequent development
Ability to conduct unit testing and assist in test preparations to ensure data integrity
Work closely with Designers, Business Analysts, and other Developers
Liaise with Project Managers, Quality Assurance Analysts, and Business Intelligence Consultants
Design and implement technical enhancements of the Data Warehouse as required
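
As a small illustration of the DDL-and-query skill above, run through Spark SQL so it stays in the same toolchain; the schema and table names are hypothetical:

```python
# Hypothetical DDL plus a windowed query via Spark SQL on Databricks.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# DDL: create a schema and a Delta table if they do not already exist.
spark.sql("CREATE SCHEMA IF NOT EXISTS curated")
spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.visits (
        visit_id   STRING,
        patient_id STRING,
        visit_date DATE
    ) USING DELTA
""")

# Complex-query example: latest visit per patient via a window function.
latest = spark.sql("""
    SELECT patient_id, visit_id, visit_date
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY patient_id
                                  ORDER BY visit_date DESC) AS rn
        FROM curated.visits
    )
    WHERE rn = 1
""")
latest.show()
```
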
Technical Skills (70 points)
Experience in developing and managing ETL pipelines, jobs, and workflows in Databricks
Deep understanding of Delta Lake for building data lakes and managing ACID transactions, schema evolution, and data versioning
Experience automating ETL pipelines using Delta Live Tables, including handling Change Data Capture (CDC) for incremental data loads
Proficient in structuring data pipelines with the Medallion Architecture to scale data pipelines and ensure data quality
Hands-on experience developing streaming tables in Databricks using Structured Streaming and readStream to handle real-time data
Expertise in integrating CDC tools like GoldenGate or Debezium for processing incremental updates and managing real-time data ingestion
Experience using Unity Catalog to manage data governance and access control and to ensure compliance
Skilled in managing clusters, jobs, autoscaling, monitoring, and performance optimization in Databricks environments
Knowledge of using Databricks Auto Loader for efficient batch and real-time data ingestion (a minimal ingestion sketch follows this list)
Experience with data governance best practices, including implementing security policies, access control, and auditing with Unity Catalog
Proficient in creating and managing Databricks Workflows to orchestrate job dependencies and schedule tasks
Strong knowledge of Python, PySpark, and SQL for data manipulation and transformation
Experience integrating Databricks with cloud storage solutions such as Azure Blob Storage, AWS S3, or Google Cloud Storage
Familiarity with external orchestration tools like Azure Data Factory
Implementing logical and physical data models
Knowledge of FHIR is an asset
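
For the Auto Loader and Structured Streaming items above, a minimal ingestion sketch might look like the following; the landing path, schema location, and checkpoint location are hypothetical:

```python
# Hedged sketch: incremental file ingestion with Databricks Auto Loader.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# readStream with the cloudFiles source is how Auto Loader is invoked.
stream = (spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/datalake/_schemas/events")
    .load("/mnt/datalake/landing/events"))

# Write to a Delta table; availableNow processes pending files, then stops.
(stream.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/datalake/_checkpoints/events")
    .trigger(availableNow=True)
    .start("/mnt/datalake/bronze/events"))
```
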

Design Documentation and Analysis Skills (20 points)
Demonstrated experience in creating design documentation such as:
o Schema definitions
o Error handling and logging
o ETL Process Documentation
o Job Scheduling and Dependency Management
o Data Quality and Validation Checks
o Performance Optimization and Scalability Plans
o Troubleshooting Guides
o Data Lineage
o Security and Access Control Policies applied within ETL
Experience in Fit-Gap analysis, system use case reviews, requirements reviews, and coding exercises and reviews
Participate in defect fixing, testing support, and development activities for ETL
Analyze and document solution complexity and interdependencies, including providing support for data validation
Strong analytical skills for troubleshooting, problem-solving, and ensuring data quality

Communication and Leadership Skills (10 points)
Ability to collaborate effectively with cross-functional teams and communicate complex technical concepts to non-technical stakeholders
Strong problem-solving skills and experience working in an Agile or Scrum environment
Ability to provide technical guidance and support to other team members on Databricks best practices
Must have previous work experience conducting Knowledge Transfer sessions, ensuring that resources receive the required knowledge to support the system
Must develop documentation and materials as part of a review and knowledge transfer to other team members

Employment Type

Full Time
