Location: New York, NY (hybrid, 3 days/week onsite)
Job type: 6-month contract
Hourly rate: $70/hr (candidates requiring more will be considered, but preference will be given to those accepting $70)
- We are seeking a highly skilled, mid-to-senior-level Data Engineer with previous hands-on experience in the financial or banking industry, capable of hitting the ground running and quickly building scalable data solutions.
- The ideal candidate will have a strong background in Azure Cloud and will work within a cross-functional Agile team of 15 people.
- This role requires someone who can collaborate closely with stakeholders to understand business requirements, particularly for capturing revenue data for brokers.
- The engineer will also work with compliance and security teams to ensure all systems meet regulatory standards, and will be responsible for building a flexible data capture system while maintaining BAU (Business as Usual) functionality.
- This is a dynamic, fast-paced environment where the ability to deliver high-quality solutions quickly is critical.
Requirements
YOU MUST BE AUTHORIZED TO WORK IN THE U.S. WITHOUT SPONSORSHIP NOW AND IN THE FUTURE.
- Bachelor's degree in computer science, information systems, or a related field
- 5+ years of related experience
- Strong proficiency in the Python programming language and experience with Extract, Transform, Load (ETL) processes, including the ability to design and develop efficient ETL workflows to extract data from various sources, transform it per business requirements, and load it into target systems (see the illustrative sketch after this list).
- Hands-on experience with the Microsoft Azure cloud platform, including familiarity with Azure services and tools and a good understanding of Azure architecture and best practices.
- Practical experience designing and implementing data pipelines using Azure Data Factory, performing data transformations and analytics using Databricks, and managing data storage and virtual machines in Azure (see the Databricks-style sketch below).
- Strong understanding of data governance principles, data quality management, and data controls, including the ability to implement data governance frameworks, establish data quality standards, and ensure compliance with data regulations and policies (e.g., GDPR, CCPA).
- Experience configuring monitoring and alerting mechanisms to ensure timely identification and resolution of issues in batch job execution (see the data quality and alerting sketch below).
- Experience architecting scalable and flexible data pipelines with a focus on performance optimization.
- Skills in implementing data quality checks and monitoring to maintain data integrity throughout the pipeline.
- Previous experience working in the banking or financial services industry, with an understanding of its specific data handling and compliance challenges.
- Ability to work effectively in a cross-functional Agile team, adapting to fast-paced project requirements.
- Strong communication and collaboration skills to work with stakeholders and teams, translating business requirements into technical solutions.
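The following is a rough, illustrative Python sketch of the kind of ETL workflow described in the requirements above; the file name, column names, connection string, and target table are hypothetical placeholders rather than anything specified in the posting.

```python
# Minimal illustrative ETL sketch (hypothetical file, columns, and target table).
import pandas as pd
from sqlalchemy import create_engine

def extract(csv_path: str) -> pd.DataFrame:
    # Extract: read raw broker revenue records from a CSV export (hypothetical source).
    return pd.read_csv(csv_path, parse_dates=["trade_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop incomplete rows and aggregate revenue per broker per day.
    df = df.dropna(subset=["broker_id", "revenue"])
    return df.groupby(["broker_id", "trade_date"], as_index=False)["revenue"].sum()

def load(df: pd.DataFrame, conn_str: str) -> None:
    # Load: append the curated result to a target table (hypothetical connection string).
    engine = create_engine(conn_str)
    df.to_sql("broker_revenue_daily", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("broker_revenue.csv")), "sqlite:///warehouse.db")
```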
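Below is a minimal, illustrative sketch of a Databricks-style transformation step that might follow an Azure Data Factory copy activity. It assumes an active SparkSession (as provided in a Databricks notebook); the ADLS path, table, and column names are assumptions for illustration only.

```python
# Illustrative PySpark sketch of a Databricks-style transformation step
# (paths, table names, and columns are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically in a Databricks notebook

# Read raw records landed by a hypothetical Azure Data Factory copy activity.
raw = spark.read.parquet("abfss://landing@account.dfs.core.windows.net/broker_revenue/")

# Aggregate revenue per broker per day and write a curated Delta table.
curated = (
    raw.filter(F.col("revenue").isNotNull())
       .groupBy("broker_id", F.to_date("trade_date").alias("trade_date"))
       .agg(F.sum("revenue").alias("daily_revenue"))
)
curated.write.format("delta").mode("overwrite").saveAsTable("curated.broker_revenue_daily")
```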
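And below is a minimal, illustrative sketch of the kind of batch data quality checks and alerting hooks mentioned above; the specific rules, column names, and alert destination are hypothetical, and a real deployment might route failures to Azure Monitor or a team webhook instead of a log.

```python
# Illustrative sketch of batch data quality checks with a simple alert hook
# (rules, columns, and the alert mechanism are hypothetical).
import logging
import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq_checks")

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    # Collect human-readable failures instead of stopping at the first problem.
    failures = []
    if df.empty:
        failures.append("batch is empty")
    if df["broker_id"].isna().any():
        failures.append("null broker_id values present")
    if (df["revenue"] < 0).any():
        failures.append("negative revenue values present")
    return failures

def alert(failures: list[str]) -> None:
    # Placeholder alert: log the failures; in practice this might post to a monitoring endpoint.
    for failure in failures:
        log.error("Data quality check failed: %s", failure)

if __name__ == "__main__":
    batch = pd.DataFrame({"broker_id": ["B1", None], "revenue": [100.0, -5.0]})
    problems = run_quality_checks(batch)
    if problems:
        alert(problems)
```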