Data Engineer
Job Location: Alexander City - USA

Salary: Not Disclosed
Job Description

JOB TITLE: Sr. Data Engineer (Azure)

EXPECTED PAY: $50/hour W2, $60/hour C2C

CLIENT: Washington Water

LinkedIn matching to the resume is mandatory for submissions.

  • Need 3 references for every submission (Name, Title, Organization, Phone/Email)
  • Need the year of completion for diplomas and certifications

Location: 14501 Sweitzer Lane, Laurel, MD 20707 (LOCALS ONLY)

  • Hours: 8 a.m. to 5 p.m.
  • 3 days in office, 2 days remote

Duration: 2-year contract

Anticipated Start Date: November 15, 2024

Work Hours: 8 to 5, plus a one-hour lunch

Work Schedule: Hybrid

Estimated Number of Consultants: 1

Background check: Yes

Vaccination required: No

Interview Process/# of Rounds: 1

Scope Of Work:

  • Build and manage scalable data pipelines and solutions in Azure, utilizing Synapse Analytics and Microsoft Fabric, to support analytics needs.
  • The Senior Data Engineer, with expertise in Azure Synapse Analytics and Microsoft Fabric, will design, develop, and implement scalable data solutions to support analytics and reporting needs.
  • The role involves creating, optimizing, and managing data pipelines to efficiently move and transform data across the Azure ecosystem.
  • The candidate will be responsible for setting up and managing data lakes to store large volumes of structured and unstructured data, ensuring high availability and security. They will collaborate with cross-functional teams to gather data requirements and create efficient, scalable architectures.
  • Strong experience in ETL development, data modeling, and cloud technologies such as Azure Data Factory, Azure Data Lake, and Synapse Analytics is essential.
  • The candidate will also ensure data quality, security, and compliance with governance standards.
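The pipeline duties above follow the medallion (bronze/silver/gold) pattern named later in this posting. The sketch below illustrates that layering in plain Python with hypothetical record shapes and field names; real pipelines would run in Azure Data Factory or Synapse over a data lake, not in-process lists:

```python
# Illustrative medallion-style flow: raw landing -> validated -> aggregated.
# Record shapes ("meter_id", "reading", "ts") are assumptions for the sketch.

RAW_READINGS = [
    {"meter_id": "M1", "reading": "120.5", "ts": "2024-11-15T08:00:00"},
    {"meter_id": "M2", "reading": "bad",   "ts": "2024-11-15T08:00:00"},
    {"meter_id": "M1", "reading": "121.0", "ts": "2024-11-15T09:00:00"},
]

def bronze(records):
    """Land raw records unchanged (ingestion layer)."""
    return list(records)

def silver(records):
    """Type-cast and apply data-quality checks; drop rows that fail."""
    clean = []
    for r in records:
        try:
            clean.append({**r, "reading": float(r["reading"])})
        except ValueError:
            continue  # a real pipeline would quarantine bad rows instead
    return clean

def gold(records):
    """Aggregate into an analytics-ready shape: average reading per meter."""
    totals = {}
    for r in records:
        s, n = totals.get(r["meter_id"], (0.0, 0))
        totals[r["meter_id"]] = (s + r["reading"], n + 1)
    return {m: s / n for m, (s, n) in totals.items()}

summary = gold(silver(bronze(RAW_READINGS)))
```

Each layer only ever reads from the one before it, which is the property that makes medallion pipelines easy to monitor and re-run stage by stage.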

  1. Create and optimize complex data pipelines using Azure Data Factory, Synapse Analytics, and other Azure tools to extract, transform, and load data efficiently.
  2. Implement and maintain Azure Data Lake solutions to store large volumes of structured and unstructured data, ensuring scalability, performance, and security.
  3. Integrate data from various sources, including relational databases, NoSQL databases, APIs, and flat files, into the Azure environment for analysis and reporting.
  4. Design and develop robust data architectures, optimizing for performance and scalability in Azure Synapse Analytics and Azure Data Lake environments.
  5. Develop efficient ETL/ELT processes using Azure Data Factory or other Azure tools to ensure timely and accurate data loading and transformation.
  6. Ensure data pipelines run smoothly by monitoring, troubleshooting, and resolving issues to minimize downtime and data inconsistencies.
  7. Continuously optimize data pipelines and query performance, especially within Azure Synapse, to handle large data sets and complex transformations efficiently.
  8. Work closely with data scientists, analysts, and business teams to understand data requirements and deliver scalable data solutions that support analytics needs.
  9. Implement and enforce security best practices, ensuring data lakes, pipelines, and analytics solutions comply with Azure security standards and data governance policies.
  10. Design and implement logical and physical data models that support high-performance querying and reporting within Azure Synapse.
  11. Implement data quality checks, data validation processes, and error handling within data pipelines to ensure accuracy and consistency of data.
  12. Ensure adherence to data governance frameworks, managing data lineage and metadata and ensuring compliance with organizational and regulatory requirements.
  13. Implement data partitioning and indexing strategies to improve query performance within data lakes and Synapse.
  14. Automate data ingestion, transformation, and processing tasks to ensure efficient and scalable data workflows within the Azure environment.
  15. Create and maintain detailed documentation for data architectures, pipelines, processes, and data models, ensuring transparency and ease of maintenance.
  16. Provide technical guidance and mentorship to junior data engineers, sharing best practices and ensuring adherence to high-quality engineering standards.
  17. Monitor resource utilization in Azure environments, planning for future data growth and ensuring efficient use of cloud resources.
  18. Strong knowledge of Medallion architecture.
  19. Experience in setting up Parquet and Delta file structures.
  20. Experience working with unstructured data sources.
  21. Strong knowledge of consuming and exposing data in various formats, such as XML and JSON.
  22. Continuously stay informed on the latest features and best practices in Azure Synapse Analytics, Microsoft Fabric, and the Azure ecosystem, implementing improvements as needed.
  23. Strong knowledge of Python for creating and scheduling data pipelines.
  24. Implement real-time data ingestion and processing pipelines using technologies such as Azure Stream Analytics and Event Hubs.
  25. Design and implement a data mesh architecture to support decentralized data ownership and self-service data infrastructure, ensuring scalable and flexible data management across the organization.
  26. Architect and manage multi-cloud data solutions, integrating data across different cloud platforms (e.g., AWS, OCI) with Azure Synapse for a unified data and analytics ecosystem.
  27. Design and manage hybrid data architectures that integrate on-premises data centers with Azure cloud environments, ensuring seamless data movement and synchronization between cloud and on-prem systems.
  28. Utilize advanced data cataloging tools such as Azure Purview to create an enterprise-wide data catalog, enabling efficient data discovery and usage across various teams.
  29. Create and automate end-to-end machine learning pipelines that integrate data ingestion, feature engineering, model training, and deployment using Azure ML, Python (scikit-learn, TensorFlow, PyTorch), and Azure Synapse Analytics.
  30. Utilize Python-based data augmentation techniques or synthetic data generation (e.g., GANs or SMOTE) to enrich datasets for machine learning training, especially where data is limited or imbalanced.
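On the SMOTE technique named in item 30: its core idea is to synthesize minority-class samples by interpolating between a real sample and one of its nearest neighbours. A minimal pure-Python sketch of that idea, with made-up 2-D points (illustrative only; production work would use a library such as imbalanced-learn):

```python
import random

def smote_like(minority, n_new, k=2, seed=42):
    """Generate synthetic points by interpolating between a minority
    sample and one of its k nearest neighbours (SMOTE's core idea)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # k nearest neighbours by squared Euclidean distance, excluding base
        neighbours = sorted(
            (p for p in minority if p is not base),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(base, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random point on the segment base -> neighbour
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(base, nb)))
    return synthetic

# Hypothetical, imbalanced minority class with three 2-D samples
minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]
new_points = smote_like(minority, n_new=4)
```

Because each synthetic point lies on a segment between two real samples, it stays inside the region the minority class already occupies, which is why this enriches training data without inventing outliers.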

Preferred Experience/Qualifications/Knowledge/Skills

  1. Education: Bachelor's Degree in Information Systems, Computer Science, or a related scientific or technical field, and a minimum of five (5) years of relevant experience.
  2. General Experience:
    • Work Experience: 5 years of experience designing and implementing data solutions and creating data pipelines for enterprise-level applications.
    • Industry Knowledge: Experience in the water and wastewater industry and an understanding of Oracle utility applications are preferred.
    • Project Experience: Demonstrated experience working on large-scale data projects in diverse team environments, with a focus on analytics, business intelligence, and enterprise systems.
  3. Specialized Experience:
    • Data Modeling: Extensive experience with data modeling and database design.
    • Enterprise Analytics: Proven expertise in implementing enterprise-wide analytics and business intelligence solutions, including data integration from multiple systems into a single data repository.
    • Database & Data Structures: Deep understanding of database design principles, SQL, PL/SQL, and Oracle database management systems, including performance optimization and troubleshooting.
    • Data Governance & Quality: Familiarity with data governance frameworks, ensuring data integrity, quality, and security within an enterprise context.
    • Data Lakes: Strong experience in creating data lakes and data warehouses.
    • Python: Strong knowledge of writing Python code to create and manage data pipelines.
    • Communication & Collaboration: Excellent verbal and written communication skills, with the ability to work closely with stakeholders to translate business needs into technical solutions.
    • Problem-Solving: Strong analytical skills and problem-solving abilities, especially when working with large, complex datasets.
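The data-modeling and SQL skills listed above can be illustrated with a toy star schema: one fact table keyed to a dimension table, with an index supporting date-range queries. The sketch uses Python's built-in sqlite3 purely for portability; table and column names are hypothetical, and the client environment would be Oracle or Synapse:

```python
import sqlite3

# Star-schema sketch: fact_reading references dim_meter by surrogate key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_meter (
        meter_key INTEGER PRIMARY KEY,
        meter_id  TEXT NOT NULL,
        zone      TEXT NOT NULL
    );
    CREATE TABLE fact_reading (
        meter_key INTEGER REFERENCES dim_meter(meter_key),
        reading   REAL NOT NULL,
        read_date TEXT NOT NULL
    );
    CREATE INDEX ix_fact_date ON fact_reading(read_date);
""")
conn.executemany("INSERT INTO dim_meter VALUES (?, ?, ?)",
                 [(1, "M1", "North"), (2, "M2", "South")])
conn.executemany("INSERT INTO fact_reading VALUES (?, ?, ?)",
                 [(1, 120.5, "2024-11-15"), (2, 98.2, "2024-11-15")])

# Typical reporting query: aggregate facts by a dimension attribute.
rows = conn.execute("""
    SELECT d.zone, SUM(f.reading)
    FROM fact_reading f JOIN dim_meter d USING (meter_key)
    GROUP BY d.zone ORDER BY d.zone
""").fetchall()
```

Separating descriptive attributes (zone) from measures (reading) is what lets the fact table stay narrow and partition/index well as volume grows.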

Employment Type

Full Time
