Job ID: 750974
Job Title: Big Data Engineer
Location: Remote (Bismarck, ND)
Duration: 7 Months
Client: State of ND (NDUS)
Job Description
Develops, engineers, maintains, runs, tests, evaluates, and implements big data infrastructure projects, tools, and solutions, working with the latest database technologies to get results from vast amounts of data quickly.
IMPORTANT NOTES:
- This position will be fully remote. Candidates must live and work within the continental US and will be expected to work the client's standard Central Time Zone business hours.
- This position is expected to work 35 hours per week.
- The rate on the Details tab is the CLIENT's bill rate, not the Vendor rate you are able to bill for the position. The maximum VENDOR RATE you can bill for this position is specified within the Questions section of the requisition.
- Please read fully and respond appropriately to each item in the Questions section.
The Big Data Engineer is a vital member of a collaborative team responsible for designing, engineering, maintaining, testing, evaluating, and implementing big data infrastructure, tools, projects, and solutions for the North Dakota University System (NDUS). This role involves working closely with the team to leverage cutting-edge database technologies for the swift retrieval of results from vast datasets. The engineer will select and integrate big data frameworks and tools to meet specific needs and manage the entire lifecycle of large datasets to extract valuable insights.
Key Responsibilities:
- Design and implement scalable big data solutions tailored to NDUS's needs.
- Maintain and enhance existing big data infrastructures to meet NDUS's unique requirements.
- Test and evaluate new big data technologies and frameworks for compatibility with NDUS systems and goals.
- Collect, store, process, manage, analyze, and visualize large datasets to derive actionable insights.
- Collaborate with team members to integrate big data solutions with existing NDUS systems.
- Ensure data integrity and security across all platforms used within NDUS.
- Develop and optimize data pipelines for ETL/ELT processes specific to NDUS's data needs.
- Document technical solutions and maintain comprehensive records in line with NDUS standards and protocols.
- Stay updated with the latest trends and advancements in big data technology relevant to NDUS's strategic initiatives.
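To make the ETL/ELT responsibility above concrete, here is a minimal, illustrative sketch of an extract-transform-load pipeline. It uses only the Python standard library (csv and an in-memory sqlite3 database); all names and data are hypothetical and do not reflect actual NDUS systems, which would typically use tools such as Data Factory or Databricks instead.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract: enrollment counts per campus (illustrative data only).
RAW_CSV = """campus,term,enrollment
Bismarck,Fall2024,1200
Fargo,Fall2024,3400
Bismarck,Spring2025,1150
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize campus names and cast enrollment to int."""
    return [
        (r["campus"].strip().title(), r["term"], int(r["enrollment"]))
        for r in rows
    ]

def load(records, conn):
    """Load: write transformed records into a SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS enrollment "
        "(campus TEXT, term TEXT, headcount INTEGER)"
    )
    conn.executemany("INSERT INTO enrollment VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(headcount) FROM enrollment").fetchone()[0]
print(total)  # 5750
```

The same extract/transform/load separation scales up directly: in a production pipeline each stage would be a Data Factory activity or a Databricks notebook task rather than a local function.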
Required Qualifications:
- Thorough understanding of cloud computing technologies, including IaaS, PaaS, and SaaS implementations.
- Skilled in exploratory data analysis (EDA) to support ETL/ELT processes.
- Proficiency with Microsoft cloud products including Azure and Fabric.
- Experience with tools such as Data Factory and Databricks.
- Ability to script in multiple languages with a strong emphasis on Python and SQL.
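The EDA qualification above refers to profiling data before designing transforms. A minimal sketch of that kind of profiling, using only the Python standard library and hypothetical data:

```python
import statistics

# Hypothetical column of nightly job runtimes in minutes (illustrative data only).
runtimes = [12.0, 11.5, 13.2, 12.8, 55.0, 12.1, 11.9]

# Quick summary statistics: the kind of profile that informs ETL/ELT design
# decisions such as type casts, partitioning, and timeout thresholds.
profile = {
    "count": len(runtimes),
    "mean": round(statistics.mean(runtimes), 2),
    "median": statistics.median(runtimes),
    "stdev": round(statistics.stdev(runtimes), 2),
}

# Flag values more than two standard deviations from the mean as outliers.
mean, stdev = statistics.mean(runtimes), statistics.stdev(runtimes)
outliers = [x for x in runtimes if abs(x - mean) > 2 * stdev]

print(profile)
print(outliers)  # [55.0]
```

In practice this exploration would run at scale in a Databricks notebook with Python and SQL, but the workflow is the same: summarize, spot anomalies, then encode the findings as validation rules in the pipeline.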
Preferred Qualifications:
- Experience with data visualization tools.
- Proficiency with Excel and Power BI.
- Familiarity with Delta Lake.
- Knowledge of Lakehouse Medallion Architecture.
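For candidates unfamiliar with the Medallion Architecture mentioned above: data moves through bronze (raw), silver (cleaned), and gold (aggregated) layers. A minimal sketch of the pattern using plain Python structures; in practice each layer would be a Delta Lake table, and all names and data here are hypothetical.

```python
# Bronze: raw records landed as-is, including duplicates and bad rows.
bronze = [
    {"student_id": "001", "credits": "15"},
    {"student_id": "001", "credits": "15"},   # duplicate
    {"student_id": "002", "credits": "12"},
    {"student_id": "003", "credits": ""},     # bad row: missing credits
]

# Silver: deduplicated and cleaned, with types enforced.
seen = set()
silver = []
for row in bronze:
    if row["credits"] and row["student_id"] not in seen:
        seen.add(row["student_id"])
        silver.append(
            {"student_id": row["student_id"], "credits": int(row["credits"])}
        )

# Gold: aggregated, analysis-ready metrics for reporting (e.g. Power BI).
gold = {
    "students": len(silver),
    "total_credits": sum(r["credits"] for r in silver),
}
print(gold)  # {'students': 2, 'total_credits': 27}
```

The value of the layering is auditability: the bronze layer preserves the raw input, so silver and gold can always be rebuilt when cleaning rules change.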
| Skill | Required / Desired | Experience |
| --- | --- | --- |
| IaaS, PaaS, or SaaS data or AI implementations within Microsoft Azure | Required | 3 Years |
| Exploratory Data Analysis (EDA): proficiency in EDA techniques to support ETL/ELT processes | Required | 3 Years |
| Implementation of development and production workflows in Azure Data Factory and Databricks | Required | 2 Years |
| Strong scripting skills in Python and SQL | Required | 3 Years |
| Expert proficiency in data engineering within Microsoft Fabric | Required | 1 Year |
| Expert proficiency with Excel and Power BI | Highly desired | 2 Years |
| Expert proficiency with the Delta Lake format and protocol | Highly desired | 1 Year |
| Expert understanding of the Data Lakehouse Medallion Architecture | Highly desired | 1 Year |