Position: Data Warehouse Database Administrator
Client: State of Utah
Department: UDC (Utah Department of Corrections)
Location: 8523 S. Redwood Rd., West Jordan, Utah 84088
Position ID: 133935
Projected Duration: 1 year from projected start date
Tentative interview dates: Tuesday, September 3rd, and Wednesday, September 4th
Remote or onsite: 60% onsite, 40% remote (about 3 days onsite and 2 days remote per week)
Job Description:
Scoring:
Technical Expertise: 40%
Past Experience: 40%
Cost: 20%
Preferred/required skills: We are seeking a skilled professional with proven experience working with databases in GCP.
Data Warehouse Database Administrator
Are you a Database Administrator (DBA) looking for a great career doing important work that makes an impact? The State of Utah Department of Government Operations, Division of Technology Services (DTS), is looking for an experienced DBA to support the Department of Corrections.
DTS is looking for an experienced Database Administrator (DBA) to work on SQL databases hosted on Windows and Linux servers and on Google Cloud Platform (GCP), with a strong focus on BigQuery and GCP services. The ideal candidate will be responsible for establishing, managing, and optimizing our data warehouse infrastructure to support business intelligence and analytics initiatives. Your role will be critical in ensuring the reliability, scalability, and performance of our data systems. You will collaborate closely with data analysts, engineers, and business stakeholders to deliver actionable insights and support data-driven decision-making.
The chosen candidate will also provide database maintenance support and GCP administration in support of our data warehouse and analytics needs. This role is crucial for ensuring that the State of Utah remains competitive and capable of meeting the demands of our customers with efficient, reliable, and scalable database solutions.
Primary Duties & Responsibilities
Design, create, implement, and manage BigQuery data warehouses, including table structures, stored procedures, and Linux shell scripts, and optimize query performance (a brief illustrative sketch follows this list).
Develop and maintain data pipelines to ingest, transform, and load data from various sources, such as on-premises Informix databases and, eventually, PostgreSQL on AWS, into BigQuery in GCP.
Monitor and tune data warehouse performance, including query optimization and debugging, indexing, and caching strategies.
Implement data security best practices, manage access controls, and ensure compliance with relevant regulations and standards.
Set up monitoring and alerting for data warehouse systems, troubleshoot issues, and ensure high availability and reliability.
Develop and enforce data governance policies, data quality standards, and best practices.
Work closely with data analysts, engineers, and business stakeholders to understand data requirements and deliver solutions that meet their needs.
Document data warehouse architecture, processes, and best practices. Provide training and support to team members and end users.
Accommodate work that needs to be performed both during and after business hours.
Create, manage, and maintain secure database access and user roles.
Other tasks as assigned.
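To make the BigQuery duties above concrete, here is a minimal sketch of the kind of DDL, incremental-load, and access-control statements involved. All project, dataset, table, and group names (dw_project, analytics, staging, case_events, data-analysts@example.gov) are hypothetical placeholders, not actual UDC systems.

    -- Minimal sketch (BigQuery GoogleSQL); all names are hypothetical placeholders.
    CREATE TABLE IF NOT EXISTS `dw_project.analytics.case_events` (
      event_id STRING NOT NULL,
      case_id  STRING,
      event_ts TIMESTAMP,
      payload  JSON
    )
    PARTITION BY DATE(event_ts)  -- limits scans to the dates a query touches
    CLUSTER BY case_id;          -- co-locates rows that are commonly filtered together

    -- Incremental upsert from a staging table populated by the ingest pipeline.
    MERGE `dw_project.analytics.case_events` AS t
    USING `dw_project.staging.case_events` AS s
    ON t.event_id = s.event_id
    WHEN MATCHED THEN
      UPDATE SET case_id = s.case_id, event_ts = s.event_ts, payload = s.payload
    WHEN NOT MATCHED THEN
      INSERT (event_id, case_id, event_ts, payload)
      VALUES (s.event_id, s.case_id, s.event_ts, s.payload);

    -- Dataset-level read access for a (hypothetical) analyst group.
    GRANT `roles/bigquery.dataViewer`
    ON SCHEMA `dw_project.analytics`
    TO 'group:data-analysts@example.gov';

Partitioning plus clustering is the usual first design choice here, since it keeps both query cost and latency down as the warehouse grows.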
Typical Qualifications
Proven expertise with BigQuery on Google Cloud Platform (GCP), including hands-on experience designing and optimizing BigQuery datasets, along with some experience working with Informix databases.
Extensive knowledge of database design and documentation.
Knowledge of the development and management of computer system databases and data warehouses.
Proficiency in SQL for data querying, analysis, and performance tuning (see the short example after this list).
Experience with data integration tools (e.g., Dataflow, Apache Beam, Airflow) and ETL processes.
Understanding of data security practices, access controls, and compliance requirements (e.g., GDPR, HIPAA).
Experience with other data warehouse technologies (e.g., Redshift, Snowflake) and cloud platforms (e.g., AWS, GCP).
Knowledge of data modeling, data warehousing concepts, and best practices.
Familiarity with data visualization tools (e.g., Looker, Power BI).
Strong analytical, research, and organizational skills with impeccable attention to detail.
Understanding of and experience with cloud services such as GCP and AWS.
System administration skills are desired.
Knowledge of CI/CD Pipelines.
Good understanding of GitHub for database version control.
Ability to test and troubleshoot using appropriate methodologies and techniques.
Excellent written/verbal communication, interpersonal, and organizational skills required.
Understand the principles, theories, and practices of computer science.
Deal with people in a manner that shows sensitivity, tact, and professionalism.
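As a hedged illustration of the SQL performance tuning called for above, the query below reads only the columns it needs and only one daily partition of the hypothetical case_events table sketched earlier; reducing bytes scanned is the primary tuning lever in BigQuery.

    -- Prunes to a single daily partition; a SELECT * with no event_ts
    -- filter would scan every partition and every column.
    SELECT case_id, COUNT(*) AS events
    FROM `dw_project.analytics.case_events`
    WHERE DATE(event_ts) = DATE '2024-01-15'
    GROUP BY case_id
    ORDER BY events DESC
    LIMIT 100;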
In the event of a tie in the total score between candidates in the top/winning positions, the technical score will be calculated (sans the cost factor), and the highest technical scorer among the tied candidates will be awarded the position. Should this method fail to yield an ultimate winner, the candidates will be asked to participate in a best-and-final-offer process.
The amount listed on this scope of work is the current project budget. Additional funding can be added by the requesting agency by amendment to the scope of work. Contractors must not allow work to continue that will cost more than the allowed budget without an amendment first being approved.
The position/project duration listed on the scope of work may be extended by the requesting agency by amendment to the scope of work for a time period up to a total of 5 years. Contractors must not allow any work to be done after the end date without an amendment first being approved.
PLEASE NOTE AND PROVIDE TO YOUR CONTRACTORS: All consultants hired will be required to follow the DTS Drug-Free Workplace Policy and all Utah drug laws.
Keywords: GitHub database version control, CI/CD pipelines, data security, GCP, data quality, troubleshooting, performance tuning, computer science, SQL querying, ETL processes, data pipelines, data integration, BigQuery, data warehousing, SQL, communication skills, data governance, Linux, cloud platforms, system administration, data visualization, database administration, database management, data modeling