Job ID: 749745
Title: Databricks Architect
Client: State of North Carolina NCDOT
Location: Raleigh NC (Hybrid)
Databricks Certifications are a MUST
Job Description
**The candidate must come onsite on the first day to collect equipment.**
**All candidates must be local to the Triangle region of North Carolina, and the position may require up to 12 days per month in a Triangle-area office for meetings.**
The NCDIT-Transportation Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment. This position will be responsible for developing and designing the Databricks environment at NCDIT-T. This individual will work with internal staff to plan, design, and maintain the Databricks environment and recommend changes needed to accommodate growth as our business needs dictate. This individual will facilitate changes through DIT-T's change process and work very closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.
Responsibilities:
- Provide mentorship, guidance, overall knowledge sharing, and support to team members, promoting continuous learning and development.
- Oversee the design, implementation, and maintenance of Databricks clusters.
- Ensure the platform's scalability, performance, and security.
- Provide escalated support and troubleshooting to users.
- Oversee maintenance of role-based access to data and features in the Databricks platform using Unity Catalog.
- Review cluster health checks and best-practices implementation.
- Review and maintain documentation for users and administrators.
- Design and implement tailored data solutions to meet customer needs and use cases, spanning data ingestion from APIs, data pipelines, analytics, and beyond, within a dynamically evolving technical stack.
- Work on projects involving on-prem data ingestion into Azure using ADF.
- Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.
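For candidates unfamiliar with the term, the medallion architecture mentioned above layers data as bronze (raw), silver (cleaned and typed), and gold (business-level aggregates). The following is a minimal illustrative sketch in plain Python; an actual NCDIT-T pipeline would use PySpark and Delta Lake tables, and every field name here is hypothetical.

```python
# Hypothetical raw feed (bronze layer): records as ingested, errors and all.
RAW_FEED = [
    {"route": "I-40",  "count": "1200", "date": "2024-01-02"},
    {"route": "I-40",  "count": "800",  "date": "2024-01-02"},
    {"route": "US-70", "count": "bad",  "date": "2024-01-02"},  # malformed row
]

def to_silver(bronze):
    """Silver layer: cleaned, typed records; malformed rows are dropped."""
    silver = []
    for row in bronze:
        try:
            silver.append({"route": row["route"],
                           "count": int(row["count"]),
                           "date": row["date"]})
        except ValueError:
            continue  # a production job would quarantine these rows instead
    return silver

def to_gold(silver):
    """Gold layer: business-level aggregate (total count per route per day)."""
    gold = {}
    for row in silver:
        key = (row["route"], row["date"])
        gold[key] = gold.get(key, 0) + row["count"]
    return gold

gold = to_gold(to_silver(RAW_FEED))
```

The same clean-then-aggregate flow maps onto bronze/silver/gold Delta tables in Databricks, with each layer persisted rather than computed in memory.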
| Skill | Required / Desired | Required Experience (Years) | Candidate Experience | Last Used |
|-------|--------------------|-----------------------------|----------------------|-----------|
| Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog | Required | 5 | | |
| Strong understanding of relational and dimensional modeling | Required | 5 | | |
| Demonstrated proficiency in Python, SQL, and PySpark, with attention to performance, security, scalability, and robust data integrations | Required | 6 | | |
| Experience implementing serverless real-time/near-real-time architectures using a cloud tech stack (i.e., Azure, AWS, or GCP) and Spark technologies (Streaming & ML) | Required | 2 | | |
| Experience with Azure infrastructure configuration (networking), architecting and building large data ingestion pipelines, and conducting data migrations using ADF or similar technologies | Required | 4 | | |
| Experience working with SQL Server features such as SSIS and CDC | Required | 7 | | |
| Experience with Databricks platform security features, Unity Catalog, and data access control mechanisms | Required | 2 | | |
| Experience with Git code-versioning software | Required | 4 | | |
| Databricks Certifications | Desired | | | |
Keywords: SQL Analytics, Databricks, SQL Server, CDC, SSIS, ML, Azure, Python, SQL, Unity Catalog, Lakehouse architecture, cloud (Azure, AWS, GCP), Git, Azure Data Factory (ADF), Spark, pipelines, streaming, PySpark, Delta Lake, data