Job: Databricks Administrator/Architect
Location: Raleigh, NC
- ONLY SUBMIT CANDIDATES CURRENTLY LIVING IN THE RALEIGH/DURHAM/CHAPEL HILL, NC AREA.
- The candidate must come onsite on the first day to collect equipment.
- All candidates must be local to the Triangle region of North Carolina; the position may require up to 12 days per month in a Triangle-area office for meetings.
The North Carolina Department of Transportation Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment. This position will be responsible for designing and developing the Databricks environment at NCDITT. This individual will work with internal staff to plan, design, and maintain the Databricks environment and recommend the changes needed to accommodate growth as business needs dictate. This individual will facilitate changes through NCDITT's change process and work closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.
Responsibilities:
- Provide mentorship, guidance, overall knowledge sharing, and support to team members, promoting continuous learning and development.
- Oversee the design, implementation, and maintenance of Databricks clusters.
- Ensure the platform's scalability, performance, and security.
- Provide escalated support and troubleshooting to users.
- Oversee maintenance of role-based access to data and features in the Databricks platform using Unity Catalog.
- Review cluster health checks and best-practices implementation.
- Review and maintain documentation for users and administrators.
- Design and implement tailored data solutions to meet customer needs and use cases, spanning ingestion of data from APIs, building data pipelines, analytics, and beyond, within a dynamically evolving technical stack.
- Work on projects involving on-premises data ingestion into Azure using ADF.
- Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.
SKILL MATRIX:
- Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog (Required: 5 years)
- Strong understanding of relational and dimensional modeling (Required: 5 years)
- Demonstrated proficiency in coding skills (Python, SQL, and PySpark) to efficiently prioritize performance, security, scalability, and robust data integrations (Required: 6 years)
- Experience implementing serverless, real-time/near-real-time architectures using a cloud tech stack (e.g., Azure, AWS, or GCP) and Spark technologies (Streaming and ML) (Required: 2 years)
- Experience with Azure infrastructure configuration (networking), architecting and building large data ingestion pipelines, and conducting data migrations using ADF or similar technologies (Required: 4 years)
- Experience working with SQL Server features such as SSIS and CDC (Required: 7 years)
- Experience with Databricks platform security features, Unity Catalog, and data access control mechanisms (Required: 2 years)
- Experience with Git code versioning software (Required: 4 years)
- Databricks Certifications