PRINCIPAL DATA ENGINEER
Experience: 15-20 years
Required Skills: Cloud Data Engineering, GCP, BigQuery, Cloud Run, Dataform, SQL, Python, Airflow, Pub/Sub, Kubernetes, Docker, DataOps, Grafana, Datadog.
Role Overview
We are seeking a dynamic and highly skilled Principal Data Engineer with extensive experience
building enterprise-scale data platforms and leading these foundational efforts. This role demands
someone who not only possesses a profound understanding of the data engineering landscape but
is also at the top of their game. The ideal candidate will contribute significantly to platform
development with a diverse skill set while remaining very hands-on with coding and actively shaping
the future of our data ecosystem.
Responsibilities:
As a principal engineer, you will be responsible for the ideation, architecture, design, and
development of a new enterprise data platform. You will collaborate with cloud and
security architects to ensure seamless alignment with our overarching technology
strategy.
Architect and design core components with a microservices architecture, abstracting
platform and infrastructure intricacies.
Create and maintain essential data platform SDKs and libraries, adhering to industry best
practices.
Design and develop connector frameworks and modern connectors to source data from
disparate systems, both on-prem and in the cloud.
Design and optimize data storage, processing, and querying performance for large-scale
datasets using industry best practices while keeping costs in check.
Architect and design the best security patterns and practices.
Design and develop data quality frameworks and processes to ensure the accuracy and
reliability of data.
Collaborate with data scientists, analysts, and cross-functional teams to design data
models, database schemas, and data storage solutions.
Design and develop advanced analytics and machine learning capabilities on the data
platform.
Design and develop observability and data governance frameworks and practices.
Stay up to date with the latest data engineering trends, technologies, and best practices.
Drive the deployment and release cycles, ensuring a robust and scalable platform.
Requirements:
15+ years of proven experience in modern cloud data engineering, broad exposure to the
wider data landscape, and solid software engineering experience.
Prior experience architecting and building successful enterprise-scale data platforms in
a greenfield environment is a must.
Proficiency in building end-to-end data platforms and data services in GCP is a must.
Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform,
Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.
Experience with microservices architectures, Kubernetes, Docker, and Cloud Run.
Experience building semantic layers.
Architecture, design, and development experience with batch and real-time streaming
infrastructure and workloads.
Solid experience architecting and implementing metadata management, including
data catalogues, data lineage, data quality, and data observability for big data workflows.
Hands-on experience with the GCP ecosystem and data lakehouse architectures.
Strong understanding of data modeling, data architecture, and data governance
principles.
Extensive experience with DataOps principles and test automation.
Extensive experience with observability tooling: Grafana, Datadog.
Nice to have:
Experience with Data Mesh architecture.
Experience building Semantic layers for data platforms.
Experience building scalable IoT architectures.
Full Time