Minimum of 4 years of experience in cloud data engineering, the data landscape, and software engineering.
Previous experience in architecting and building large-scale data platforms in a greenfield environment.
Expertise in building end-to-end data platforms and data services in GCP.
Proficient in using tools and technologies such as BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, and Pub/Sub.
Experience with microservices architectures, including Kubernetes, Docker, and Cloud Run.
Experience in building semantic layers.
Skilled in architecting, designing, and developing both batch and real-time streaming infrastructure and workloads.
Solid experience in architecting and implementing metadata management, including data catalogs, data lineage, data quality, and data observability for big data workflows.
Hands-on experience with the GCP ecosystem and data lakehouse architectures.
Strong understanding of data modeling, data architecture, and data governance principles.
Extensive experience with DataOps principles and test automation.
Extensive experience with observability tooling such as Grafana and Datadog.