Job Summary:
We are looking for a skilled GCP Data Engineer with 6-8 years of experience in building, optimizing, and managing data pipelines and architectures on Google Cloud Platform (GCP).
Employment Type: Contractual
Experience: 6-10 Years
Key Skills: Kafka, Apache Spark (SQL, Scala, Java), Python, Hadoop Platform, Hive, Airflow, Cloud Composer, PySpark
Key Responsibilities:
- Design and develop data ingestion frameworks, real-time processing solutions, and data transformation frameworks leveraging open-source tools and data processing frameworks.
- Operationalize open-source data analytics tools for enterprise use.
- Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
- Understand and follow the company's development lifecycle to develop, deploy, and deliver solutions.
- Perform root-cause analysis on data-related issues.
Preferred Qualifications:
- GCP certifications (e.g., Professional Data Engineer).
- Experience with data governance and DevOps practices.