- Role: GCP Data Engineer
- Location: Chennai, Hyderabad, and Bengaluru
- Work Mode: 5 days a week, 100% work from office (WFO)
- Experience: 5 to 10 years
- Notice Period: Immediate to currently serving, with last working day no later than Feb 20
Only interested and relevant candidates should share their CVs by email or WhatsApp.
1. Responsibilities for Teradata
- Data Migration and Integration: Use tools like Dataflow, BigQuery Data Transfer Service, or custom pipelines for migrating data. Design and implement hybrid architectures integrating Teradata with GCP services (e.g., BigQuery, Cloud Storage, Pub/Sub); a minimal migration sketch follows this list.
- Use Teradata for high-performance data warehousing and analytics.
- Work on data migration or hybrid cloud setups where part of the data is processed in Teradata and part in GCP.
- Optimize queries and tune performance on Teradata clusters.
- Ensure data processing efficiency within GCP, including resource optimization and cost management.
- Work with cloud-native tools like Dataflow to ingest and export data between Teradata and GCP.
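As a rough illustration of the migration pattern above, here is a minimal sketch that copies one table from Teradata into BigQuery, assuming the `teradatasql` driver and the `google-cloud-bigquery` client. The host, credentials, and table names are placeholders, and a production migration would more likely run through Dataflow or the BigQuery Data Transfer Service.

```python
# Minimal sketch, assuming the `teradatasql` and `google-cloud-bigquery`
# packages; every host, credential, and table name here is a placeholder.
import teradatasql
from google.cloud import bigquery


def migrate_table():
    bq = bigquery.Client()  # uses Application Default Credentials
    # Pull rows from a (hypothetical) Teradata table.
    with teradatasql.connect(host="td-host", user="td-user", password="***") as con:
        with con.cursor() as cur:
            cur.execute("SELECT order_id, amount, created_at FROM sales.orders")
            columns = [d[0] for d in cur.description]
            # Stringify values so they are JSON-serializable for streaming
            # inserts; BigQuery coerces strings to the target column types.
            rows = [
                {c: (None if v is None else str(v)) for c, v in zip(columns, r)}
                for r in cur.fetchall()
            ]
    # Stream the rows into an existing BigQuery table (placeholder name).
    errors = bq.insert_rows_json("my-project.analytics.orders", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")


if __name__ == "__main__":
    migrate_table()
```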
Required Skills:
- Proficiency in Teradata SQL, query optimization, and Teradata Vantage.
- Familiarity with GCP services such as BigQuery, Cloud Storage, and Dataflow for ETL (Extract, Transform, Load).
- Knowledge of data modeling and performance tuning in Teradata environments.
2. Responsibilities for Dataflow
- Design, implement, and maintain Dataflow pipelines for data processing.
- Optimize Dataflow pipelines for performance and cost efficiency.
- Work with other teams to define data processing requirements and data models.
- Ensure the reliability and scalability of data processing workflows.
- Implement logging, monitoring, and error-handling mechanisms in Dataflow pipelines (see the sketch after this list).
- Troubleshoot and resolve pipeline failures or performance issues.
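To make the error-handling duty concrete, the sketch below is a minimal Apache Beam pipeline of the kind Dataflow runs: records that fail JSON parsing are logged and diverted to a dead-letter output instead of failing the job. The bucket, project, and table names are placeholders, and this is one common pattern, not a prescribed implementation.

```python
# Minimal sketch of a Beam/Dataflow pipeline with a dead-letter output;
# bucket, project, and table names below are placeholders.
import json
import logging

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


class ParseEvent(beam.DoFn):
    def process(self, line):
        try:
            yield json.loads(line)
        except ValueError as exc:
            # Log the failure and route the raw line to the dead-letter tag.
            logging.warning("Unparseable record: %s", exc)
            yield beam.pvalue.TaggedOutput("dead_letter", line)


def run():
    opts = PipelineOptions()  # e.g. --runner=DataflowRunner --project=... on the CLI
    with beam.Pipeline(options=opts) as p:
        results = (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
            | "Parse" >> beam.ParDo(ParseEvent()).with_outputs("dead_letter", main="parsed")
        )
        # Good rows go to an existing BigQuery table; bad rows to GCS for replay.
        results.parsed | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
        results.dead_letter | "WriteDeadLetter" >> beam.io.WriteToText(
            "gs://my-bucket/dead_letter/events"
        )


if __name__ == "__main__":
    run()
```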
Required Skills:
- Experience with Apache Beam (the SDK used by Dataflow for pipeline creation).
- Proficiency with data transformation, processing, and streaming technologies (a streaming sketch follows this list).
- Familiarity with GCP services such as BigQuery, Cloud Storage, Pub/Sub, and others.
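And a hedged streaming counterpart, tying together the Pub/Sub and BigQuery familiarity above: a minimal Beam pipeline that reads JSON messages from a Pub/Sub subscription and appends them to BigQuery. The subscription and table names are placeholders; it would run on the Dataflow runner in streaming mode.

```python
# Streaming sketch: Pub/Sub subscription -> BigQuery append.
# Subscription and table names are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    opts = PipelineOptions()
    opts.view_as(StandardOptions).streaming = True  # unbounded source
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            # Pub/Sub delivers bytes; decode and parse each message.
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events_stream",
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```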