GCP Data Engineer
Job Location

Chennai - India

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

  • Role: GCP Data Engineer
  • Location: Chennai, Hyderabad, and Bengaluru
  • Work Mode: 5 days a week, 100% work from office (WFO)
  • Experience: 5 years (min) to 10 years (max)
  • Notice Period: Immediate, or currently serving notice with a last working day no later than Feb 20
Only interested and relevant candidates should share their CVs to or WhatsApp to
1. Responsibilities For Teradata
  • Data Migration and Integration: Use tools like Dataflow, BigQuery Data Transfer Service, or custom pipelines for migrating data. Design and implement hybrid architectures integrating Teradata with GCP services (e.g., BigQuery, Cloud Storage, Pub/Sub); see the sketch after this list.
  • Use Teradata for high-performance data warehousing and analytics.
  • Work on data migration or hybrid cloud setups where part of the data is processed in Teradata and part in GCP.
  • Optimize queries and carry out performance tuning on Teradata clusters.
  • Ensure data processing efficiency, including resource optimization and cost management, within GCP.
  • Work with cloud-native tools like Dataflow to ingest and export data between Teradata and GCP.
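
As an illustration of the migration pattern above, here is a minimal sketch, assuming a Teradata extract has already been staged as CSV files in Cloud Storage. The project, dataset, table, and bucket names are hypothetical placeholders:

    from google.cloud import bigquery

    # Hypothetical names; substitute your own project, dataset, and bucket.
    TABLE_ID = "my-project.analytics.sales_orders"
    GCS_URI = "gs://my-migration-bucket/teradata-exports/sales_orders/*.csv"

    client = bigquery.Client()

    # Load the staged Teradata extract into BigQuery.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # header row from the export
        autodetect=True,       # infer schema; in practice, pin an explicit schema
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
    load_job.result()  # block until the load job completes
    print(f"Loaded {client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")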
Required Skills:
  • Proficiency in Teradata SQL, query optimization, and Teradata Vantage (see the connection sketch after this list).
  • Familiarity with GCP services such as BigQuery, Cloud Storage, and Dataflow for ETL (Extract, Transform, Load).
  • Knowledge of data modeling and performance tuning in Teradata environments.
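
On the Teradata side, a minimal sketch of inspecting a query plan with EXPLAIN, the usual starting point for query tuning, using Teradata's teradatasql Python driver. The host, credentials, and table are hypothetical:

    import teradatasql

    # Hypothetical connection details.
    with teradatasql.connect(host="tdprod.example.com", user="etl_user", password="***") as con:
        with con.cursor() as cur:
            # EXPLAIN returns the optimizer's plan without executing the query.
            cur.execute(
                "EXPLAIN SELECT order_id, SUM(amount) FROM sales.orders GROUP BY order_id"
            )
            for row in cur.fetchall():
                print(row[0])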
2. Responsibilities For Dataflow
  • Design, implement, and maintain Dataflow pipelines for data processing.
  • Optimize Dataflow pipelines for performance and cost efficiency.
  • Work with other teams to define data processing requirements and data models.
  • Ensure the reliability and scalability of data processing workflows.
  • Implement logging, monitoring, and error-handling mechanisms in Dataflow pipelines (see the sketch after this list).
  • Troubleshoot and resolve pipeline failures or performance issues.
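
A minimal Apache Beam sketch of the logging and error-handling pattern above: malformed records are logged and routed to a dead-letter output instead of failing the whole pipeline. The input and output paths are hypothetical; the same code runs on Dataflow by passing --runner=DataflowRunner in the pipeline options:

    import json
    import logging

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    class ParseRecord(beam.DoFn):
        """Parse a JSON line; tag failures as 'dead_letter' instead of crashing."""
        def process(self, line):
            try:
                yield json.loads(line)
            except ValueError:
                logging.warning("Unparseable record: %r", line)
                yield beam.pvalue.TaggedOutput("dead_letter", line)

    with beam.Pipeline(options=PipelineOptions()) as p:
        results = (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.json")  # hypothetical path
            | "Parse" >> beam.ParDo(ParseRecord()).with_outputs("dead_letter", main="parsed")
        )
        results.parsed | "WriteGood" >> beam.io.WriteToText("gs://my-bucket/output/good")
        results.dead_letter | "WriteBad" >> beam.io.WriteToText("gs://my-bucket/output/dead_letter")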
Required Skills:
  • Experience with Apache Beam (the SDK Dataflow uses for pipeline creation).
  • Proficiency with data transformation, processing, and streaming technologies.
  • Familiarity with GCP services such as BigQuery, Cloud Storage, Pub/Sub, and others (see the streaming sketch after this list).
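
For the streaming side, a minimal sketch of a Pub/Sub-to-BigQuery streaming pipeline; the subscription, table, and schema are hypothetical, and streaming=True is required for the unbounded Pub/Sub source:

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)  # unbounded source needs streaming mode

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")  # hypothetical
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",                 # hypothetical table
                schema="event_id:STRING,ts:TIMESTAMP,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )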

Employment Type

Full Time
