GCP Data Engineer

Job Location: India

Monthly Salary: Not Disclosed

Job Description

Position: GCP Data Engineer

Location: AR Remote

Duration: 12 Months


The client is looking for a strong developer to build ML pipelines on Kubeflow, connecting components from BigQuery, Google Cloud Storage, and Dataproc, and deploying API endpoints. The ideal candidate is an expert in leading projects that develop and test data pipelines, drive data analytics efforts, proactively identify and resolve issues, and build alerting mechanisms using traditional, new, and emerging technologies.
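The flow described above (BigQuery extract, staging to Cloud Storage, a Dataproc transform) can be sketched as plain Python stages. This is an illustrative outline only, not the client's implementation: every name, bucket, and return value is a hypothetical stand-in, and a real build would use the kfp SDK together with the google-cloud client libraries.

```python
# Illustrative sketch of the BigQuery -> GCS -> Dataproc flow, modeled
# as plain Python stages. All identifiers here are hypothetical
# placeholders, not real GCP resources.

def extract_from_bigquery(query: str) -> list[dict]:
    """Stand-in for a BigQuery extract; returns fake rows."""
    return [{"user_id": 1, "spend": 40.0}, {"user_id": 2, "spend": 55.0}]

def stage_to_gcs(rows: list[dict], bucket: str) -> str:
    """Stand-in for writing rows to Cloud Storage; returns the object URI."""
    return f"gs://{bucket}/staged/{len(rows)}_rows.json"

def run_dataproc_job(input_uri: str) -> dict:
    """Stand-in for a Dataproc (Spark) transform keyed by the staged URI."""
    return {"input": input_uri, "status": "DONE"}

def pipeline(query: str, bucket: str) -> dict:
    """Chain the three stages, mirroring how Kubeflow wires components."""
    rows = extract_from_bigquery(query)
    uri = stage_to_gcs(rows, bucket)
    return run_dataproc_job(uri)
```

In a Kubeflow deployment each function would become a containerized component and the `pipeline` function a `@dsl.pipeline`, with data passed between steps via artifact URIs rather than in-process values.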

  • Assembling large, complex data sets that meet functional and non-functional requirements
  • Identifying, designing, and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes
  • Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources
  • Building analytical tools that utilize the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition
  • Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues
  • Working on a multi-vendor, multi-geo team supporting a complex enterprise environment
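As a hedged illustration of the "analytical tools" responsibility above, a customer-acquisition metric computed over pipeline output might look like the following. The event schema (a `converted` flag per session) is an assumption for the example, not something stated in the posting.

```python
def acquisition_rate(events: list[dict]) -> float:
    """Fraction of tracked sessions that converted to a new customer.

    `events` uses a hypothetical schema: one dict per session with a
    boolean `converted` flag.
    """
    if not events:
        return 0.0
    conversions = sum(1 for e in events if e.get("converted"))
    return conversions / len(events)
```

In practice a metric like this would be computed in BigQuery SQL over pipeline output tables and surfaced in a dashboard rather than in application code.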

Mandatory Skill Sets
The candidate will be expected to build ML pipelines on Kubeflow, connecting components from BigQuery, Google Cloud Storage, and Dataproc, and to deploy API endpoints.

Technology expectations

  • 3 years of experience with BigQuery
  • 3 years of experience with Kubeflow and Dataproc
  • 3 years of hands-on experience in API deployment and 1 year of experience with Kubernetes
  • 3 years of experience in a GCP environment
  • Good knowledge of DevOps practices and Python
  • Strong verbal and written communication skills to effectively share findings with stakeholders
  • Good understanding of web-based application development
  • Should be an independent contributor from day one, able to operate with minimal to no supervision
  • A bachelor's degree in computer science or a relevant field is mandatory
  • Should be hands-on and able to work in an agile, fast-moving environment
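The API-deployment requirement above can be sketched as a minimal JSON prediction handler in plain Python. The model is a stub and the request schema (`spend` field) is hypothetical; in a real deployment this logic would sit behind a web framework and be containerized for Kubernetes.

```python
import json

def predict(features: dict) -> float:
    """Stub standing in for a trained model's scoring function.

    The 0.5 weight and the `spend` feature are invented for the example.
    """
    return 0.5 * features.get("spend", 0.0)

def handle_request(body: str) -> str:
    """Parse a JSON request body and return a JSON prediction response."""
    features = json.loads(body)
    return json.dumps({"prediction": predict(features)})
```

Wrapping `handle_request` in a framework route and packaging it in a container image is what the Kubernetes experience in the list above would cover.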



Employment Type

Full Time

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We make certain that our clients never request money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the Contact Us page.