
GCP Data Engineer (C2H)

Employer Active


Job Location

Bangalore/Bengaluru - India

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Key roles and responsibilities:

Delivering end-to-end data engineering projects, particularly in geospatial and financial applications, along with big data compute and cost optimisation.

Developing complex ELT and ETL data pipelines with performance-optimised data modelling.

Driving architectural and technical solutions to meet business, non-functional, and strategic requirements.

Performing data transformation across data warehousing and data wrangling.

Automating data processes to improve the speed and effectiveness of extracting, analysing, and using data.

Google Cloud Ingestion:

Experience with BigQuery, Cloud Storage, or equivalent cloud platforms

Knowledge of BigQuery ingress and egress patterns

Experience in writing Airflow DAGs (a DAG sketch follows this list)


Knowledge of Pub/Sub, Dataflow, or other declarative data pipeline tools, using batch and streaming ingestion (a publish sketch also follows below)
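To illustrate the Airflow skill named above, here is a minimal sketch of a scheduled ingestion DAG, assuming Airflow 2.x; the DAG ID and the load_batch callable are hypothetical placeholders, not part of this role's actual codebase:

```python
# Minimal Airflow 2.x DAG sketch; dag_id and the task body are
# hypothetical illustrations of a daily batch-ingestion workflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_batch():
    # Placeholder for an ingestion step, e.g. a BigQuery load job.
    print("loading a batch into the warehouse")


with DAG(
    dag_id="daily_ingestion",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",        # run once per day
    catchup=False,                     # skip backfill of past runs
) as dag:
    load = PythonOperator(
        task_id="load_batch",
        python_callable=load_batch,
    )
```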
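And for the streaming-ingestion side, a minimal Pub/Sub publish sketch using the official google-cloud-pubsub client; the project and topic IDs are hypothetical:

```python
# Publish one message to a Pub/Sub topic; project and topic names
# are hypothetical examples.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingestion-topic")

# Messages are bytes; publish() returns a future that resolves to
# the server-assigned message ID once the publish is acknowledged.
future = publisher.publish(topic_path, b'{"event": "row_batch"}')
print(future.result())
```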

Other GCP services: Vertex AI, Model Registry, Secret Manager, KMS, Composer, Kubeflow, Container Registry, Artifact Registry, Cloud Build, Cloud Run, OAuth 2.0, Scheduler, GKE, MIG, Cloud Functions, Pub/Sub

Extensive experience with Google Cloud Platform (GCP) and related services (e.g. IAM, BigQuery, Cloud Storage, Cloud Functions, Compute Engine).

Creating data models and standard patterns for big data (ingestion, storage, analytics, etc.)

Data Warehousing (BigQuery):

Strong SQL and data analysis skills

Experience on BigQuery using partitions, clustering, arrays (at least one level), and structs

Understanding of BigQuery quotas, service accounts, and query types/patterns (append, replace, partition types, UNNEST, aggregating to create arrays, analytic functions)

Ability to write and optimise queries on BigQuery using best practices

Experience in the design and implementation of ELT or dataflow pipelines (using Google Dataflow or a similar tool)

Understanding of nested data sources (JSON)

Exposure to and understanding of big data processing

Data modelling experience to define the target data warehouse model from the source through to the business/reporting layer
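As a concrete illustration of the query patterns listed above (partition pruning, UNNEST over a repeated field), here is a sketch using the google-cloud-bigquery Python client; the project, table, and column names are hypothetical:

```python
# Query a partitioned table and flatten a repeated field with UNNEST;
# all identifiers below are hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT t.customer_id, tag
FROM `my-project.analytics.events` AS t,   -- hypothetical table
     UNNEST(t.tags) AS tag                 -- flatten a repeated field
WHERE t.event_date = @day                  -- prune by partition column
"""

job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01"),
        ]
    ),
)
for row in job.result():
    print(row.customer_id, row.tag)
```

Filtering on the partition column, as here, is what keeps scan costs bounded on large tables.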

Proficient ETL/ELT experience in workflows, data modelling, and data pipelines

Programming:

Strong Python programming experience using the Google Cloud Python client libraries (a sketch follows below)

Unix shell scripting experience to automate operational activities
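As a sketch of the kind of operational automation this asks for, here is a minimal example with the google-cloud-storage client; the bucket name and prefix are hypothetical:

```python
# List and download objects under a prefix; bucket and prefix are
# hypothetical examples of a routine operational task.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-ingestion-bucket")

# Iterate over objects under a date-partitioned prefix and pull
# each one to the local working directory.
for blob in client.list_blobs(bucket, prefix="exports/2024-01-01/"):
    blob.download_to_filename(blob.name.split("/")[-1])
    print("downloaded", blob.name)
```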
Visualisation:

Analytic visualisations through Data Studio (or equivalent): time series, box plots, drill-downs

Experience in real-time analytics

Budget: 8.7 LPA (72,000 per month)

Employment Type

Full Time

