Senior GCP Data Engineer

Employer Active

1 Vacancy
Job Location

Bolingbrook - USA

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

The GCP Data Engineer is responsible for analysis, design, development, testing, and deployment support for new frameworks, and for enhancing the existing frameworks that power the data platform's data pipelines. This position requires working independently in a highly dynamic, fast-paced environment. The person will work alongside architects, engineers, analysts, and PMs to deliver scalable, robust, and innovative technical solutions. This position plays a key role in building real-time and batch data ingestion and egress frameworks and a streaming analytics framework, and in supporting the AI platform. The candidate must have comparable experience in a prior role.

RESPONSIBILITIES

Build frameworks for large-scale data processing, evaluating appropriate emerging technologies and approaches that will power data-driven capabilities across the enterprise.

Develop data solutions on Google Cloud Platform, leveraging Google Dataflow, Dataproc, Composer, Pub/Sub, BigQuery, GCS, and Cloud Functions; define workflows for scheduled and event-driven workloads to ingest data from internal/external partners, data distribution channels, etc.

Build features using Python in Spark, leveraging GCP's Spark engine (Dataproc), and SQL/GSQL on Google BigQuery.

Build and maintain scalable data pipelines to handle high-volume data (e.g., 150 million rows).

Collaborate with cross-functional technologists across the organization to gather requirements, solve new problems, and deliver quality results.

Coordinate with offshore engineers to complete projects and tasks.

Develop and execute test plans to validate the implementation and performance of frameworks, and recommend performance improvements.

Support the operations of the deployed solutions, investigate complex issues, and assist with the resolution and implementation of preventive measures.
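The event-driven ingestion workloads mentioned above typically start by decoding a Pub/Sub message. As a minimal, framework-free sketch (the field names follow the standard Pub/Sub push envelope; the `order_id` payload and `source` attribute are purely illustrative assumptions, not part of this posting):

```python
import base64
import json

def decode_pubsub_push(envelope: dict) -> dict:
    """Decode the JSON payload of a Pub/Sub push-style message envelope.

    Pub/Sub delivers push messages as {"message": {"data": <base64>, ...}},
    where "data" is a base64-encoded byte string. This is only the first
    step of an event-driven ingestion workload; a real Cloud Function
    would receive this envelope over HTTP and then route the record on.
    """
    message = envelope["message"]
    payload = base64.b64decode(message["data"]).decode("utf-8")
    record = json.loads(payload)
    # Carry delivery attributes (e.g. the source partner) along with the row.
    record["_attributes"] = message.get("attributes", {})
    return record

# Example: an envelope shaped like what Pub/Sub POSTs to a push endpoint.
envelope = {
    "message": {
        "data": base64.b64encode(b'{"order_id": 42, "total": 19.99}').decode(),
        "attributes": {"source": "partner-feed"},
    }
}
print(decode_pubsub_push(envelope))
```

In practice the decoded record would then be validated and written to a sink such as BigQuery or GCS, per the services listed above.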

Required Skills:

REQUIREMENTS FOR CONSIDERATION:

Bachelor's degree in Computer Science, Engineering, or a related field.

8 years of experience in data engineering, with a focus on GCP technologies.

Proficiency in Google Cloud Platform, leveraging Google Dataflow, Dataproc, Composer, Pub/Sub, BigQuery, GCS, and Cloud Functions to define workflows for scheduled and event-driven workload processing.

Hands-on experience and solid knowledge in building and maintaining end-to-end data pipelines using Python and Spark (GCP Dataproc) or other GCP services.

Strong experience with BigQuery, including SQL and stored procedures.

Experience with high-volume data processing (GB-level).

Previous experience working in a similar role with GCP services is a must.

Experience with data quality and with continuous integration, build, and deployment processes using GitHub, Jenkins, and Unix/Linux shell scripts.

Proactive, catching issues before they cause failures.

Strong work ethic; takes pride in producing a quality product and is a strong team player.

Works with production support and project consultants in an onshore/offshore model.

Supports off-hours platform issues and code deployments as needed.
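The high-volume processing requirement above usually comes down to loading rows in bounded batches rather than materializing everything in memory. A minimal sketch of that batching pattern, using only the standard library (the batch size and row source are illustrative assumptions, not part of the posting):

```python
from itertools import islice
from typing import Iterable, Iterator, List

def batched(rows: Iterable, batch_size: int) -> Iterator[List]:
    """Yield successive lists of at most batch_size rows.

    This is the core pattern behind streaming very large row counts
    (e.g. 150 million rows) into a sink such as BigQuery in bounded
    chunks, keeping memory use constant regardless of input size.
    """
    it = iter(rows)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Example: 10 rows in batches of 4 yield batch sizes [4, 4, 2].
sizes = [len(b) for b in batched(range(10), 4)]
print(sizes)
```

Each yielded batch would then be handed to a load call (e.g. a BigQuery insert), so failures can be retried per batch instead of per run.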

Employment Type

Full Time

Company Industry

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.