
GCP Data Analyst

Employer Active


Job Location

US - France

Monthly Salary

Not Disclosed


Job Description

Top Skills Details

Extensive knowledge of data principles, patterns, processes, and practices
Databricks
SQL
Python
Google Cloud Platform (GCP) understanding (especially Dataflow, Pub/Sub, Composer)
Tech skills aside, candidates should be excellent communicators and comfortable working remotely

Secondary Skills / Nice-to-Haves


Team


It was stressed to us today that 6/24, or 6/26 at the latest, would need to be the start date. That means we need to look at candidates who are immediately available or who can put in a two-week notice to start by 6/26 at the latest.

So the first two needs are:

1) GCP Tech Lead / Data Engineer (proven experience leading other team members, validated on resume/references)
2) GCP Data Analyst


As mentioned, this sits in the Supply Chain space. The manager provides a great level of detail below about the scope of work, the business challenge they're attempting to solve, and the associated tech stack.

Kroger Supply Chain Initiative

Description of the project: End-to-End (E2E) Fresh Cold Chain Data Team

What are the primary goals and deliverables expected from this project:
The team will be part of the E2E Fresh domain. E2E Fresh is responsible for perishable products like produce, meat, seafood, and dairy. Specifically, this team will be responsible for data related to the refrigerated supply chain, or "Cold Chain" as we call it. Cold Chain data includes data from temperature sensors, purchase orders, warehousing, and transportation routing and location. The overall goal is to use this data to increase the freshness of food at the stores.
Ingestion pipelines for data from internal and external applications.
Curation and presentation of application data to internal consumers.
Join data from multiple systems to create Fresh Domain Data.
Investigate and mitigate data quality issues.
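The "join data from multiple systems" deliverable can be pictured with a minimal sketch: merging purchase-order records with per-shipment temperature readings keyed on a shipment ID. All field names here are hypothetical illustrations, not the team's actual schema:

```python
# Hypothetical sketch: combine purchase-order rows with the maximum
# in-transit temperature observed per shipment to form a "Fresh Domain"
# record. Field names (shipment_id, temp_f) are illustrative only.

def build_fresh_domain(purchase_orders, sensor_readings):
    """Attach each PO's max in-transit temperature, keyed by shipment_id."""
    max_temp = {}
    for reading in sensor_readings:
        sid = reading["shipment_id"]
        max_temp[sid] = max(max_temp.get(sid, float("-inf")), reading["temp_f"])
    return [
        {**po, "max_transit_temp_f": max_temp.get(po["shipment_id"])}
        for po in purchase_orders
    ]
```

In a real pipeline this join would run in Dataflow or Databricks rather than in-memory Python, but the shape of the cross-system join is the same.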

The journey of data enabling E2E Fresh will usually follow a path of Analytics (to discover improvement opportunities), Reporting (to allow operations to target improvements), and Event (real-time) Data (to participate in operations). A hypothetical example would be:
Analytics reveals that strawberries are frequently subjected to high temperatures in transit from distribution center to stores. Management knows they have a problem and assigns operations to investigate.
Reports are given to operations managers to help them investigate and try an operational change, like a temperature check when the truck leaves the yard. Reports allow them to monitor and evaluate the effectiveness of the change on strawberry freshness.
The changes might mitigate some instances, but not all. So transportation and/or stores are given real-time data, which allows them to be alerted that a truck en route to the store has a high temperature. They turn the truck around before it even hits the store's loading dock, and a replacement order is processed. This saves employee time and speeds up the replenishment process, resulting in fresher strawberries in the store.
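The real-time decision in the strawberry example boils down to a threshold check on streaming sensor readings. A minimal sketch of that decision logic, with a hypothetical threshold value and function name (the actual system would receive readings via something like Pub/Sub):

```python
# Hypothetical sketch of the turnaround decision: flag a shipment when
# any in-transit reading breaches the product's temperature ceiling.
# The threshold value is illustrative, not an actual cold-chain spec.

STRAWBERRY_MAX_F = 41.0  # illustrative ceiling for this example

def should_turn_around(readings_f, threshold_f=STRAWBERRY_MAX_F):
    """Return True if any in-transit reading exceeds the threshold."""
    return any(temp > threshold_f for temp in readings_f)
```

Hooking such a check to a Pub/Sub subscription per truck would produce the alert described above before the truck reaches the loading dock.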

What is the start date and expected duration: ASAP start (ideally 6/24). The team will be intact for the foreseeable future, as this is a long-term initiative.

Are there key milestones or deadlines that the team should be aware of: KTD uses the OKR management model, and for the upcoming quarter there is some discovery and ingestion work. Typically, though, the deliverables will be: integration pipelines for data sources, cross-domain data set creation, and presentation of data via data lake or API.

Any specific location requirements: Team members will be fully remote and can be anywhere in the US, as long as they are OK working Eastern Time hours; the team will keep ET hours.

What specific technical skills and tools should the candidates possess:
Extensive knowledge of data principles, patterns, processes, and practices
Databricks
SQL
Python
Google Cloud Platform (GCP) understanding (especially Dataflow, Pub/Sub, Composer)
Tech skills aside, candidates should be excellent communicators and comfortable working remotely

The team will have a dedicated Product Manager and be directly supported by a Scrum Master, Solution Architect, and Data Architect.

Employment Type

Full Time

