Sr GCP Data Engineer (W2) - Hybrid

Employer Active

Job Location

Dearborn, MI - USA

Monthly Salary

Not Disclosed

Job Description

We are looking for a GCP Data Engineer in Dearborn, MI (Hybrid) for a 12-month contract. Please go through the job description below and, if interested, share your resume ASAP.

Job Title: Specialty Development Consultant/Expert - GCP Data Engineer (W2 Position)

Location: Dearborn, MI (Hybrid)

Duration: 12 Months

Client: Ford Motors

Position Description:

Materials Management Platform (MMP) is a multi-year transformation initiative aimed at transforming Ford's Materials Requirement Planning & Inventory Management capabilities. It is part of a larger Industrial Systems IT Transformation effort. This position is responsible for designing and deploying a data-centric architecture in GCP for the Materials Management Platform, which would exchange data with multiple applications, both modern and legacy, across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

Skills Required:

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs.
  • Build ETL pipelines to ingest data from heterogeneous sources into our system.
  • Develop data processing pipelines using programming languages such as Java and Python to extract, transform, and load (ETL) data.
  • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
  • Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle.
  • Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
  • Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
  • Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.
  • Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
  • Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.

Experience Required:

  • 8 years of professional experience in data engineering, data product development, and software product launches; at least three of the following languages: Java, Python, Spark, Scala, and SQL; and experience with performance tuning.
  • 4 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
    • Data warehouses like Google BigQuery.
    • Workflow orchestration tools like Airflow.
    • Relational database management systems like MySQL, PostgreSQL, and SQL Server.
    • Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
    • Microservices architecture to deliver large-scale, real-time data processing applications.
    • REST APIs for compute, storage, operations, and security.
    • DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, and Docker.
    • Project management tools like Atlassian JIRA.
  • Automotive experience is preferred.
  • Support in an onshore/offshore model is preferred.
  • Excellent at problem solving and prevention.
  • Knowledge and practical experience of agile delivery.

Experience Preferred:

  • Experience with IDoc processing, APIs, and SAP data migration projects.
  • Experience working in an SAP S/4HANA environment.

Education Required:

  • Requires a bachelor's or foreign equivalent degree in computer science, information technology, or a technology-related field.

Additional Information:

GCP certification preferred. Ford experience preferred. Hybrid, with up to 4 days a week on site.

Employment Type

Full Time
