
GCP Cloud Data Engineer

Employer Active

1 Vacancy
Job Location

Remote - Norway

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Job Title: GCP Cloud Data Engineer
Location: Remote
Experience: 2 to 7 years
Job Description:
We are seeking a talented Cloud Data Engineer to join our team on a remote basis.
The ideal candidate will have a strong background in cloud migration, building ETL pipelines, data integration, and the development of Operational Data Stores (ODS) and Data Warehouses (DW).
If you are passionate about leveraging cutting-edge technologies to solve complex data challenges, we want to hear from you.
Responsibilities:
Cloud Migration: Lead the migration of on-premises data systems to cloud-based solutions, ensuring scalability, reliability, and efficiency.
ETL Pipeline Development: Design, develop, and maintain robust Extract, Transform, Load (ETL) pipelines to process large volumes of data from various sources into our data ecosystem.
Data Integration: Implement seamless integration between different data sources and platforms, enabling unified access to critical business data.
Operational Data Store (ODS) and Data Warehouse (DW) Development: Architect and build ODS and DW solutions to support analytics, reporting, and decision-making processes.
Big Data Technologies: Utilize advanced technologies such as BigQuery, Kafka, Google Cloud Storage (GCS), and REST APIs to drive data engineering initiatives forward.
Performance Optimization: Continuously optimize data pipelines and storage solutions for improved performance, reliability, and cost-effectiveness.
Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver solutions that meet business needs.
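For illustration, the ETL pipeline work described above follows a common extract-transform-load pattern. Below is a minimal, self-contained sketch in pure Python; all field names are hypothetical, and the in-memory list stands in for a warehouse table (a real pipeline would typically read from sources such as GCS and load into BigQuery):

```python
# Minimal ETL sketch. Field names (order_id, amount, currency) are
# illustrative assumptions, not taken from any real schema.

def extract(raw_rows):
    """Extract: parse raw CSV-like strings into dicts."""
    for line in raw_rows:
        order_id, amount, currency = line.split(",")
        yield {"order_id": order_id, "amount": float(amount), "currency": currency}

def transform(records):
    """Transform: drop invalid amounts and normalize currency codes."""
    for rec in records:
        if rec["amount"] <= 0:
            continue  # filter out bad rows
        rec["currency"] = rec["currency"].strip().upper()
        yield rec

def load(records, table):
    """Load: append transformed records to the destination table."""
    for rec in records:
        table.append(rec)
    return table

warehouse_table = []
raw = ["A1,19.99, usd", "A2,-5.00,usd", "A3,42.50,EUR "]
load(transform(extract(raw)), warehouse_table)
```

The same three-stage structure carries over to managed services: in a GCP setting the load step would usually hand records to a BigQuery client rather than a Python list.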
Requirements:
Bachelor's degree or higher in Computer Science, Engineering, or a related field.
Proven experience in cloud data engineering, with a focus on cloud migration, ETL pipeline development, and data integration.
Strong proficiency in cloud platforms such as Google Cloud Platform (GCP), particularly BigQuery, Google Cloud Storage (GCS), and related services.
Hands-on experience with streaming data technologies like Kafka for real-time data processing and analysis.
Familiarity with RESTful APIs for integrating data from external sources into internal systems.
Solid understanding of data modeling concepts and experience with relational and non-relational databases.
Excellent problem-solving skills and the ability to thrive in a fast-paced, dynamic environment.
Strong communication skills, with the ability to collaborate effectively with team members remotely.
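To illustrate the REST-integration and data-modeling skills listed above, here is a sketch of flattening a nested JSON payload (the shape a REST API might return) into flat rows suitable for a relational warehouse table. The payload structure and field names are invented for the example:

```python
import json

def flatten_orders(payload_json):
    """Flatten a nested API payload into one flat row per line item,
    the shape a relational warehouse table would expect.
    The orders/items/customer structure is a hypothetical example."""
    payload = json.loads(payload_json)
    rows = []
    for order in payload["orders"]:
        for item in order["items"]:
            rows.append({
                "order_id": order["id"],
                "customer": order["customer"]["name"],
                "sku": item["sku"],
                "qty": item["qty"],
            })
    return rows

sample = json.dumps({
    "orders": [
        {"id": "A1", "customer": {"name": "Acme"},
         "items": [{"sku": "S1", "qty": 2}, {"sku": "S2", "qty": 1}]},
    ]
})
rows = flatten_orders(sample)
```

Denormalizing nested responses into flat rows like this is a routine step before loading into a columnar warehouse such as BigQuery.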
Preferred Qualifications:
Experience with containerization technologies such as Docker and orchestration tools like Kubernetes.
Certification in Google Cloud Platform or relevant cloud technologies.
Knowledge of machine learning concepts and experience with ML pipeline development is a plus.
Familiarity with agile methodologies and DevOps practices for continuous integration and deployment (CI/CD).
Previous experience in a remote work environment or distributed team setup.

google cloud, cloud, ml, pipeline, google, integration, etl

Employment Type

Full Time

Company Industry

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, and we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact-us page.