GCP Cloud Data Engineer

Employer Active

1 Vacancy

Job Location

Re - Norway

Monthly Salary

Not Disclosed

Number of Vacancies

1 Vacancy

Job Description

Job Title: GCP Cloud Data Engineer
Location: Remote
Experience: 2 to 7 years
Job Description:
We are seeking a talented Cloud Data Engineer to join our team on a remote basis.
The ideal candidate will have a strong background in cloud migration, building ETL pipelines, data integration, and the development of Operational Data Stores (ODS) and Data Warehouses (DW).
If you are passionate about leveraging cutting-edge technologies to solve complex data challenges, we want to hear from you.
Responsibilities:
Cloud Migration: Lead the migration of on-premises data systems to cloud-based solutions, ensuring scalability, reliability, and efficiency.
ETL Pipeline Development: Design, develop, and maintain robust Extract, Transform, Load (ETL) pipelines to process large volumes of data from various sources into our data ecosystem (a brief sketch of one such load follows this list).
Data Integration: Implement seamless integration between different data sources and platforms, enabling unified access to critical business data.
Operational Data Store (ODS) and Data Warehouse (DW) Development: Architect and build ODS and DW solutions to support analytics, reporting, and decision-making processes.
Big Data Technologies: Utilize advanced technologies such as BigQuery, Kafka, Google Cloud Storage (GCS), and REST APIs to drive data engineering initiatives forward.
Performance Optimization: Continuously optimize data pipelines and storage solutions for improved performance, reliability, and cost-effectiveness.
Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver solutions that meet business needs.
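As an illustration of the kind of batch ETL work described above, here is a minimal sketch of loading a file from Google Cloud Storage into BigQuery with the google-cloud-bigquery Python client; the project, bucket, and table names are hypothetical placeholders, not part of this posting.

```python
# Minimal sketch of a GCS -> BigQuery batch load, assuming a hypothetical
# bucket, dataset, and table. Requires the google-cloud-bigquery package and
# application default credentials.
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical source file and destination table.
source_uri = "gs://example-bucket/exports/orders_2024.csv"
table_id = "example-project.analytics.orders"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and wait for it to complete.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```

In practice a load like this would typically be scheduled and parameterized (for example via Cloud Composer or Cloud Scheduler), with an explicit schema rather than autodetection for production tables.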
Requirements:
Bachelor's degree or higher in Computer Science, Engineering, or a related field.
Proven experience in cloud data engineering, with a focus on cloud migration, ETL pipeline development, and data integration.
Strong proficiency in cloud platforms such as Google Cloud Platform (GCP), particularly BigQuery, Google Cloud Storage (GCS), and related services.
Hands-on experience with streaming data technologies like Kafka for real-time data processing and analysis (see the consumer sketch after this list).
Familiarity with RESTful APIs for integrating data from external sources into internal systems.
Solid understanding of data modeling concepts and experience with relational and non-relational databases.
Excellent problem-solving skills and the ability to thrive in a fast-paced, dynamic environment.
Strong communication skills, with the ability to collaborate effectively with team members remotely.
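For the streaming requirement above, the following is a minimal sketch of consuming messages from a Kafka topic with the kafka-python client; the topic name, broker address, and consumer group are hypothetical placeholders, and the transform/write step is only indicated in a comment.

```python
# Minimal sketch of real-time consumption from a Kafka topic, assuming a
# hypothetical topic, broker, and consumer group. Requires the kafka-python package.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",                            # hypothetical topic
    bootstrap_servers=["localhost:9092"],      # hypothetical broker
    group_id="ods-loader",                     # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # In a real pipeline the event would be validated, transformed, and written
    # to an ODS table or staged to GCS for a downstream BigQuery load.
    print(f"partition={message.partition} offset={message.offset} value={event}")
```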
Preferred Qualifications:
Experience with containerization technologies such as Docker and orchestration tools like Kubernetes.
Certification in Google Cloud Platform or relevant cloud technologies.
Knowledge of machine learning concepts and experience with ML pipeline development is a plus.
Familiarity with agile methodologies and DevOps practices for continuous integration and deployment (CI/CD).
Previous experience in a remote work environment or distributed team setup.

Keywords: google cloud, cloud, ml, pipeline, google, integration, etl

Employment Type

Full Time

About the Company

Disclaimer: Drjobs is only a platform that connects job seekers with employers. We advise applicants to conduct their own independent research into the credentials of the prospective employer. We make sure that no financial payments are requested by our clients, and we therefore advise against sharing any personal or bank-account information with any third party. If you suspect any fraud or misconduct, please contact us by filling out the form on the Contact Us page.