This is a remote position.
We seek a skilled Data Engineer to join our dynamic IT & Data Analytics team. In this role, you will be responsible for designing, implementing, and optimizing data pipelines and analytics solutions on Google Cloud Platform (GCP). You will work closely with cross-functional teams to support data-driven decision-making within the organization.
Key Responsibilities:
- Develop and maintain data pipelines using GCP services such as BigQuery, Cloud Run, Cloud Functions, Pub/Sub, Dataflow, and Cloud Composer (an illustrative sketch follows this list).
- Write efficient SQL queries to manipulate and analyze large datasets (see the parameterized-query sketch after this list).
- Utilize Python for data processing, transformation, and automation tasks.
- Implement data transformation workflows using dbt (data build tool).
- Manage infrastructure as code using Terraform for efficient deployment and scaling.
- Collaborate with stakeholders to gather requirements and provide insights derived from data analytics.
- Communicate technical concepts and data insights effectively to technical and non-technical audiences.
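To give a flavor of the pipeline work described above, here is a minimal sketch of a Pub/Sub-triggered Cloud Function (2nd gen) that streams incoming JSON messages into a BigQuery table. The project, dataset, and table names are placeholders for illustration only, not part of this posting.

```python
# Minimal sketch (assumed names): a Pub/Sub-triggered Cloud Function that
# decodes a JSON message and appends it as a row to a BigQuery table.
import base64
import json

import functions_framework
from google.cloud import bigquery

# Hypothetical destination table; replace with your own project/dataset/table.
TABLE_ID = "my-project.analytics.events"

bq_client = bigquery.Client()


@functions_framework.cloud_event
def ingest_event(cloud_event):
    """Decode the Pub/Sub message payload and stream it into BigQuery."""
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)

    # insert_rows_json streams rows; any errors come back as a list of dicts.
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```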
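The SQL and Python items above often come together as parameterized BigQuery queries run from Python. The sketch below assumes a hypothetical `orders` table and columns, purely for illustration.

```python
# Minimal sketch (assumed table and columns): run a parameterized
# aggregation query against BigQuery from Python.
import datetime

from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.analytics.orders`      -- hypothetical table
    WHERE order_date >= @start_date
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 100
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1)),
    ]
)

# Query parameters avoid string interpolation and keep queries safe and reusable.
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_spend)
```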
Requirements:
- Proficient in GCP services, especially BigQuery, Cloud Run, Cloud Functions, Pub/Sub, Dataflow, and Cloud Composer.
- Strong SQL skills with experience in writing complex queries for data analysis.
- Proficient in Python for data manipulation and processing.
- Experience with dbt for data transformation and modeling.
- Familiarity with Terraform for infrastructure management.
- Excellent communication skills, with the ability to explain technical concepts to a non-technical audience.
- Strong problem-solving abilities and attention to detail.
- Ability to work collaboratively in a fast-paced environment.