Overview
The GCP Data Engineer plays a pivotal role in our organization by managing and optimizing data flows from multiple sources into the Google Cloud Platform. This position is integral to developing data processing solutions that empower our analytics team to derive insights swiftly. The GCP Data Engineer ensures the reliability, availability, and security of data, enabling data-driven decision-making across the company. As part of a collaborative team, the engineer will work closely with data scientists, analysts, and stakeholders to gather requirements, design efficient pipelines, and support various data initiatives. Positioned at the intersection of technology and analytics, this role is crucial for maintaining the competitive edge of our data operations in an increasingly data-centric world.
Key Responsibilities
- Design and implement scalable data pipelines on Google Cloud Platform (GCP).
- Develop ETL processes to integrate data from various sources into BigQuery.
- Optimize SQL queries for performance improvements.
- Maintain and troubleshoot existing data infrastructure.
- Collaborate with data analysts and stakeholders to gather business requirements.
- Monitor data quality and ensure data integrity across platforms.
- Create and manage Cloud Storage and databases in GCP.
- Perform data modeling and schema design to support reporting needs.
- Document data flow processes and architecture designs.
- Implement data visualization solutions as required by stakeholders.
- Participate in code reviews and ensure adherence to best practices.
- Evaluate new GCP services and tools for data engineering solutions.
- Train and mentor junior engineers in data practices.
- Collaborate with DevOps to manage cloud infrastructure.
- Conduct performance tuning of data processes and queries.
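As a rough illustration of the extract-transform-load pattern the responsibilities above describe, here is a minimal Python sketch. All field names are hypothetical, and the in-memory `sink` list stands in for a real destination such as a BigQuery table; a production pipeline would instead use a client library (e.g., `google-cloud-bigquery`) plus orchestration and error handling.

```python
from datetime import date

# Toy "source" records, as they might arrive from an upstream system.
SOURCE_ROWS = [
    {"user_id": "42", "signup_date": "2024-03-01", "plan": "PRO"},
    {"user_id": "43", "signup_date": "2024-03-02", "plan": "free"},
]

def transform(row):
    """Normalize types and casing before loading into the warehouse."""
    return {
        "user_id": int(row["user_id"]),
        "signup_date": date.fromisoformat(row["signup_date"]),
        "plan": row["plan"].lower(),
    }

def run_pipeline(source_rows, sink):
    """Extract -> transform -> load; `sink` stands in for a warehouse table."""
    for row in source_rows:
        sink.append(transform(row))
    return sink

warehouse = run_pipeline(SOURCE_ROWS, [])
```

The same shape (typed transform between a source and a sink) scales up to the scheduled, monitored pipelines this role is responsible for.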
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a related role.
- Strong experience with Google Cloud Platform (GCP).
- Proficiency in BigQuery and Cloud Storage.
- Solid foundation in SQL and Python.
- Experience with ETL tools and techniques.
- Knowledge of data modeling concepts and database design.
- Familiarity with data visualization tools (e.g., Tableau, Looker).
- Understanding of data privacy and security regulations.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and ability to work in a team.
- Experience with version control systems, preferably Git.
- Knowledge of machine learning concepts is a plus.
- Ability to handle multiple tasks in a fast-paced environment.
- A background in Agile methodologies is advantageous.