Job Title: Lead GCP Data Engineer
Client: Confidential
Location: Seattle, WA; San Francisco, CA; New York City, NY; Boston, MA
No. of positions: 5
Job Description:
We are seeking a Lead GCP Data Engineer with 9 to 12 years of experience to join our team. As a Lead GCP Data Engineer, you will play a pivotal role in designing, implementing, and managing efficient data pipelines and solutions on Google Cloud Platform (GCP). You will work closely with clients to understand their requirements and implement solutions using best practices and industry standards.
Roles and Responsibilities:
- Collect and integrate data from various sources using efficient data pipeline design and implementation techniques.
- Utilize GCP services such as BigQuery, Dataflow, and Cloud Storage to store, manage, and process data efficiently, ensuring data quality and security.
- Apply proficiency in SQL and Python for data manipulation and analysis within the GCP ecosystem.
- Transform data per end-user requirements using efficient ETL processes and algorithms.
- Leverage big data technologies and distributed processing frameworks to process and analyze massive amounts of data effectively.
- Design and build scalable, cost-effective data solutions on cloud platforms, optimizing resource utilization and performance.
- Work independently with clients, effectively communicating requirements and implementing solutions that meet their needs.
- Serve as an individual contributor while providing leadership and guidance to junior team members.
- Demonstrate strong communication and collaboration skills, working effectively with cross-functional teams.
- Adapt to agile methodologies and contribute to agile practices within the team.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred.
- 9 to 12 years of experience in data engineering with a focus on Google Cloud Platform (GCP) technologies.
- Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and related tools.
- Strong knowledge of data pipeline design and implementation, ETL processes, and data transformation algorithms.
- Experience with big data technologies and distributed processing frameworks (e.g., Hadoop, Spark).
- Excellent programming skills in SQL and Python for data manipulation and analysis.
- Demonstrated ability to work independently with clients, effectively communicating requirements and implementing solutions.
- Strong communication and collaboration skills with the ability to work effectively in a team environment.
- Familiarity with agile methodologies and practices.
Data Engineering, Google Cloud Platform