NOTE: This position is not remote (candidate must be open to hybrid work or relocation to Temple Terrace, FL; onsite 3 days/week).
MUST HAVE: GCP and Big Data Development experience.
Please do not apply if you are not interested in hybrid work or do not have the Must Have skills.
Job Title: GCP Big Data Engineer
Duration: Up to 30 months, i.e., 2.5 years (including extensions)
Client: One of the top Fortune 50 companies.
JOB DESCRIPTION:
Must have strong hands-on Big Data development experience.
The Artificial Intelligence and Data team is looking for BI engineers with expert-level experience in developing enterprise software applications on Google Cloud.
Google Cloud Dataflow Engineer (3-4 Years Experience) Position Summary:
Seeking a skilled Google Cloud Dataflow Engineer with a proven track record in leveraging Google Cloud Platform (GCP) tools to design, build, and maintain scalable and reliable data solutions. The ideal candidate will possess deep expertise in GCP tools, especially Dataflow, BigQuery, and Cloud Composer, and will work collaboratively with cross-functional teams to address data-related technical challenges and enhance our data infrastructure.
Key Responsibilities:
1. Design, build, and deploy scalable and robust data pipelines using Google Cloud Dataflow.
2. Harness the capabilities of BigQuery for data analytics, ensuring optimized performance and cost efficiency.
3. Manage workflow orchestration and automation using Cloud Composer.
4. Monitor, troubleshoot, and optimize data pipelines for performance, ensuring data quality and integrity.
5. Stay updated with GCP's latest features and best practices to ensure the company's data infrastructure remains cutting-edge.
6. Document data architectures, processes, and data lineage for transparency and maintainability.
Qualifications:
1. 3-4 years of experience as a data engineer with significant exposure to the Google Cloud Platform.
2. Strong expertise in GCP tools, particularly Dataflow, BigQuery, and Cloud Composer.
3. Strong GCP Dataflow skills in Java/Python and Apache Beam; lead experience for delivery streams.
4. Experience working with streaming/messaging systems such as Kafka, Pulsar, GCP Pub/Sub, RabbitMQ, and similar tools, including connectors for systems like Cassandra.
5. Familiarity with other GCP services such as Cloud Storage and Dataproc.
6. Strong analytical and problem-solving skills.
7. Familiarity with other cloud platforms (e.g., AWS, Azure) is a plus.
8. Excellent communication skills, both written and verbal.
9. Bachelor's degree in Computer Science, Engineering, or a related field.
Additional Requirements:
1. Demonstrated ability to work in a fast-paced environment, managing multiple projects simultaneously.
2. Commitment to continuous learning and adapting to the rapidly evolving data landscape.
3. Proven ability to collaborate effectively with both technical and non-technical stakeholders.