Role: Big Data Developer
Term: 6-month contract-to-hire (CTH)
Location: Hybrid, 3x/week on site (must be local to Dallas, TX; Hartford, CT; Buffalo Grove, IL; New York, NY; Woonsocket, RI; or Scottsdale, AZ)
Required Skills
- 6+ years of Big Data experience using tools on Hadoop or GCP platforms.
- 2+ years of experience with BigQuery and Python. Knowledge of Tableau, Elasticsearch, or MongoDB Atlas is preferred.
Job Description
- A Sr. Big Data Developer is responsible for designing, implementing, and optimizing large-scale data solutions using Google BigQuery and Python.
Key responsibilities include:
- Rewriting code from BigIntegrate to BigQuery as part of the Hadoop 2.6-to-GCP migration
- Query optimization: optimizing queries for performance and cost efficiency within BigQuery
- Pipeline management: building data pipelines to move and transform data between systems
- Implementing and maintaining data security measures within BigQuery to meet compliance and contractual data-privacy requirements
- Supporting the Elasticsearch-to-MongoDB Atlas migration