We are seeking a talented and experienced Data Engineer with GCP expertise to join our innovative team. In this role, you will be responsible for designing, implementing, and maintaining scalable data solutions across cloud environments.
- Architect and develop robust, scalable data pipelines and ETL processes in cloud environments
- Optimize data storage and retrieval systems for improved performance and cost efficiency
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions
- Implement and maintain data quality controls and monitoring systems
- Design and develop APIs for data integration and accessibility
- Troubleshoot and resolve complex data engineering issues in distributed systems
- Stay up to date with emerging technologies and best practices in cloud data engineering
- Contribute to the development of data governance policies and procedures
Qualifications:
- Bachelor's degree in Computer Science or a related field with 7 years of data engineering experience
- Strong proficiency in GCP big data technologies (e.g., Hadoop, Spark) and advanced programming skills (Python, Java, or Scala)
- Expertise in data warehousing, ETL tools, and both SQL and NoSQL databases
- Experience with distributed systems, microservices architecture, and implementing scalable data pipelines
- Strong data modeling, optimization, and problem-solving skills
- Knowledge of data privacy, security regulations, and governance best practices
- Plus: knowledge of Terraform
- Nice to have: knowledge of Azure
- Proficiency in English (written and spoken)
Additional Information:
The Devoteam Group works for equal opportunities, promoting its employees based on merit and actively fighting against all forms of discrimination. We are convinced that diversity contributes to the creativity, dynamism, and excellence of our organization. All of our vacancies are open to people with disabilities.
Remote Work:
Yes
Employment Type:
Full-time