Job Title: Big Data/PySpark Engineer
Location: Zurich, Switzerland
Job Description:
- 10 years of IT experience with in-depth knowledge of big data technologies
- Conceptualize and design big data systems involving Hadoop, PySpark, and Hive
- Extensive experience with Python and PySpark programming
- Experience building complex data pipelines using Spark, especially PySpark
- Experience in performance tuning Spark applications
- Experience scheduling and running Spark applications
- Experience with Airflow, Databricks, and Azure is a plus
- Design functional and technical architectures
- Develop applications on the big data platform using open-source programming languages
- Hands-on experience with fundamental Hadoop tools and technologies
- Work closely with Administrators, Architects, and Developers
- Knowledge of cluster management and storage mechanisms in big data cloud environments
Essential Skills:
- Strong analytical skills
- Strong framework design skills
- Strong code standardization skills