10 yrs: Extensive working knowledge and expertise in Spark (Scala or PySpark) and Hive.
Experience in design, development, and performance tuning in Spark.
Strong programming skills in Java, Scala, or Python.
Familiarity with big data processing tools and techniques.
Experience with the Hadoop ecosystem.
Good understanding of distributed systems.
Should have working knowledge of RDBMS, data warehouses, and Unix shell scripting.
Excellent analytical and problem-solving skills. Familiarity with tools like Airflow and Control-M.
Remote Work: No
Full Time