Job Title: Enterprise Data Warehouse Scalable Engineer
Location: Princeton, NJ (Hybrid)
Duration: 6 Months
Job Description:
We are seeking a skilled Enterprise Data Warehouse Scalable Engineer with expertise in Hadoop and Python to join our team. The ideal candidate will have the following qualifications:
Required Skills and Experience:
- Over 5 years of experience with DBMS/RDBMS and ETL methodologies.
- Experience in designing and implementing automated scalable architectures in an enterprise environment.
- Proficiency in SQL, with strong database design knowledge and experience working with large-scale data volumes.
- Programming expertise in Python and PySpark.
- Familiarity with the Hadoop ecosystem (HDFS, Spark, Oozie).
- Strong understanding of data warehousing principles, ETL processes, and dimensional data modeling.
- Excellent problem-solving and troubleshooting skills.
- Knowledge of Apache Airflow is a plus.
- Bachelor's, Master's, or PhD in Computer Science, Engineering, or a related technology field.
Preferred Skills:
- Experience with MPP (Massively Parallel Processing) systems.
- Familiarity with streaming technologies such as Kafka.