Role : Senior Data Engineer
Location : Glendale, CA (Hybrid Onsite, 3-4 days per week)
Type : Contract
Must Haves:
- Airflow and Spark
- Snowflake or Databricks
- Data Modeling
Required Skills:
- 5 years of data engineering experience developing large data pipelines
- Proficiency in at least one major programming language (e.g., Python, Java, Scala)
- Strong SQL skills and the ability to write queries that analyze complex datasets
- Hands-on production experience with distributed processing systems such as Spark
- Hands-on production experience creating and maintaining pipelines with orchestration systems such as Airflow
- Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
- Experience in developing APIs with GraphQL
- Deep understanding of AWS or another cloud provider, as well as infrastructure as code
- Familiarity with data modeling techniques and data warehousing standard methodologies and practices
- Strong algorithmic problem-solving expertise
Preferred Qualifications:
- Previous experience working with Power BI or another analytics visualization tool