What you'll do:
- You'll develop a strong understanding of business requirements, working with business users to define data and reporting requirements.
- You'll work with Data ETL Engineers to design and develop the Enterprise Data Warehouse and Data Marts.
- You'll develop ETL data pipelines to build Enterprise Data Models for Property, Agent, Broker, Office, and other master entities.
- You'll work with the EDW team to develop ETLs for Facts & Dimensions.
- You'll be responsible for designing and developing report datasets using Snowflake.
- You'll be responsible for creating integration patterns/frameworks from the Data Lake to the Data Warehouse, and from the Data Warehouse to Reports.
- You'll be responsible for ensuring data quality and data integrity in the Data Warehouse and Reporting Platform.
EDUCATION AND EXPERIENCE/SPECIAL SKILLS/TECHNOLOGIES/TOOLS REQUIREMENTS
Bachelor's degree in Computer Science, Engineering, or a related technical discipline, or an equivalent combination of training and experience.
- 5+ years' experience in SQL (expert level)
- 5+ years' experience building ETLs using SQL and Python
- 3+ years' experience designing Enterprise Data Warehouses on a cloud platform such as Snowflake
- 5+ years' experience in the design, development, and implementation of Enterprise Data Warehouses using traditional tools
- 5+ years' experience developing Enterprise Data Models with at least one enterprise modeling tool and data integration platform (Erwin, Embarcadero, Informatica, Talend, SSIS, DataStage, Pentaho); Embarcadero is preferred
- 5+ years' experience in the design, development, and implementation of ETL pipelines to load data into a Data Warehouse, including performance tuning of ETL and reporting SQL queries by maintaining aggregates, compression, partitions, and query plans
- Excellent written and verbal communication skills in English