Job Title: Snowflake Data Engineer
About the Role:
The Snowflake Data Engineer will be a key member of the CAESAR (Crum Aggregate Enterprise
Store for Analytics & Reporting) team, responsible for building, optimizing, and managing data
pipelines in the Snowflake cloud data platform. The ideal candidate will work closely with data
analysts, stakeholders, and developers to design and implement scalable data solutions that
support Crum & Forster's growing enterprise data needs.
Key Responsibilities:
Data Pipeline Development: Design, build, and maintain efficient, scalable ETL/ELT
processes using Snowflake, integrating data from various sources into the CAESAR data
repository.
Snowflake Data Management: Set up and optimize Snowflake environments, ensuring
best practices in performance, security, and cost management.
Data Modeling: Develop and maintain data models (e.g., star and snowflake schemas) that
align with business reporting and analytics needs.
Data Integration: Collaborate with the CAESAR team to integrate data from multiple
internal and external systems, ensuring data consistency, accuracy, and availability.
Performance Tuning: Monitor, troubleshoot, and optimize data load times and query
performance in Snowflake.
Collaboration: Work closely with stakeholders and the CAESAR team to gather
requirements, understand data needs, and deliver insights via dashboards, reports, and
data models.
Data Governance and Security: Implement best practices for data governance,
including access control, encryption, and compliance with data privacy regulations.
Requirements
Location: Chicago/New Jersey
Type: Full Time
Qualifications:
Experience:
o 3+ years of experience in data engineering, with a focus on cloud-based data
warehousing.
o Hands-on experience working with Snowflake, including managing Snowflake
environments, writing complex SQL, and optimizing performance.
o Experience with ETL/ELT tools (e.g., Matillion, Talend, dbt) and data pipeline
automation.
Technical Skills:
o Proficiency in SQL and SnowSQL, and working knowledge of programming
languages (e.g., Python, Java) for data processing.
o Familiarity with cloud platforms like AWS, Azure, or GCP.
o Experience with data modeling techniques (star schema, snowflake schema)
and database design.
Data Tools: Experience with BI tools such as Tableau, Power BI, or similar platforms.
Soft Skills: Strong communication and collaboration skills; ability to work with cross-functional teams.
Preferred Qualifications:
Certifications: Snowflake certification or relevant cloud data certifications (AWS,
Azure).
Industry Knowledge: Experience working in the insurance or financial services industry
is a plus.
Agile Methodology: Familiarity with Agile development processes and tools (e.g., Jira).