Responsibilities
1. Data Architecture
Collaborate with data architects to design and develop Snowflake data models and schemas.
Create and maintain a well-structured data warehouse and data lake architecture.
2. Data Integration
Develop ETL (Extract, Transform, Load) processes to ingest data from various sources into Snowflake.
Ensure data integration processes are efficient, reliable, and scalable.
Design and implement data pipelines using Snowflake features such as tasks and streams.
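For illustration, a minimal sketch of a stream-plus-task pipeline of the kind described here; the warehouse, table, and column names are placeholders rather than part of this posting:
  -- Capture changes on a raw table (placeholder names throughout).
  CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;
  -- Scheduled task that loads new rows only when the stream has data.
  CREATE OR REPLACE TASK load_orders_task
    WAREHOUSE = etl_wh
    SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
  AS
    INSERT INTO curated_orders (order_id, amount, loaded_at)
    SELECT order_id, amount, CURRENT_TIMESTAMP()
    FROM raw_orders_stream
    WHERE METADATA$ACTION = 'INSERT';
  -- Tasks are created suspended and must be resumed to run.
  ALTER TASK load_orders_task RESUME;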
3. Performance Optimization
Optimize query performance by creating and maintaining appropriate indexes, materialized views, and clustering keys (see the sketch below).
Identify and resolve performance bottlenecks in data processing.
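As an illustrative sketch of this tuning work (Snowflake leans on clustering keys and materialized views rather than traditional indexes; table and column names are placeholders):
  -- Cluster a large fact table on its most common filter columns.
  ALTER TABLE fact_sales CLUSTER BY (sale_date, region_id);
  -- Pre-aggregate a frequent rollup as a materialized view.
  CREATE OR REPLACE MATERIALIZED VIEW mv_daily_sales AS
    SELECT sale_date, region_id, SUM(amount) AS total_amount
    FROM fact_sales
    GROUP BY sale_date, region_id;
  -- Inspect clustering health for the chosen keys.
  SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_SALES', '(sale_date, region_id)');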
4. SQL Development
Write complex SQL queries, stored procedures, and user-defined functions (UDFs) to support data analytics and reporting needs (example below).
Ensure SQL code follows best practices for readability and performance.
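A minimal, hypothetical example of a scalar SQL UDF and a Snowflake Scripting stored procedure; names and logic are placeholders:
  -- Scalar SQL UDF.
  CREATE OR REPLACE FUNCTION net_amount(gross FLOAT, tax_rate FLOAT)
    RETURNS FLOAT
    AS 'gross / (1 + tax_rate)';
  -- Stored procedure that purges stale staging rows and reports how many were removed.
  CREATE OR REPLACE PROCEDURE purge_stale_orders()
    RETURNS NUMBER
    LANGUAGE SQL
  AS
  $$
  BEGIN
    DELETE FROM staging_orders
      WHERE loaded_at < DATEADD('day', -30, CURRENT_TIMESTAMP());
    RETURN SQLROWCOUNT;
  END;
  $$;
  CALL purge_stale_orders();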
5. Security and Access Control
Implement and manage security measures, including role-based access control (RBAC) and data encryption, to protect sensitive data (sample grants below).
Audit and monitor data access and user activities.
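By way of illustration, grants for a hypothetical read-only analyst role; the role, database, schema, and user names are placeholders:
  CREATE ROLE IF NOT EXISTS analyst_ro;
  GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
  GRANT USAGE ON SCHEMA analytics.curated TO ROLE analyst_ro;
  GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE analyst_ro;
  GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.curated TO ROLE analyst_ro;
  GRANT ROLE analyst_ro TO USER jdoe;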
6. Data Quality Assurance
Define and implement data quality checks and validation processes to maintain data accuracy (see the example query below).
Establish data quality rules and alerts to proactively identify issues.
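A simple example of the kind of validation query such checks might run; the table, columns, and rules are hypothetical:
  SELECT 'null_order_id' AS check_name, COUNT(*) AS failing_rows
    FROM curated_orders WHERE order_id IS NULL
  UNION ALL
  SELECT 'negative_amount', COUNT(*)
    FROM curated_orders WHERE amount < 0
  UNION ALL
  SELECT 'duplicate_order_id', COUNT(*) - COUNT(DISTINCT order_id)
    FROM curated_orders;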
7. Documentation
Create and maintain technical documentation for data models, ETL processes, and data dictionaries.
Document best practices, standards, and guidelines for Snowflake development.
8. Version Control and Deployment
Use version control systems (e.g. Git) for managing Snowflake SQL scripts and objects.
Coordinate the deployment of changes to Snowflake environments.
9. Monitoring and Alerts
Set up monitoring and alerting for Snowflake environments to proactively detect and respond to issues (see the sample query below).
Troubleshoot and resolve incidents related to data processing and performance.
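For example, a query against the SNOWFLAKE.ACCOUNT_USAGE views can surface long-running queries from the past day; the five-minute threshold here is an arbitrary placeholder:
  SELECT query_id, user_name, warehouse_name,
         total_elapsed_time / 1000 AS elapsed_seconds
  FROM snowflake.account_usage.query_history
  WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    AND total_elapsed_time > 300000   -- more than 5 minutes (milliseconds)
  ORDER BY total_elapsed_time DESC;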
10. Backup and Recovery
Implement backup and recovery strategies to ensure data integrity and availability (illustrated below).
Develop and test data recovery procedures.
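Illustrative recovery commands using Snowflake Time Travel and zero-copy cloning; object names and the one-hour offset are placeholders:
  -- Restore a table to its state one hour ago into a new table.
  CREATE OR REPLACE TABLE curated_orders_restored
    CLONE curated_orders AT (OFFSET => -3600);
  -- Recover an accidentally dropped table within the retention window.
  UNDROP TABLE curated_orders;
  -- Zero-copy clone of an entire database as a point-in-time backup.
  CREATE DATABASE analytics_backup CLONE analytics;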
11. Collaboration
Collaborate with data engineers, data scientists, and business analysts to understand data requirements and provide data solutions.
Work with cross-functional teams to support data-related projects and initiatives.
Qualifications
Bachelor's or Master's degree in computer science, data engineering, or a related field.
7 years of experience as a Snowflake developer or data engineer with a focus on data warehousing and ETL.
Snowflake certification(s) is a plus.
Strong SQL skills and proficiency in data modeling and database design.
Knowledge of cloud data warehousing concepts and best practices.
Familiarity with data integration tools and technologies.
Solid understanding of data governance, data security, and compliance requirements.
Experience with version control systems and deployment processes.
Excellent problem-solving and troubleshooting skills.
Strong communication and collaboration abilities.
Ability to work in an Agile or iterative development environment.
Skill - Required / Desired
Snowflake Data Warehouse Design & Development - Required
Cloud Computing, preferably AWS (Azure, GCP) - Required
Data Modeling (Star Schema, Snowflake Schema) - Required
Continuous Integration/Deployment (CI/CD) Tools - Required
Performance Tuning & Optimization - Required
SQL for Data Querying & Manipulation - Required
Data Migration to Cloud Platforms - Required
Data Security & Governance in Cloud Environments - Required
Data Warehousing Best Practices - Required
Automation using Python/SnowSQL - Required
ETL/ELT Processes using Snowflake - Required
Snowflake Stored Procedures, UDFs, and Views - Required
Working with Snowflake Utilities (Snowpipe, Streams, Tasks) - Required
Data Integration (Informatica, Talend, Matillion, DBT preferable) - Highly desired
Bachelor's degree minimum - Required
Full Time