Role Responsibilities:
As a Snowflake Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architecture solutions leveraging Snowflake. Your key responsibilities include:
Data Architecture Design and Implementation
- Design end-to-end data architecture solutions using Snowflake.
- Create scalable, secure, and resilient data models and pipelines.
- Develop strategies for data acquisition, storage, processing, and integration.
- Design data models (e.g., star and snowflake schemas) for optimal performance.
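For illustration only, a minimal sketch of the kind of dimensional model this covers: a tiny star schema created from Python with the snowflake-connector-python package. The account, warehouse, database, and table names are hypothetical placeholders, not details of this role.

```python
# Minimal sketch: declaring a small star schema in Snowflake from Python.
# Connection parameters, database objects, and columns are illustrative
# placeholders only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",       # placeholder account identifier
    user="YOUR_USER",             # placeholder credentials
    password="YOUR_PASSWORD",
    warehouse="ANALYTICS_WH",     # hypothetical virtual warehouse
    database="ANALYTICS_DB",      # hypothetical database and schema
    schema="CORE",
)

cur = conn.cursor()
try:
    # Dimension table: one row per customer.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS dim_customer (
            customer_key   INTEGER,
            customer_name  STRING,
            customer_email STRING,
            region         STRING
        )
    """)
    # Fact table: one row per order, keyed to the dimension.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS fact_orders (
            order_id     INTEGER,
            customer_key INTEGER,      -- joins to dim_customer
            order_date   DATE,
            amount       NUMBER(12, 2)
        )
    """)
finally:
    cur.close()
    conn.close()
```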
Data Warehousing and Pipeline Optimization
- Implement data warehousing best practices, including data partitioning, clustering, and indexing.
- Design and develop ETL/ELT workflows for data ingestion into Snowflake (see the ingestion sketch after this list).
- Optimize ETL/ELT processes for scalability and performance.
- Collaborate with data engineers to automate and orchestrate data pipelines.
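As a flavor of the ingestion work mentioned above, a minimal ELT sketch: bulk-loading files from a named stage into a raw table with COPY INTO, again via snowflake-connector-python. The stage, table, and connection details are hypothetical placeholders.

```python
# Minimal sketch of an ELT ingestion step: load staged CSV files into a raw
# table with COPY INTO. Stage, table, and connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="LOAD_WH",          # hypothetical warehouse sized for loading
    database="ANALYTICS_DB",
    schema="RAW",
)

cur = conn.cursor()
try:
    # Bulk-load CSV files previously uploaded to a named stage.
    cur.execute("""
        COPY INTO raw_orders
        FROM @orders_stage/daily/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Each result row summarizes one loaded file and its load status.
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```

In practice a step like this would be scheduled and monitored by an orchestrator rather than run by hand.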
System Performance and Optimization
- Monitor and tune Snowflake performance for efficient query execution and resource utilization.
- Implement query optimization techniques, caching, and workload management.
- Conduct regular audits and troubleshooting to maintain system performance, as sketched below.
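A minimal sketch of the sort of routine audit this involves: defining a clustering key on a large fact table and checking clustering depth with SYSTEM$CLUSTERING_INFORMATION. Object names and connection details are hypothetical placeholders.

```python
# Minimal sketch of a performance audit: set a clustering key on a large table
# and inspect how well its micro-partitions align with that key. Object names
# and connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="ADMIN_WH",
    database="ANALYTICS_DB",
    schema="CORE",
)

cur = conn.cursor()
try:
    # Cluster the fact table on the column most queries filter by.
    cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date)")

    # Report clustering depth and partition overlap as a JSON summary.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders', '(order_date)')"
    )
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```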
Data Security and Governance
- Implement data security measures, including access controls, encryption, and data masking (see the masking sketch after this list).
- Ensure compliance with data governance policies and regulatory requirements.
- Develop and enforce data governance best practices within the Snowflake environment.
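For illustration, a minimal sketch of column-level security: a dynamic data masking policy that hides email addresses from every role except a hypothetical PII_ADMIN role. The role, table, and connection details are placeholders.

```python
# Minimal sketch of dynamic data masking: create a masking policy and attach
# it to a column so only a privileged role sees raw values. The role, table,
# and connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="ADMIN_WH",
    database="ANALYTICS_DB",
    schema="CORE",
)

cur = conn.cursor()
try:
    # Mask email addresses for everyone except the (hypothetical) PII_ADMIN role.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
        RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END
    """)
    # Attach the policy to the column it protects.
    cur.execute("""
        ALTER TABLE dim_customer
        MODIFY COLUMN customer_email SET MASKING POLICY email_mask
    """)
finally:
    cur.close()
    conn.close()
```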
Leadership and Collaboration
- Provide technical leadership and mentorship to junior architects and data engineers.
- Document architectural designs, data flows, and processes for knowledge sharing.
- Stay updated with Snowflake advancements and industry trends to recommend improvements.
- Lead initiatives to enhance data platform capabilities and performance.
Role Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- At least 7 years of proven experience in data architecture, data warehousing, and data engineering.
- Strong expertise in Snowflake and its ecosystem.
- Experience with ETL/ELT workflows, Azure Durable Functions, and data pipeline automation.
- Deep understanding of data modeling, query optimization, and performance tuning.
- Familiarity with data governance, compliance standards, and security best practices.
- Excellent problem-solving skills and ability to work collaboratively across teams.
Why Join Us?
- Work with a cutting-edge data platform to solve complex business challenges.
- Collaborate with a talented and passionate team of data professionals.
- Opportunity to influence and drive innovation in data architecture and engineering.