As a Data Lake Engineer - Snowflake, you will:
- Design and architect data lakes on Snowflake to manage vast amounts of structured and semi-structured data efficiently.
- Develop and maintain data models, schemas, and best practices for optimal data storage and retrieval.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and create appropriate data lake solutions.
- Write optimized SQL queries and stored procedures for data ingestion, transformation, and analysis.
- Develop and maintain data pipelines to load and process data from various sources (e.g., databases, APIs, files); a minimal Snowflake SQL sketch of this pattern follows this list.
- Implement data quality checks and validation processes to ensure data accuracy and consistency.
- Create and manage user roles, permissions, and security settings within Snowflake.
- Monitor and optimize the performance and scalability of the Snowflake data lake.
- Troubleshoot and resolve data loading and processing issues effectively.
- Implement data governance and compliance policies to ensure data security and privacy.
- Communicate technical concepts and data insights to non-technical stakeholders, and contribute to the development and maintenance of data documentation.
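
To make the ingestion and querying responsibilities above concrete (see the pipelines bullet), here is a minimal Snowflake SQL sketch of one common data lake pattern: landing semi-structured JSON in a VARIANT column, loading it from a stage with COPY INTO, and querying nested fields. All object and field names (raw_events, my_json_stage, user_id, items) are hypothetical placeholders, not specifics of this role.

```sql
-- Hypothetical landing table: one VARIANT column holds each raw JSON document.
CREATE TABLE IF NOT EXISTS raw_events (
    payload VARIANT
);

-- Load JSON files from a (hypothetical) stage into the landing table.
COPY INTO raw_events
FROM @my_json_stage
FILE_FORMAT = (TYPE = 'JSON');

-- Query nested fields with path notation; FLATTEN expands arrays into rows.
SELECT
    payload:user_id::STRING AS user_id,
    f.value:name::STRING    AS item_name
FROM raw_events,
     LATERAL FLATTEN(INPUT => payload:items) f;
```

In a real pipeline the COPY step would typically be scheduled as a task or driven by Snowpipe, with downstream views or tables materializing the curated model.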
What You Bring to the Table:
- 6-12 years of experience in data engineering, with strong expertise in data lake development using Snowflake.
- Proficiency in SQL and Snowflake SQL, and experience with data loading and transformation techniques (e.g., ETL, ELT).
- In-depth understanding of data warehousing and data lake concepts.
- Experience with data modeling and schema design.
- Familiarity with Python or other scripting languages for data manipulation and automation.
- Strong experience with cloud platforms (AWS, Azure, GCP) is a plus.
- Excellent communication and interpersonal skills, with the ability to work collaboratively in a team environment.
- Strong problem-solving and analytical skills, with the ability to diagnose and resolve data-related issues.
You should possess the ability to:
- Collaborate with various stakeholders to understand their data needs and design solutions that meet those needs.
- Effectively communicate technical concepts and data insights to a non-technical audience.
- Work independently and manage tasks within a team-oriented environment.
- Troubleshoot and optimize data pipelines and Snowflake environments for better performance and scalability.
- Design and implement data governance policies to ensure compliance with security and privacy standards.
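
As one illustration of the governance and access-control points above, the sketch below shows hedged examples of Snowflake role-based access control and a column masking policy. Role, schema, table, and user names (analyst_role, analytics.curated, customers, jane_doe) are hypothetical.

```sql
-- Hypothetical read-only role with usage on a curated schema.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.curated TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jane_doe;

-- Hypothetical masking policy: only privileged roles see raw email addresses.
CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '*** MASKED ***' END;

ALTER TABLE analytics.curated.customers
    MODIFY COLUMN email SET MASKING POLICY email_mask;
```

Note that masking policies are an Enterprise Edition feature; on lower editions a similar effect is usually achieved with secure views.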
What We Bring to the Table:
- An opportunity to work on cutting-edge data technologies and architectures.
- A dynamic and collaborative work environment with a focus on innovation.
- A chance to contribute to impactful projects and shape data infrastructure at scale.
- Exposure to diverse cloud platforms and cutting-edge data tools.