Do you love a career where you Experience, Grow & Contribute at the same time, while earning at least 10% above the market? If so, we are excited to have come across you.
If you are a Snowflake SME / Senior Snowflake Developer looking for excitement, challenge, and stability in your work, then you will be glad you came across this page.
We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
Check if you are up for maximizing your earning/growth potential by leveraging our Disruptive Talent Solution.
Role: Snowflake SME / Senior Snowflake Developer
Location: Hyderabad/Bangalore
Hybrid Mode Position
Exp: 8-10 years
Requirements
We are seeking an experienced Senior Snowflake Developer (SME) to join our data engineering team. The ideal candidate will have 8-10 years of hands-on experience in data warehousing and ETL processes, with in-depth knowledge of the Snowflake cloud data platform. This role requires a strong background in data transformation, SQL programming, and the design and optimization of complex data models to support business analytics and reporting needs. You will play a key role in designing, implementing, and maintaining data solutions to ensure scalability, performance, and data quality.
Key Responsibilities:
Snowflake Data Model Design:
- Design, develop, and optimize scalable Snowflake data models to meet our organization's data transformation and analytics needs.
- Implement best practices for Snowflake architecture and optimize data warehouse performance to ensure efficient query execution.
- Develop and maintain Snowflake solutions by utilizing Snowflake features like Time Travel, Cloning, and Virtual Warehouses (a brief illustrative sketch follows this list).
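For a flavor of this work, here is a minimal sketch using the Snowflake Python connector to issue a zero-copy clone, a Time Travel query, and a warehouse resize. The account, credentials, table, and warehouse names are placeholders for illustration, not project specifics.

```python
# Illustrative only: Time Travel, zero-copy cloning, and a virtual-warehouse
# command issued through the Snowflake Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder account identifier
    user="etl_user",             # placeholder credentials
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone of a table for safe experimentation
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# Time Travel: query the table as it looked one hour ago
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone())

# Resize a virtual warehouse for a heavier workload
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'MEDIUM'")

cur.close()
conn.close()
```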
SQL Programming:
- Write efficient, scalable, and complex SQL queries and stored procedures to extract, manipulate, and transform data within Snowflake (see the example after this list).
- Develop and optimize queries for complex data transformations, ensuring efficient performance across large datasets.
- Collaborate with analytics teams to create data pipelines that support various reporting and analytics needs.
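As an example of the SQL involved (illustrative only; the orders table and its columns are assumed for the sketch), a windowed CTE that picks each customer's most recent order, run through the Snowflake Python connector:

```python
# Illustrative "latest order per customer" query using a CTE and a window
# function; table and column names are assumptions for the example.
import snowflake.connector

QUERY = """
WITH ranked_orders AS (
    SELECT
        customer_id,
        order_date,
        amount,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY order_date DESC
        ) AS recency_rank
    FROM orders
)
SELECT customer_id, order_date, amount
FROM ranked_orders
WHERE recency_rank = 1   -- most recent order per customer
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
)
try:
    for row in conn.cursor().execute(QUERY):
        print(row)
finally:
    conn.close()
```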
ETL Pipeline Development:
- Design and implement robust ETL (Extract, Transform, Load) pipelines to load structured and unstructured data into Snowflake from multiple sources.
- Maintain and enhance existing ETL workflows, ensuring data integrity, accuracy, and timely data delivery.
- Collaborate with data engineering and analytics teams to ensure data transformation processes meet business requirements.
Data Warehousing & Best Practices:
- Apply advanced data warehousing concepts such as partitioning, indexing, and data optimization techniques.
- Implement data modeling techniques like star schema, snowflake schema, and fact and dimension tables to ensure scalable and maintainable data solutions (a schema sketch follows this list).
- Perform regular performance tuning and optimization of Snowflake tables, queries, and scripts to ensure efficient resource usage.
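A hedged sketch of such a model: one fact table and two dimensions, with a clustering key as Snowflake's rough analogue of the partitioning/indexing concepts above. All object and column names are illustrative.

```python
# Minimal star-schema DDL (illustrative names) issued via the Python connector.
import snowflake.connector

DDL_STATEMENTS = [
    """
    CREATE OR REPLACE TABLE dim_customer (
        customer_key   INTEGER,
        customer_name  STRING,
        region         STRING
    )
    """,
    """
    CREATE OR REPLACE TABLE dim_date (
        date_key        DATE,
        calendar_year   INTEGER,
        calendar_month  INTEGER
    )
    """,
    """
    CREATE OR REPLACE TABLE fact_sales (
        customer_key  INTEGER,
        date_key      DATE,
        amount        NUMBER(12, 2)
    )
    CLUSTER BY (date_key)   -- helps prune micro-partitions on date filters
    """,
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
)
cur = conn.cursor()
for ddl in DDL_STATEMENTS:
    cur.execute(ddl)
conn.close()
```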
Python Scripting for Data Workflows:
- Write, debug, and maintain Python scripts (including libraries such as Pandas, NumPy, and Matplotlib) for automating data workflows and supporting ETL pipelines (a minimal example follows this list).
- Leverage Python for integrating various data sources and performing advanced data analysis and visualization.
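A minimal, hypothetical example of such a workflow: clean a CSV extract with pandas and bulk-load it into an existing Snowflake staging table with write_pandas. The file path, column names, staging table, and connection details are placeholders.

```python
# Hypothetical extract-transform-load step: pandas cleanup, then bulk load
# into an existing staging table via write_pandas.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: read a raw daily export (placeholder file and columns)
df = pd.read_csv("daily_orders.csv", parse_dates=["ORDER_DATE"])

# Transform: drop incomplete rows and derive a revenue column
df = df.dropna(subset=["CUSTOMER_ID", "QUANTITY", "UNIT_PRICE"])
df["REVENUE"] = df["QUANTITY"] * df["UNIT_PRICE"]

# Load: append the cleaned frame to a staging table that already exists
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="STAGING",
)
success, _, nrows, _ = write_pandas(conn, df, "ORDERS_STG")
print(f"loaded={success} rows={nrows}")
conn.close()
```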
Data Governance & Security:
- Implement data governance best practices, including data security, privacy, and compliance with regulatory policies.
- Ensure proper data encryption, access control, and audit trails in Snowflake to safeguard sensitive data (see the illustration after this list).
- Collaborate with security teams to address data privacy concerns and ensure compliance with industry standards like GDPR and HIPAA.
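For context, a small illustration (not a prescribed implementation) of column-level controls in Snowflake: a dynamic data-masking policy plus a role grant. The role, table, and column names are assumptions for the example.

```python
# Illustrative access-control statements executed via the Python connector.
import snowflake.connector

STATEMENTS = [
    # Mask email addresses for everyone except a privileged role
    """
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Least-privilege read access for the reporting role
    "GRANT SELECT ON TABLE customers TO ROLE REPORTING_ROLE",
]

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***",
    warehouse="ADMIN_WH", database="SALES_DB", schema="PUBLIC",
)
cur = conn.cursor()
for stmt in STATEMENTS:
    cur.execute(stmt)
conn.close()
```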
Business Requirements Translation:
- Work closely with business analysts and stakeholders to translate complex business requirements into technical specifications and data models.
- Understand and document business processes, data requirements, and reporting needs, ensuring that Snowflake solutions align with business goals.
Collaboration & Communication:
- Collaborate effectively with cross-functional teams, including data analysts, data scientists, business intelligence teams, and developers, to ensure seamless data operations.
- Provide technical mentorship to junior engineers, sharing best practices and knowledge around Snowflake and data warehousing solutions.
- Communicate clearly and effectively with both technical and non-technical stakeholders, explaining complex technical concepts in simple terms.
Required Skills & Qualifications:
- Snowflake Expertise: Extensive hands-on experience with the Snowflake cloud data platform, including data modeling, optimization, and performance tuning.
- SQL Proficiency: Advanced SQL programming skills, with experience in writing complex queries and stored procedures and in optimizing SQL performance.
- ETL Development: Proven experience in designing, implementing, and maintaining ETL pipelines using Snowflake and other data integration tools.
- Data Warehousing Knowledge: Strong understanding of data warehousing concepts, including data modeling techniques like star schema and snowflake schema.
- Python Skills: Proficiency in Python programming, including libraries such as Pandas, NumPy, and Matplotlib, to support data workflows and automation.
- Data Governance & Security: Familiarity with data governance practices, security policies, and regulatory compliance related to data privacy.
- Analytical & Problem-Solving Skills: Strong analytical skills, with the ability to translate complex business requirements into technical specifications and efficient data solutions.
- Communication & Collaboration: Excellent communication skills and the ability to work collaboratively with cross-functional teams in a fast-paced environment.
Preferred Qualifications:
- Experience with ETL Tools: Hands-on experience with ETL tools such as Talend, Informatica, or Apache NiFi for data integration.
- Cloud Platforms: Experience with cloud platforms like AWS, Azure, or Google Cloud, especially their data-related services (e.g., AWS Redshift, Azure Synapse).
- CI/CD & DevOps Practices: Knowledge of integrating Snowflake solutions into CI/CD pipelines and working within DevOps environments.
- Data Visualization Tools: Familiarity with data visualization tools such as Tableau, Looker, or Power BI for delivering insights and reports to stakeholders.