Do you want a career where you experience, grow, and contribute at the same time, while earning at least 10% above the market? If so, we are excited to have crossed paths with you.
If you are a Senior Data Designer looking for excitement, challenge, and stability in your work, then you will be glad you came across this page.
We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
Check whether you are up for maximizing your earning and growth potential by leveraging our Disruptive Talent Solution.
Role: Senior Data Designer (Snowflake)
Location: Hyderabad/Bangalore/Gurgaon/Mumbai/Pune
Hybrid Mode Position
Experience: 8 years
Requirements
We are seeking a highly skilled and experienced Senior Data Designer with expertise in Snowflake and data modeling. In this role, you will be responsible for designing and implementing robust, scalable, and efficient data models that support business analytics, reporting, and data-driven decision-making. You will work closely with cross-functional teams, including data engineers, analysts, and business stakeholders, to ensure optimal data architecture, governance, and performance across the organization's Snowflake environment.
Key Responsibilities:
Data Architecture and Modeling:
- Design and develop efficient and scalable data models (conceptual, logical, and physical) within Snowflake to support a variety of analytics and business intelligence (BI) use cases.
- Define, implement, and maintain data structures, including fact and dimension tables, ensuring adherence to best practices in normalization and denormalization strategies (see the sketch after this list).
- Optimize the Snowflake environment for performance by applying techniques such as clustering, partitioning, and caching, ensuring low-latency querying.
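For candidates unfamiliar with the modeling style this role calls for, below is a minimal sketch of a star-schema pair (one dimension, one fact table) with a clustering key, driven from Python via snowflake-connector-python. All table, column, warehouse, and credential names are illustrative placeholders, not this project's actual schema.

```python
# Minimal star-schema sketch in Snowflake; all names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",    # placeholder credentials
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="ANALYTICS_WH",  # placeholder virtual warehouse
    database="ANALYTICS_DB",
    schema="MODEL",
)
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key.
cur.execute("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_sk   NUMBER AUTOINCREMENT PRIMARY KEY,
        customer_id   VARCHAR NOT NULL,
        customer_name VARCHAR,
        region        VARCHAR
    )
""")

# Fact table: measures plus foreign keys to dimensions. Clustering on the
# date key helps Snowflake prune micro-partitions on date-range scans.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        sale_date_key DATE   NOT NULL,
        customer_sk   NUMBER NOT NULL,
        quantity      NUMBER,
        amount        NUMBER(18, 2)
    )
    CLUSTER BY (sale_date_key)
""")

cur.close()
conn.close()
```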
Collaboration and Stakeholder Engagement:
- Work closely with business analysts, data engineers, and data scientists to gather requirements, translate business needs into technical specifications, and ensure alignment of data architecture with business goals.
- Collaborate with ETL/ELT developers to define and optimize data pipelines and workflows, ensuring data is ingested, transformed, and stored efficiently.
- Engage with business stakeholders to ensure the data models meet the evolving needs of the organization and are adaptable to new use cases and data sources.
Data Governance and Quality:
- Implement and enforce data governance principles, ensuring data accuracy, consistency, and integrity across the Snowflake platform.
- Define and document data standards, metadata, and data dictionaries, ensuring alignment with organizational policies and compliance standards.
- Monitor data quality and implement solutions for data cleansing, deduplication, and enrichment to ensure high data integrity (see the sketch after this list).
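As one illustration of the deduplication work above, the sketch below uses Snowflake's QUALIFY clause with ROW_NUMBER() to keep only the most recent row per business key. The staging table, column names (including load_ts), and connection parameters are assumed for illustration.

```python
# Hypothetical deduplication pattern: keep the latest row per customer_id.
import snowflake.connector

conn = snowflake.connector.connect(account="YOUR_ACCOUNT", user="YOUR_USER",
                                   password="YOUR_PASSWORD",
                                   database="ANALYTICS_DB", schema="MODEL")
cur = conn.cursor()

# QUALIFY filters directly on the window function, so duplicates can be
# removed in a single statement without a nested subquery.
cur.execute("""
    CREATE OR REPLACE TABLE stg_customer_dedup AS
    SELECT *
    FROM stg_customer
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY customer_id
        ORDER BY load_ts DESC
    ) = 1
""")

cur.close()
conn.close()
```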
Performance Tuning and Optimization:
- Perform ongoing performance tuning of Snowflake data models, ensuring efficient querying, optimized storage, and minimal resource usage.
- Identify and resolve performance bottlenecks in data models, ETL/ELT processes, and Snowflake configurations.
- Apply best practices for workload management in Snowflake, including clustering, micro-partitioning, and caching strategies (see the sketch after this list).
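The sketch below shows one way to check clustering health before tuning, using Snowflake's SYSTEM$CLUSTERING_INFORMATION function, which returns micro-partition depth and overlap statistics as JSON. Table, key, and connection names are placeholders.

```python
# Hypothetical clustering-health check before a tuning pass.
import json
import snowflake.connector

conn = snowflake.connector.connect(account="YOUR_ACCOUNT", user="YOUR_USER",
                                   password="YOUR_PASSWORD",
                                   database="ANALYTICS_DB", schema="MODEL")
cur = conn.cursor()

cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date_key)')"
)
info = json.loads(cur.fetchone()[0])

# A high average depth means many overlapping micro-partitions and poor
# pruning; that is a signal to re-cluster or reconsider the clustering key.
print("average clustering depth:", info.get("average_depth"))

cur.close()
conn.close()
```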
Technology and Tools:
- Leverage Snowflake features such as Time Travel, Zero-Copy Cloning, Streams, and Tasks to build efficient and innovative data solutions (see the sketch after this list).
- Work with data ingestion tools such as Kafka, Fivetran, DBT, and others to automate data pipelines and ensure real-time data flow.
- Ensure seamless integration between Snowflake and BI tools (e.g., Tableau, Power BI) for real-time reporting and dashboards.
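To make the first bullet concrete, here is a minimal sketch of two of the named features: a Time Travel read at a one-hour offset, and a Zero-Copy Clone for testing against production-shaped data. Table names, the offset, and the connection parameters are illustrative.

```python
# Hypothetical use of Snowflake Time Travel and Zero-Copy Cloning.
import snowflake.connector

conn = snowflake.connector.connect(account="YOUR_ACCOUNT", user="YOUR_USER",
                                   password="YOUR_PASSWORD",
                                   database="ANALYTICS_DB", schema="MODEL")
cur = conn.cursor()

# Time Travel: read the table as it looked one hour ago (offset in seconds).
cur.execute("SELECT COUNT(*) FROM fact_sales AT(OFFSET => -3600)")
print("rows one hour ago:", cur.fetchone()[0])

# Zero-Copy Clone: an instant snapshot that shares storage with the source,
# e.g. for validating a model change without duplicating data.
cur.execute("CREATE TABLE fact_sales_dev CLONE fact_sales")

cur.close()
conn.close()
```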
Required Skills & Qualifications:
- 5+ years of hands-on experience in data modeling, data architecture, and Snowflake development.
- Strong expertise in Snowflake architecture, including features such as virtual warehouses, clustering keys, query performance optimization, and security configurations.
- Extensive experience in designing and implementing Star Schema, Snowflake Schema, and 3NF data models for enterprise data warehouses.
- Proficiency in SQL and query optimization techniques, with the ability to design and optimize complex queries.
- Hands-on experience with ETL/ELT tools (e.g., Informatica, DBT, Fivetran) for data extraction, transformation, and loading.
- Knowledge of data governance frameworks, including experience with metadata management, data lineage, and data quality assurance.
- Understanding of cloud-based data warehousing and analytics tools, with a focus on AWS, Azure, or GCP.
- Experience in Agile/Scrum methodologies and the ability to work in fast-paced, collaborative environments.
Preferred Qualifications:
- Snowflake certification (SnowPro Core/Advanced) is highly desirable.
- Experience with big data technologies such as Apache Kafka, Apache Airflow, or Databricks.
- Familiarity with data security and privacy standards and regulatory requirements such as GDPR or CCPA.
- Proficiency in Python or JavaScript for scripting and automation of data processes (see the sketch after this list).
- Experience with real-time data processing tools and techniques.
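As a flavor of the Python scripting this qualification implies, below is a minimal load-automation sketch that stages a local CSV to a table's internal stage and bulk-loads it with COPY INTO. The file path, table name, and credentials are placeholders, and the CSV is assumed to match the table's columns.

```python
# Hypothetical load-automation script: stage a CSV, then bulk-load it.
import snowflake.connector

conn = snowflake.connector.connect(account="YOUR_ACCOUNT", user="YOUR_USER",
                                   password="YOUR_PASSWORD",
                                   database="ANALYTICS_DB", schema="MODEL")
cur = conn.cursor()

# Upload to the table's internal stage (@%); PUT gzip-compresses by default.
cur.execute("PUT file:///tmp/sales.csv @%fact_sales")

# Bulk-load everything staged for the table; compression is auto-detected.
cur.execute("""
    COPY INTO fact_sales
    FROM @%fact_sales
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
print(cur.fetchall())  # per-file load status returned by COPY INTO

cur.close()
conn.close()
```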
Personal Attributes:
- Strong analytical and problem-solving skills with keen attention to detail.
- Excellent communication skills, both verbal and written, with the ability to convey complex technical concepts to non-technical stakeholders.
- A proactive mindset, able to anticipate and resolve challenges before they become issues.
- A collaborative team player who can work effectively in a cross-functional environment.
Why Join Us:
- Opportunity to work with cutting-edge cloud and data technologies.
- Competitive salary and benefits package.
- Flexible work environment with opportunities for remote work.
- Career growth and professional development opportunities in a dynamic, data-driven organization.
Additional Skills
- In-memory data processing (IMDB) tools and techniques.
- Hosting and real-time data capture and messaging tools (Kafka, etc.).
- Implementation of open table formats (Apache Iceberg, etc.).