
Data Engineer/ETL Developer

Employer Active

1 Vacancy
Jobs by Experience

5 years

Job Location

Hyderabad, India

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

As a Data Engineer/ETL Developer, you will be responsible for designing, developing, and maintaining our global data warehouse and data marts on the Snowflake platform. You will play a pivotal role in architecting solutions for complex data integration challenges, ensuring the scalability, performance, and reliability of our data pipelines.

Requirements


Primary Responsibilities:


Design, develop, and implement ETL processes using Snowflake technologies to extract, transform, and load data from various source systems into our global data warehouse.

Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications for data integration and ETL workflows.

Optimize ETL processes for performance, scalability, and reliability to support the processing of large volumes of data.

Develop and maintain data models, schemas, and structures within Snowflake to support efficient data storage and retrieval.

Implement data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of data.

Work closely with data engineers, data analysts, and business stakeholders to troubleshoot issues, identify opportunities for optimization, and drive continuous improvement of our data solutions.

Document technical designs, configurations, and best practices for ETL processes and data warehouse components.

Stay updated on industry trends and emerging technologies related to data warehousing, ETL, and cloud computing, and provide recommendations for adopting new tools and techniques to enhance our data capabilities.
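The data quality responsibilities above can be sketched in a minimal Python validation routine. This is an illustrative example only: the column names (`order_id`, `amount`, `country`) and the rules are assumptions, not part of the role description.

```python
# Minimal sketch of row-level data quality checks for an ETL pipeline.
# Schema and rules (order_id, amount, country) are hypothetical examples.

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one staged row."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    amount = row.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    if row.get("country") not in {"IN", "US", "DE"}:
        errors.append("unknown country code")
    return errors

def partition_rows(rows):
    """Split rows into (clean, rejected) so bad records can be quarantined."""
    clean, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        (rejected if errs else clean).append((row, errs))
    return [r for r, _ in clean], rejected
```

In a production pipeline, the rejected rows would typically be written to a quarantine table with their error lists for later review rather than silently dropped.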

Must-Have Skills:

  1. Strong SQL knowledge.
  2. Deep understanding of data warehousing concepts, dimensional modeling, and data mart design principles.
  3. Experience with Snowflake features such as SnowSQL, Snowpipe, tasks, and stored procedures is preferred.
  4. Experience in creating complex queries and stored procedures.
  5. Proven track record of architecting scalable and performant data warehouse solutions, handling large volumes of data and complex data transformations.
  6. Proficiency in the Azure Cloud platform, including Azure Data Factory, Azure Storage, and Azure Databricks.
  7. Proficiency in scripting languages such as Python, Shell scripting, or similar for automation and data manipulation.
  8. Proficiency in job scheduling tools (Tidal).
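As a rough illustration of the scripting-for-data-manipulation skill above, an extract-transform-load step can be sketched in a few lines of Python. The CSV schema (`customer`, `amount`) and the normalization rules are assumptions for the example, not from the posting.

```python
import csv
import io

def transform(record: dict) -> dict:
    """Normalize one raw record: trim whitespace, cast amount to float."""
    return {
        "customer": record["customer"].strip().title(),
        "amount": float(record["amount"]),
    }

def run_etl(raw_csv: str) -> list[dict]:
    """Extract rows from CSV text, transform them, and return the loadable rows."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [transform(row) for row in reader]
```

In a real pipeline the extract step would read from a source system and the load step would write to a warehouse table; the separation of extract, transform, and load shown here is the part that carries over.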


Good-to-Have Skills:

  1. Familiarity with Informatica PowerCenter.
  2. Familiarity with tools like SSIS.
  3. Python/Java.


Benefits


1. Culture:

Open Door Policy: Encourages open communication and accessibility to management.

Open Office Floor Plan: Fosters a collaborative and interactive work environment.

Flexible Working Hours: Allows employees to have flexibility in their work schedules.

Employee Referral Bonus: Rewards employees for referring qualified candidates.

Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.


2. Inclusivity and Diversity:


Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.

Mandatory POSH training: Promotes a safe and respectful work environment.

3. Health Insurance and Wellness Benefits:


GMC and Term Insurance: Offers medical coverage and financial protection.

Health Insurance: Provides coverage for medical expenses.

Disability Insurance: Offers financial support in case of disability.


4. Child Care & Parental Leave Benefits:

Company-sponsored family events: Creates opportunities for employees and their families to bond.

Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.

Family Medical Leave: Offers leave for employees to take care of family members' medical needs.


5. Perks and Time-Off Benefits:

Company-sponsored outings: Organizes recreational activities for employees.

Gratuity: Provides a monetary benefit as a token of appreciation.

Provident Fund: Helps employees save for retirement.

Generous PTO: Offers more than the industry standard for paid time off.

Paid sick days: Allows employees to take paid time off when they are unwell.

Paid holidays: Gives employees paid time off for designated holidays.

Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.


6. Professional Development Benefits:

L&D with FLEX Enterprise Learning Repository: Provides access to a learning repository for professional development.

Mentorship Program: Offers guidance and support from experienced professionals.

Job Training: Provides training to enhance job-related skills.

Professional Certification Reimbursements: Assists employees in obtaining professional certifications.

Promote from Within: Encourages internal growth and advancement opportunities.





Employment Type

Full Time
