Data Engineer

Job Location

Pune - India

Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Job Details:

Position: Senior Data Engineer
Experience: 5 years
Work Mode: Hybrid (Pune)
Location: Pune preferred (relocation)
Shift: Overlap with US time zones, 8:30 pm to 11:30 pm IST

Candidates should have a strong background in streaming technologies such as Kafka, AWS MSK, AWS Kinesis, AWS Data Firehose, and Snowpipe Streaming.

Essential skills:
  • Ability to work independently
  • CloudOps expertise for setting up infrastructure using IaC tools such as Terraform and Pulumi
  • Strong experience in building ETL pipelines
  • Understanding of delta loading and CDC (change data capture) in Snowflake is also crucial (a minimal sketch follows below)
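
For context on the delta-loading and CDC item above, here is a minimal sketch of consuming a Snowflake stream and merging its changes into a target table. It assumes the snowflake-connector-python package; the connection parameters, table, stream, and column names are placeholders, not details taken from this posting.

```python
# Minimal sketch: delta loading via a Snowflake stream (CDC).
# All credentials, object names, and columns below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

# A stream on the source table exposes only rows changed since it was last
# consumed, annotated with METADATA$ACTION and METADATA$ISUPDATE.
MERGE_DELTA = """
MERGE INTO target_table t
USING (
    -- Exclude the DELETE half of update pairs emitted by the stream.
    SELECT * FROM source_table_stream
    WHERE NOT (METADATA$ACTION = 'DELETE' AND METADATA$ISUPDATE)
) s
ON t.id = s.id
WHEN MATCHED AND s.METADATA$ACTION = 'DELETE' THEN DELETE
WHEN MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
    UPDATE SET t.value = s.value, t.updated_at = s.updated_at
WHEN NOT MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
    INSERT (id, value, updated_at) VALUES (s.id, s.value, s.updated_at)
"""

cur = conn.cursor()
try:
    # Consuming the stream in DML advances its offset, so each run
    # processes only the new delta.
    cur.execute(MERGE_DELTA)
finally:
    cur.close()
    conn.close()
```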

Responsibilities:

  • Design, develop, and maintain a data platform that is accurate, secure, available, and fast.
  • Engineer efficient, adaptable, and scalable data pipelines to process data.
  • Integrate and maintain a variety of data sources: different databases, APIs, SaaS products, files, logs, events, etc.
  • Create standardized datasets to serve a wide variety of use cases.
  • Develop subject-matter expertise in tables, systems, and processes.
  • Partner with product and engineering to ensure product changes integrate well with the data platform.
  • Partner with diverse stakeholder teams, understand their challenges, and empower them with data solutions to meet their goals.
  • Perform data quality checks on data sources and automate and maintain a quality control capability (a minimal sketch follows this list).
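
As an illustration of the automated quality-control responsibility above, here is a minimal sketch of batch-level checks on a pandas DataFrame. The column names and the staleness threshold are hypothetical, not requirements from this posting.

```python
# Minimal sketch: automated data quality checks on a loaded batch.
# Column names ("id", "amount", "updated_at") are hypothetical placeholders.
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of failure messages; an empty list means the batch passed."""
    failures = []
    if df["id"].isnull().any():
        failures.append("null values found in primary key column 'id'")
    if df["id"].duplicated().any():
        failures.append("duplicate primary keys found in 'id'")
    if (df["amount"] < 0).any():
        failures.append("negative values found in 'amount'")
    if df["updated_at"].max() < pd.Timestamp.now(tz="UTC") - pd.Timedelta(days=1):
        failures.append("batch appears stale: newest 'updated_at' is over a day old")
    return failures


if __name__ == "__main__":
    batch = pd.DataFrame(
        {
            "id": [1, 2, 2],
            "amount": [10.0, -5.0, 7.5],
            "updated_at": pd.to_datetime(
                ["2024-01-01", "2024-01-02", "2024-01-02"], utc=True
            ),
        }
    )
    for failure in run_quality_checks(batch):
        print("FAILED:", failure)
```
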
Qualifications:
  • At least 5 years of experience as a Data Engineer
  • Hands-on, deep experience with star/snowflake schema design, data modeling, data pipelining, and MLOps; experience with data warehouse technologies (e.g., Snowflake, AWS Redshift)
  • Experience with AWS data pipelines (Lambda, AWS Glue, Step Functions, etc.); a minimal sketch follows this list
  • Fintech or financial services industry experience
  • Proficiency in SQL and at least one major programming language, such as Python or Java
  • Experience with data analysis tools such as Looker or Tableau
  • Experience with Pandas, NumPy, scikit-learn, and Jupyter notebooks preferred; familiarity with Git, GitHub, and JIRA
  • An eye for detail: you catch and resolve data quality issues before others do
  • Always looking for opportunities to simplify, automate tasks, and build reusable components, with good judgment in applying new technologies to solve business problems
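
For the AWS data pipelines item above, here is a minimal sketch of starting a Step Functions state machine that orchestrates Lambda or Glue ETL steps via boto3. The region, state machine ARN, and input payload are placeholders, not details from this posting.

```python
# Minimal sketch: starting an ETL orchestration run in AWS Step Functions.
# The region, state machine ARN, and input payload are placeholders.
import json

import boto3

sfn = boto3.client("stepfunctions", region_name="us-east-1")

response = sfn.start_execution(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:etl-pipeline",
    # The payload is passed to the first state (e.g., a Lambda function or Glue job).
    input=json.dumps({"load_date": "2024-01-01", "mode": "incremental"}),
)

print("Started execution:", response["executionArn"])
```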

Skills: snowflake, data warehouse technologies, jupyter notebooks, data analysis, looker, github, iac tools, aws kinesis, data quality, etl pipelines, delta loading, aws msk, snowpipe streaming, data integration, database, saas, numpy, cloudops expertise, data pipelines, kafka, sql, pulumi, java, events, restful apis, mlops, terraform, git, aws data firehose, aws, cdc, tableau, pandas, streaming technologies, aws data pipelines, scikit-learn, data platform design, jira, aws lambda, python

Employment Type

Full Time
