Job Description
Job Role: Sr. Data Engineer
Location: Bangalore / Pune
Salary: Up to 32 LPA
Notice Period: Immediate joiners / 15 days only.
(Who can join from 20 Jan)
Mandatory Key Skills: AWS, Data Warehouse, Integration, ETL, ML
Key Responsibilities:
- Develop and implement portfolio data modelling, assurance, and utilization strategies and frameworks that align with the enterprise-approved governance, data, and technology strategy and the Data COE. Lead the implementation of these strategies within the portfolio.
- Serve as a thought leader and guide in the data domain by sharing knowledge, identifying problems, patterns, and trends, and supporting the development of relevant BI and MI solutions.
- Design and implement scalable and robust processes for ingesting and transforming complex datasets.
- Contribute to the development of architectural frameworks apply architecture principles and drive the development of data architecture models within the organization.
- Design and develop data models using dimensional modelling and data vault techniques, and ensure these models meet the stated business requirements.
- Architect, train, validate, and test advanced analytics / machine learning models using enterprise-grade software engineering practices.
- Design, develop, and maintain automated, scalable data pipelines, including ETL pipelines, that improve estate performance, stability, and auditability. Monitor and troubleshoot data pipeline issues.
Minimum Qualifications/Experience:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 8 years of experience in data engineering with a focus on leadership and project management.
- Technical data warehouse experience: definition, implementation, and integration.
- Strong programming skills in Python and DBA skills (SQL/PSQL/DynamoDB or other).
- Experience with data pipeline, ETL, and reporting/analytics tools, including but not limited to any of the following combinations: (1) SSIS and SSRS, (2) ETL frameworks, (3) data conformance, (4) caching, (5) Spark, (6) AWS data builds.
- Experience with data modelling, data governance, and data quality.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Strong communication skills and ability to work in a team.
- Expertise in Machine Learning (ML) and deep learning frameworks.
- Ability to explain the thinking behind simple ML algorithms.
- Proficiency in all aspects of model architecture, data pipeline interaction, and metrics interpretation.