Job Title: DMP Operations Engineer
Location: Pune, India
No. of Positions: 1
Experience: 4-5 years
Budget: 14 LPA
Job Description:
We are seeking an experienced and detail-oriented Data Analyst to join our dynamic team. The ideal candidate will have a solid foundation in data management, SQL, and Python, and will be responsible for managing and optimizing large-scale data platforms. This role involves hands-on work with complex analytics solutions, building automation, and ensuring high-quality data performance and availability.
Key Responsibilities:
- Manage and maintain the Data Management Platform (DMP) to ensure optimal performance of large datasets.
- Develop, automate, and optimize SQL scripts and queries to support data analysis and operations.
- Build automation processes using Python to streamline data workflows and improve efficiency.
- Work with cloud-based solutions, particularly AWS, including EC2, S3, Lambda, Glue, RDS, and EMR.
- Implement and manage Kubernetes clusters using AWS EKS, ensuring smooth operation of external products.
- Monitor, troubleshoot, and resolve data-related issues, ensuring high data quality and system performance.
- Support integration projects using ETL, Kafka, and REST APIs.
- Collaborate with DevOps teams to manage GitLab Pipelines and Infrastructure as Code (IaC) deployments.
- Analyze and monitor the availability and performance of applications and data infrastructure.
- Provide L2 and L3 support for data operations, troubleshooting issues and identifying areas for improvement.
- Maintain and improve data security, governance, and compliance within the DMP.
Must-Have Skills:
- DMP Operations: Hands-on experience with managing and operating Data Management Platforms.
- SQL: Strong expertise in SQL scripting, query optimization, and working with relational databases.
- Python: Experience in Python for automation, scripting, and data analysis.
- AWS: Knowledge of AWS services such as EC2, S3, Lambda, Glue, RDS, and EMR, plus experience with account management.
- Kubernetes: Experience with AWS EKS and managing Kubernetes clusters.
- Database Performance: Understanding of large-scale database performance, troubleshooting, and optimization.
- Integration Technologies: Knowledge of ETL, Kafka, and REST API integrations.
- Version Control: Hands-on experience with GitLab and GitLab Pipelines.
- Application Monitoring: Experience with monitoring application availability and performance.
Preferred Skills:
- Experience with Amazon EMR and data processing frameworks.
- Familiarity with AWS Glue, Glue Jobs, and AWS Batch.
- Knowledge of IaC tools and practices (e.g., Terraform, CloudFormation).
- Understanding of Databricks for advanced data analytics.
- Strong problem-solving skills with a "can-do" attitude and eagerness to learn new technologies.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- 4-5 years of experience in data analysis, SQL scripting, and cloud-based data infrastructure.
Key Skills: AWS Account Management; Python; SQL scripting and querying; Large Database Performance; Data Quality and Troubleshooting; Kubernetes (AWS EKS); EC2, S3, Lambda; AWS Glue, Glue Jobs; Databricks; RDS; Amazon EMR; L2 and L3 Support; GitLab, GitLab Pipelines; ETL Integrations; Kafka, REST API; IaC (Infrastructure as Code)