Role: Big Data Engineer
Experience: 6 to 8 Years
Location: Pune
JD:
- Minimum 5 to 9 years of experience in Big Data and data-related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience with Apache Beam
- Very good knowledge of Python and SQL (Java is a plus)
- Hands-on streaming experience (Kafka, Pub/Sub, Dataflow, Flink, etc.)
- Hands-on experience with GCP Dataflow and BigQuery is required
- Strong understanding of SQL queries, joins, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB
- Experience with Data Modeling
- Knowledge of ETL techniques and frameworks
- Experience designing and implementing Big Data / distributed solutions
- Experience with Docker (Kubernetes is a plus)
- Practitioner of Agile methodology, e.g., JIRA
- Excellent communication and presentation skills
- Hands-on experience with CI/CD and Git
- Telco network exposure is preferred.
Skill Set:
- Mandatory Skills (Level):
  - SQL: Expert
  - Python: Expert
  - Apache Beam: Expert
  - GCP: Expert
  - Spark: Good
  - NoSQL DB: Expert
  - Oozie / Airflow / Cloud Composer: Good
  - CI/CD / Git: Expert
  - Data Modelling: Good
  - Stakeholder Management: Good
  - Dataflow / Apache Beam Pipelines: Good
  - End-to-End Project Development: Good
  - Data Analysis: Expert
  - Communication: Good
  - Kafka / Spark Streaming: Expert
- Good-to-Have Skills (Level):
  - Docker / Kubernetes: Good
  - Network Domain: Added Advantage
  - NATCO / Telco Exposure: Good
Remote Work: No
Employment Type: Full-time