Responsibilities:
- Experience: 3-6 years
- Good development practices
- Hands-on coder with strong experience in programming languages such as Java, Python, or Scala.
- Hands-on experience with the big data stack: Hadoop MapReduce, Spark, HBase, and Elasticsearch.
- Good understanding of programming principles and development practices such as check-in policies, unit testing, and code deployment.
- Self-starter, able to grasp new concepts and technologies and translate them into large-scale engineering developments.
- Strong experience in application development and support, integration development, and data management.
- Interface daily with customers across leading Fortune 500 companies to understand their strategic requirements.
- Hands-on coder with a good understanding of enterprise-level code.
- Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems.
- Experience in defining technical requirements, data extraction, data transformation, automating and productionizing jobs, and exploring new big data technologies within a parallel processing environment.
Qualifications:
- A track record of relevant work experience and a degree in Computer Science or a related technical discipline is required.
- Experience with functional and object-oriented programming in Java, Python, or Scala is a must.
- Hands-on experience with the big data stack: Hadoop MapReduce, Spark, HBase, and Elasticsearch.
- Good understanding of AWS services and experience working with APIs and microservices.
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists, and product managers.
- Comfort in a fast-paced startup environment.
java,python,aws,microservices,hadoop,hbase,elasticsearch