- Sr. Hadoop Administrator
- Strong experience administering big data platforms and the allied toolset; big data platform software from Hortonworks, Cloudera, or MapR
- Experience working in secured environments using technologies such as Kerberos, Knox, Ranger, KMS, encryption zones, and server SSL certificates
- Prior experience with Linux system administration
- Good experience with Hadoop capacity planning in terms of the HDFS file system and YARN resources
- Good stakeholder management skills: able to engage in formal and casual conversations and drive the right decisions
- Good troubleshooting skills: able to identify the specific service causing an issue, review logs, pinpoint problem entries, and recommend solutions in collaboration with the product vendor
- Capable of reviewing, and where necessary challenging, solutions provided by product vendors for platform optimization and root cause analysis tasks
- Experience performing product upgrades of the core big data platform, expanding clusters, and setting up high availability for core services
- Good knowledge of Hive (as a service), HBase, Kafka, and Spark
- Knowledge of basic data pipeline tools such as Sqoop, file ingestion, and DistCp, and their optimal usage patterns
- Knowledge of the various file formats and compression techniques used within HDFS, and the ability to recommend the right patterns for application use cases
- Exposure to Amazon Web Services (AWS) and Google Cloud Platform (GCP) services relevant to the big data landscape, including their usage patterns and administration
- Experience working with application teams to enable their access to clusters, with the right level of access control and logging, using Active Directory (AD) and big data tools
- Experience setting up disaster recovery solutions for clusters using platform-native tools and custom code, depending on the requirements
- Experience configuring Java heap sizes and allied parameters to ensure all Hadoop services run at their best
- Working knowledge of Hortonworks DataFlow (HDF): architecture, setup, and ongoing administration
- Significant experience with Linux shell scripting and with Python or Perl scripting
- Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.)
- Experience on projects run with Agile/DevOps as the product management framework; good understanding of the principles and the ability to work as part of pod teams
- Working knowledge of open-source RDBMSs: MySQL, PostgreSQL, MariaDB
- Ability to go under the hood of Hadoop services (Ambari, Ranger, etc.) that use a database as the backing store
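The capacity planning mentioned above typically starts from a back-of-the-envelope estimate of raw data, replication, and headroom. A minimal sketch in shell; the figures (100 TB raw, 3x replication, 25% headroom, 48 TB usable per datanode) are illustrative assumptions, not requirements of this role:

```shell
# Rough HDFS sizing sketch -- all numbers below are example assumptions.
raw_tb=100          # expected raw data, TB
replication=3       # HDFS replication factor
overhead_pct=25     # headroom for temp/intermediate data, percent
node_tb=48          # usable disk per datanode, TB

# Total storage needed = raw * replication * (1 + overhead)
needed_tb=$(( raw_tb * replication * (100 + overhead_pct) / 100 ))

# Datanode count, rounded up (ceiling division)
nodes=$(( (needed_tb + node_tb - 1) / node_tb ))

echo "required capacity: ${needed_tb} TB"
echo "datanodes needed:  ${nodes}"
```

A comparable estimate is done for YARN: total vCores and memory per node, minus OS and daemon overhead, divided into container-sized units.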
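Java heap tuning for Hadoop daemons is usually done in `hadoop-env.sh`. The variable names below (`HADOOP_HEAPSIZE`, `HADOOP_NAMENODE_OPTS`, `HADOOP_DATANODE_OPTS`) are standard in Hadoop 2.x distributions, but the sizes and GC flags are illustrative assumptions to be tuned per cluster:

```shell
# hadoop-env.sh (fragment) -- example heap settings only; tune per
# cluster size, workload, and observed GC behaviour.
export HADOOP_HEAPSIZE=4096   # default daemon heap, MB

# NameNode heap scales with the number of HDFS blocks/files it tracks.
export HADOOP_NAMENODE_OPTS="-Xms8g -Xmx8g -XX:+UseG1GC ${HADOOP_NAMENODE_OPTS}"

# DataNode heap is typically smaller; fixed Xms avoids resize pauses.
export HADOOP_DATANODE_OPTS="-Xms4g -Xmx4g ${HADOOP_DATANODE_OPTS}"
```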
- To be successful in this role, you should meet the following requirements:
- Communication and interpersonal skills
- Effective time management
- Well versed in ITIL concepts, with a specific understanding of service operations activities
- Ability to work in an infrastructure-critical environment and understand how technology adds value to the business and, ultimately, the end customer
- Ability to work effectively as a team player
- Ability to work across cultures