Hadoop Administrator/Big Data HIGHLIGHTS
Location: Scottsdale, AZ
Position Type: 6 Month Contract to Hire
Hourly / Salary: BOE
Residency Status: US Citizen or Green Card Holder ONLY
Our client is looking for a Hadoop Administrator to join their team!
Job Description
This position designs, develops, tests, and maintains infrastructure as code, CI/CD patterns, configuration management, and containerized product applications.
Skill Set
- Certification in Terraform, AWS, and/or Kubernetes
- Strong administration knowledge of technologies (Hadoop, HBase, Kafka, Rancher) within the Cloudera ecosystem
- Strong administration knowledge of installing and tuning the model lifecycle management software DataRobot
- PySpark job tuning
- Chef and/or Ansible
Essential Functions
- Design, develop, document, test, and debug new and existing configuration management patterns and infrastructure as code, following the SDLC model.
- Contribute to requirements analysis and design a model for infrastructure and application flow.
- Provide input in design meetings and analyze user needs to determine technical requirements.
- Write technical specifications (based on conceptual design and business requirements).
- Identify and evaluate new technologies for implementation.
- Analyze results, failures, and bugs to determine the causes of errors, and tune the automation pipeline to fix problems and achieve the desired outcome.
- Consult with end-user and partner teams to prototype, refine, test, and debug programs to meet their needs. Collaborate to identify and solve issues within established timelines.
- Proactively monitor the health of environments, fix any issues, and improve environment performance.
- Assist new staff in getting up to speed on team policies, procedures, use cases, and best practices.
- Support and maintain products and add new features.
- Support UAT/production applications as needed.
- Participate in and follow change management processes for change implementation.
- Support the company’s commitment to risk management and protecting the integrity and confidentiality of systems and data.
Minimum Qualifications
- Education and/or experience typically obtained through completion of a Bachelor's degree in Computer Science or equivalent certifications.
- Minimum of 5 years of prior DevOps, software engineering, or related experience.
- Must be able to work different schedules as part of an on-call rotation.
- Ability to work on multiple projects, and a general understanding of software environments and network topologies.
- Working and troubleshooting experience with Linux-based applications.
- Demonstrable technical design skills.
- Demonstrable experience in modern application design.
- Solid understanding of an iterative software development process.
- Ability to use Unix/Linux command-line programs and create/edit scripts.
- Knowledge of one or more configuration management tools: Chef, Ansible, Puppet.
- Knowledge of one or more infrastructure-as-code, containerization, and orchestration tools: Terraform, Docker, and Kubernetes.
- Knowledge of one of the cloud infrastructure providers: AWS, GCP, or Azure.
- Must be able to pass a background check and drug screen.
Preferred
- Certification in Terraform, AWS, and/or Kubernetes
- Strong administration knowledge of technologies (Hadoop, HBase, Kafka, Rancher) within the Cloudera ecosystem
- Strong administration knowledge of installing and tuning the model lifecycle management software DataRobot
- PySpark job tuning
- Experience supporting a data science team
- Chef and/or Ansible
"We are GTN – The Go To Network"