Short Description
Nutanix is hiring a Hadoop Admin who will be responsible for capacity planning and estimating the requirements for scaling the capacity of the Hadoop cluster.
Job Description
- Deploying & maintaining Hadoop cluster(s), adding and removing nodes using cluster monitoring tools like Ambari
- Configuring & ensuring the high availability of the Hadoop cluster
- Implementing, managing and administering the overall Hadoop infrastructure
- Managing and monitoring the workload(s) running in the Hadoop cluster
- Performing capacity planning and estimating the requirements for scaling the capacity of the Hadoop cluster
- Ensuring the Hadoop cluster is up and running at all times
- Monitoring cluster connectivity and performance
- Managing and reviewing Hadoop log files
- Performing data backup and recovery
- Managing resources and security
- Exploring new technologies in the Big Data ecosystem based on business use cases
Requirements
- At least 5 years of experience building and maintaining enterprise-grade Hadoop clusters
- Working experience with Hadoop ecosystem components such as HDFS, Hive, Pig, Flume, Sqoop, ZooKeeper, MapReduce, Kafka, YARN, and HBase
- Knowledge of UNIX/Linux
- Knowledge of networking, CPU, memory, and storage
- Good understanding of OS concepts, process management, and resource scheduling
- Knowledge of automation tools like Puppet or Chef
- Working knowledge of cluster monitoring tools like Ambari
- Knowledge of shell scripting
- Familiarity with various components in the Hadoop ecosystem, such as Apache Spark
- Knowledge of programming languages like Java or scripting languages like Python
- Experience building and supporting a Big Data platform and other applications on Hadoop