3,678 Hadoop Administrator jobs in India
Hadoop Administrator
Posted today
Job Description
- Manage, configure, and administer Hadoop ecosystem components including HDFS, YARN, Zookeeper, Hue, Ranger, Spark, and Hive.
- Perform cluster monitoring, tuning, troubleshooting, and performance optimization across on-prem and cloud environments.
- Implement security practices including Kerberos authentication, LDAP integration, and encryption (at rest/in transit).
- Set up and maintain high availability, backup, and disaster recovery solutions.
- Administer and optimize AWS EMR clusters, including scaling strategies and cost optimization.
- Work with AWS services: Glue (ETL pipelines), Athena (query optimization), S3 (data lake management), and IAM (access and policy management).
- Develop and maintain automation scripts (Shell, Python) and Infrastructure-as-Code solutions (CloudFormation); see the sketch following this list.
- Support the Hive Metastore, query optimization, and Spark performance tuning.
- Oversee logging, monitoring, and auditing practices using tools such as CloudWatch and other monitoring solutions.
- Collaborate with data engineering and analytics teams to support ongoing projects and optimize workflows.
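As an illustration of the automation scripting mentioned above, the following is a minimal Python sketch (not part of the original posting) that uses boto3 to flag long-running EMR clusters for cost review; the region and the age threshold are assumptions.

```python
# Minimal sketch: list running EMR clusters and flag long-running ones.
# Assumes boto3 is installed and AWS credentials are configured;
# the region and the age threshold are illustrative only.
from datetime import datetime, timezone

import boto3

AGE_THRESHOLD_HOURS = 12  # assumption: flag clusters running longer than this


def long_running_emr_clusters(region="ap-south-1"):
    emr = boto3.client("emr", region_name=region)
    flagged = []
    paginator = emr.get_paginator("list_clusters")
    for page in paginator.paginate(ClusterStates=["RUNNING", "WAITING"]):
        for cluster in page["Clusters"]:
            created = cluster["Status"]["Timeline"]["CreationDateTime"]
            age_hours = (datetime.now(timezone.utc) - created).total_seconds() / 3600
            if age_hours > AGE_THRESHOLD_HOURS:
                flagged.append((cluster["Id"], cluster["Name"], round(age_hours, 1)))
    return flagged


if __name__ == "__main__":
    for cluster_id, name, age in long_running_emr_clusters():
        print(f"{cluster_id}\t{name}\trunning for {age}h")
```

A script like this could be scheduled from cron or a CloudWatch rule, with the alerting mechanism depending on the environment.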
Requirements
- Proven hands-on experience in Hadoop Administration (on-prem & AWS).
- Strong knowledge of Linux system administration.
- Proficiency in Shell scripting (Bash) and Python scripting.
- Experience with cluster management, scaling, and troubleshooting.
- Expertise in AWS EMR, Glue, Athena, S3, IAM, and infrastructure automation (CloudFormation).
- Solid understanding of security frameworks and data governance.
- Strong problem-solving, analytical, and communication skills.
Hadoop Administrator
Posted today
Job Description
• Expertise in cluster maintenance using tools such as IBM BigInsights, Cloudera/Hortonworks Ambari, Ganglia, etc.
• Good at performance tuning of Hadoop clusters and Hadoop MapReduce routines.
• Experience with Hive, HBase, Sqoop, RDBMS, and the wider Hadoop ecosystem.
• Experience in troubleshooting, backup, and recovery.
• Expertise in creating new MongoDB databases, instances, and database objects/views (see the sketch after this list).
• Manage and review Hadoop log files.
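As a hedged illustration of the MongoDB administration bullet above (not part of the original posting), the following pymongo sketch creates a database, a collection, and a filtered read-only view; the connection URI, database, collection, and view names are assumptions.

```python
# Minimal sketch: create a MongoDB database, collection, and view with pymongo.
# The URI, database name, collection name, and view pipeline are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumption: local instance
db = client["ops_metrics"]                         # database is created lazily on first write

# Create the collection explicitly if it does not already exist.
if "job_runs" not in db.list_collection_names():
    db.create_collection("job_runs")

# Create a read-only view over the collection, filtered to failed runs.
if "failed_job_runs" not in db.list_collection_names():
    db.create_collection(
        "failed_job_runs",
        viewOn="job_runs",
        pipeline=[{"$match": {"status": "FAILED"}}],
    )

db["job_runs"].insert_one({"job": "daily_etl", "status": "FAILED"})
print(list(db["failed_job_runs"].find({}, {"_id": 0})))
```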
Big Data/Hadoop Administrator
Posted 1 day ago
Job Description
At ClearTrail, work is more than ‘just a job’. Our calling is to develop solutions that empower those dedicated to keeping their people, places and communities safe. For over 23 years, law enforcement & federal agencies across the globe have trusted ClearTrail as their committed partner in safeguarding nations & enriching lives. We are envisioning the future of intelligence gathering by developing artificial intelligence and machine learning-based lawful interception & communication analytics solutions that solve the world’s most challenging problems.
Role- Big Data/Hadoop Administrator
Location – Indore, MP
Experience Required – 3 to 5 Years
What is your Role?
You will work in a multi-functional role combining expertise in system and Hadoop administration. You will work in a team that frequently interacts with customers on various aspects of technical support for deployed systems. You will be deputed at customer premises to assist customers with issues related to system and Hadoop administration, and you will interact with the QA and Engineering teams to coordinate issue resolution within the SLA promised to the customer.
What will you do?
- Deploying and administering the Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems.
- Installing the Linux operating system and configuring networking.
- Writing Unix shell and Ansible scripts for automation.
- Maintaining core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, and HBase.
- Taking care of the day-to-day running of Hadoop clusters using Ambari, Cloudera Manager, or other monitoring tools, ensuring that the clusters are up and running at all times (see the sketch after this list).
- Maintaining HBase clusters and performing capacity planning.
- Maintaining Solr clusters and performing capacity planning.
- Working closely with the database, network, and application teams to ensure that all big data applications are highly available and performing as expected.
- Managing the KVM virtualization environment.
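As a rough sketch of the day-to-day cluster monitoring described above (not part of the original posting), the Python script below polls the YARN ResourceManager REST API for basic cluster health; the ResourceManager URL and the exit-code convention are assumptions.

```python
# Minimal sketch: poll the YARN ResourceManager REST API for cluster health.
# The ResourceManager URL and the unhealthy-node exit convention are assumptions.
import sys

import requests

RM_URL = "http://resourcemanager.example.com:8088"  # assumption: default RM web port


def cluster_metrics(rm_url=RM_URL):
    # /ws/v1/cluster/metrics is a standard ResourceManager REST endpoint.
    resp = requests.get(f"{rm_url}/ws/v1/cluster/metrics", timeout=10)
    resp.raise_for_status()
    return resp.json()["clusterMetrics"]


if __name__ == "__main__":
    m = cluster_metrics()
    print(f"active nodes:    {m['activeNodes']}")
    print(f"unhealthy nodes: {m['unhealthyNodes']}")
    print(f"apps running:    {m['appsRunning']}")
    # Exit non-zero if any node is unhealthy, so cron or Ansible can alert on it.
    sys.exit(1 if m["unhealthyNodes"] > 0 else 0)
```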
Must Have Skills -
- Technical domain: Linux administration, Hadoop infrastructure and administration, Solr, configuration management (Ansible, etc.).
- Linux administration.
- Experience in Python and shell scripting.
- Deploying and administering the Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems.
- Knowledge of Hadoop core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, and Spark.
- Knowledge of HBase clusters.
- Working knowledge of Solr and Elasticsearch.
Good to Have Skills:
- Experience with networking concepts.
- Experience with virtualization technologies (KVM, OLVM).