37,117 Hadoop jobs in India
Hadoop Admin
Posted today
Job Description
Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialist team.
Do
- Oversee and support process by reviewing daily transactions on performance parameters
- Review performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/ escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to the TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triages to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per targets
- Inform client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product specific and any other trainings per client requirements/recommendations
- Identify and document most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Deliver
Performance Parameter and Measure:
1. Process – No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, Customer feedback, NSAT/ESAT
2. Team Management – Productivity, efficiency, absenteeism
3. Capability development – Triages completed, Technical Test performance
Experience: 5-8 Years
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Hadoop Admin
Posted today
Job Description
Work Location: Pune (Work-from-office)
Notice Period: Immediate to 30 Days
Experience: 2+ Years
Must Have Skills:
- Strong Hadoop Administration – Hands-on experience managing and supporting Hadoop clusters (Cloudera CDP/CDH, Hortonworks HDP).
- Linux/Unix System Administration – Expertise in RedHat, CentOS, Ubuntu, with deep knowledge of OS internals.
- Cluster Security & Encryption – Setting up Kerberos, data-at-rest/data-in-transit encryption, Ranger, and related security frameworks.
- Big Data Ecosystem Tools – Experience with YARN, HDFS, Zookeeper, Hive, Spark, HBase, and Atlas.
- Scripting & Automation – Proficiency in Shell/Bash/Python for automation, configuration management, and troubleshooting.
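For illustration of the scripting and automation expectations listed above, here is a minimal sketch of a Python health check that wraps standard Hadoop CLI commands; it assumes the hdfs CLI is on the PATH of the node where it runs, and the checks shown are placeholders rather than a complete monitoring solution:

```python
#!/usr/bin/env python3
"""Minimal HDFS health-check sketch (illustrative only).

Assumes the `hdfs` CLI is available and the script runs as a user
allowed to query the cluster; output formats can vary slightly
between Hadoop versions.
"""
import re
import subprocess
import sys

def run(cmd):
    """Run a shell command and return its stdout, exiting on failure."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        sys.exit(f"command failed: {' '.join(cmd)}\n{result.stderr}")
    return result.stdout

def main():
    # Safe mode should normally be OFF on a healthy cluster.
    safemode = run(["hdfs", "dfsadmin", "-safemode", "get"])
    if "OFF" not in safemode:
        print("WARNING: NameNode is in safe mode")

    # Parse the dfsadmin report for under-replicated blocks.
    report = run(["hdfs", "dfsadmin", "-report"])
    under = re.search(r"Under replicated blocks:\s*(\d+)", report)
    if under and int(under.group(1)) > 0:
        print(f"WARNING: {under.group(1)} under-replicated blocks")
    else:
        print("No under-replicated blocks reported")

if __name__ == "__main__":
    main()
```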
If you believe your skills align with this role, we encourage you to apply directly.
If you know someone who would be a strong fit, please feel free to refer or share this opportunity with them.
Kindly do not reapply if you have applied within the last 3 months.
Hadoop Administrator
Posted today
Job Description
- Installation of Spark clusters in cloud and on-premise environments; assist with offline installation
- Upgrade from HDInsight to a Spark cluster in the Azure cloud
- Upgrade from Cloudera to a Spark cluster on-premise
- Implementation of autoscaling for the Spark cluster
- Troubleshoot issues, providing expert-level support; resolve issues or escalate to the Systems team
- Architect Spark/Hadoop solutions and integrate them with applications to meet business needs.
- Establish best practices for administering and securing Spark/Hadoop environments.
- Monitor application and system performance, implementing evaluation and optimization techniques.
- Develop and implement job automation using scripts, tools, and Cloudera features.
- Implement solutions to meet specific customer or business needs.
- Research, evaluate and provide guidance in the selection of 3rd party applications compatible with Spark/Hadoop/Cloudera environments.
- Maintain a roadmap for Hadoop and Cloudera systems and assist leadership in setting priorities.
- Provide timely and accurate status reporting.
- Manage projects to build new workflows, functionality, and reporting capabilities in Hadoop/Cloudera
- Extensive experience with Hadoop, including Cloudera Administration.
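As a rough sketch of the job-automation duties above, the following Python wrapper around spark-submit retries a failed batch job; the application path, YARN queue, and retry policy are assumptions, and a real cluster would usually drive this from Oozie, Airflow, or cron rather than a standalone script:

```python
#!/usr/bin/env python3
"""Illustrative spark-submit wrapper with simple retry logic."""
import subprocess
import time

SPARK_SUBMIT = [
    "spark-submit",
    "--master", "yarn",
    "--deploy-mode", "cluster",
    "--queue", "etl",                    # assumed queue name
    "/opt/jobs/daily_aggregation.py",    # assumed application path
]

def submit_with_retries(retries=2, wait_seconds=300):
    """Submit the job, retrying a couple of times on non-zero exit."""
    for attempt in range(1, retries + 2):
        print(f"Attempt {attempt}: launching Spark job")
        result = subprocess.run(SPARK_SUBMIT)
        if result.returncode == 0:
            print("Job completed successfully")
            return 0
        print(f"Job failed with exit code {result.returncode}")
        if attempt <= retries:
            time.sleep(wait_seconds)
    return result.returncode

if __name__ == "__main__":
    raise SystemExit(submit_with_retries())
```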
Hadoop Admin
Posted 2 days ago
Job Description
Must-Have
- 4 to 7 years of working experience as an administrator/developer in the Cloudera Hadoop distribution ecosystem, namely CDP Data Science (Data Warehouse (DW), Data Engineering (DE), Machine Learning (ML)), HDFS, Ozone, Iceberg, YARN, Impala, Spark, Java, Oozie, Kerberos/Active Directory/LDAP, etc., in the capacity of a system administrator/platform engineer/developer
- 5+ years of work experience in Unix shell scripting
- Monitor and optimize the performance of Hadoop clusters and components
- Ability to analyze and troubleshoot Hadoop platform issues and services and drive them to resolution within the SLAs
- Able to understand and perform proactive monitoring of Hadoop clusters to contribute to platform stability
Good-to-Have
1. Experience in CI/CD tools, application hosting, containerization concepts
2. Exposure to cloud computing and object storage services/platforms
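A minimal sketch of the proactive monitoring expectation above, assuming an unsecured NameNode web UI on the default Hadoop 3.x port (9870); metric names can vary slightly between versions, and a Kerberized cluster would additionally need SPNEGO authentication, which is omitted here:

```python
#!/usr/bin/env python3
"""Sketch of proactive NameNode monitoring via the JMX endpoint."""
import json
import urllib.request

# Assumed host; 9870 is the default NameNode web UI port on Hadoop 3.x.
NAMENODE_JMX = (
    "http://namenode.example.com:9870/jmx"
    "?qry=Hadoop:service=NameNode,name=FSNamesystemState"
)

def fetch_metrics(url=NAMENODE_JMX):
    """Return the FSNamesystemState bean as a dict of metrics."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        payload = json.load(resp)
    return payload["beans"][0]

def check(metrics):
    """Print simple alerts for common replication and capacity problems."""
    # Defensive .get() calls because metric names vary across versions.
    if metrics.get("UnderReplicatedBlocks", 0) > 0:
        print(f"ALERT: {metrics['UnderReplicatedBlocks']} under-replicated blocks")
    if metrics.get("NumDeadDataNodes", 0) > 0:
        print(f"ALERT: {metrics['NumDeadDataNodes']} dead DataNodes")
    used_pct = 100.0 * metrics.get("CapacityUsed", 0) / max(metrics.get("CapacityTotal", 1), 1)
    print(f"HDFS capacity used: {used_pct:.1f}%")

if __name__ == "__main__":
    check(fetch_metrics())
```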
Exp Range: 5 to 10 Years
Location: Chennai/Hyderabad/Bangalore/Mumbai/Indore
Interview Type: Weekend Virtual Drive
Date: 12-Oct-2025
Day: Sunday
Hadoop Developer
Posted 10 days ago
Job Description
Work Location: Pune (Work-from-office)
⏳ Notice Period: Immediate to 30 Days
Experience: 2+ Years
Must Have Skills:
- Strong Hadoop Administration – Hands-on experience managing and supporting Hadoop clusters (Cloudera CDP/CDH, Hortonworks HDP).
- Linux/Unix System Administration – Expertise in RedHat, CentOS, Ubuntu, with deep knowledge of OS internals.
- Cluster Security & Encryption – Setting up Kerberos, data-at-rest/data-in-transit encryption, Ranger, and related security frameworks.
- Big Data Ecosystem Tools – Experience with YARN, HDFS, Zookeeper, Hive, Spark, HBase, and Atlas.
- Scripting & Automation – Proficiency in Shell/Bash/Python for automation, configuration management, and troubleshooting.
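As an illustration of the Kerberos side of the cluster-security skills above, a small helper that keeps a service account's ticket fresh using the standard MIT Kerberos tools (klist, kinit); the principal and keytab path are hypothetical placeholders:

```python
#!/usr/bin/env python3
"""Sketch of a Kerberos ticket check/renewal helper for service accounts."""
import subprocess

PRINCIPAL = "hdfs-admin@EXAMPLE.COM"          # assumed principal
KEYTAB = "/etc/security/keytabs/hdfs.keytab"  # assumed keytab path

def has_valid_ticket():
    """klist -s exits 0 only if a valid, non-expired ticket exists."""
    return subprocess.run(["klist", "-s"]).returncode == 0

def renew_ticket():
    """Obtain a fresh TGT non-interactively from the keytab."""
    subprocess.run(["kinit", "-kt", KEYTAB, PRINCIPAL], check=True)

if __name__ == "__main__":
    if not has_valid_ticket():
        print("No valid Kerberos ticket; running kinit from keytab")
        renew_ticket()
    else:
        print("Kerberos ticket is still valid")
```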
If you believe your skills align with this role, we encourage you to apply directly.
If you know someone who would be a strong fit, please feel free to refer or share this opportunity with them.
Hadoop Developer
Posted 10 days ago
Job Description
Required Information and Details:
1. Role: Developer / Module Lead
2. Required Technical Skill Set: Hadoop
3. Desired Experience Range: 4-8 Years
4. Location of Requirement: Chennai, Hyderabad, Bangalore, Mumbai, Indore, Ahmedabad
5. Desired Competencies (Technical/Behavioral):
- Must-Have: Extensive experience and hands-on implementation experience with Spark, Scala, Impala, Hive, Kafka, Sqoop
- Good-to-Have: Extensive knowledge of DataFrames, Datasets, and RDDs
Responsibility of / Expectations from the Role:
1. Extensive experience with the design and implementation of Big Data solutions, preferably using the Cloudera distribution
2. Working knowledge of Cloudera and Apache tools/utilities
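As a hedged illustration of the Spark and DataFrame skills listed above (the posting names Scala, but the sketches on this page use Python/PySpark), here is a minimal batch job against an assumed Hive table; table, column, and path names are invented:

```python
"""Minimal PySpark sketch of DataFrame, RDD, and Hive usage."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-daily-summary")   # assumed application name
    .enableHiveSupport()               # lets Spark read Hive tables
    .getOrCreate()
)

# Read an assumed Hive table and aggregate with the DataFrame API.
orders = spark.table("sales.orders")
daily = (
    orders
    .where(F.col("order_status") == "COMPLETE")
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# The same data is reachable as an RDD when lower-level control is needed.
print(daily.rdd.take(5))

# Write back as partitioned Parquet for downstream Impala/Hive queries.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "/warehouse/sales/daily_summary"
)

spark.stop()
```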
Hadoop Admin
Posted 10 days ago
Job Description
Hadoop Administrator JD
5-7 years of experience in Hadoop engineering, with working experience in Python, Ansible, and DevOps methodologies
Extensive experience with CDP/HDP cluster and server builds, including control nodes, worker nodes, and edge nodes
Primary Skills: Hadoop, Hortonworks Data Platform, Cloudera Distribution of Hadoop, Linux, Python, Ansible, YAML scripting, and Kubernetes
Secondary Skills: Other DevOps tools
Hadoop Developer
Posted 10 days ago
Job Description
Dear Applicants,
Greeting from TCS!
I hope your day is going well!
At Tata Consultancy Services, we're looking for someone like you for an Associate Consultant position. This opportunity aligns nicely with your experience in creating immersive technologies and innovative solutions.
TCS has urgent requirements for "Hadoop Developer"
Education: Minimum 15 years of full-time education (10th, 12th and Graduation)
Skills: HDFS, Hive, Spark, Sqoop, Flume, Oozie, Unix Script, Autosys
Role: Hadoop Developer
Exp: <5yrs
Location: Mumbai / Chennai / Hyderabad / Bangalore
---
JD~
Strong hands-on skills required on:
- Good understanding of Hadoop concepts, including the file system and MapReduce.
- Hands-on experience with the Spark framework, Unix scripting, Hive queries, and writing UDFs in Hive. Theoretical knowledge and POCs alone will not suffice.
- Good knowledge of the Software Development Life Cycle and Project Development Lifecycle.
- The associate should be able to work independently and have strong debugging skills in both Hive and Spark, experience developing large-scale systems, experience with debugging and performance tuning, excellent software design skills, communication skills, and the ability to work with client partners.
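The JD above asks for Hive queries and UDF writing; native Hive UDFs are usually written in Java, so as a Python analogue the sketch below registers a UDF for use in Spark SQL against an assumed Hive table (the table, column, and function names are invented):

```python
"""Python UDF registered for Spark SQL (illustrative analogue of a Hive UDF)."""
from pyspark.sql import SparkSession
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-demo").enableHiveSupport().getOrCreate()

def mask_account(account_number):
    """Keep only the last four characters of an account number."""
    if account_number is None:
        return None
    return "X" * max(len(account_number) - 4, 0) + account_number[-4:]

# Register the Python function so SQL statements can call it by name.
spark.udf.register("mask_account", mask_account, StringType())

# Assumed Hive table; the UDF is used like any built-in SQL function.
masked = spark.sql(
    "SELECT customer_id, mask_account(account_number) AS masked_acct "
    "FROM bank.accounts"
)
masked.show(5)

spark.stop()
```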
---
Let's connect soon!
Hadoop Developer
Posted today
Job Description
Job role - Hadoop developer
Experience - 2 to 10 years
Location - PAN INDIA
This job opportunity is with top leading MNCs (permanent role).
Role: Hadoop Developer / Module Lead
Technical skills: Hadoop, Spark, Scala, Impala, Hive, Kafka
Must-Have:
- Extensive experience and hands-on implementation experience with Spark, Scala, Impala, Hive, Kafka, Sqoop
Good-to-Have:
- Extensive knowledge of DataFrames, Datasets, and RDDs.
Hadoop Developer
Posted today
Job Description
Title: Hadoop Developer
Location: Belapur, Navi Mumbai
Time Zone: IST (Indian Standard Time)
We are currently hiring a Hadoop Developer with 2+ years of hands-on experience in Big Data technologies. The ideal candidate should have strong expertise in the Hadoop ecosystem, data processing frameworks, and performance optimization techniques, along with solid programming skills.
Key Responsibilities:
- Design, develop, and maintain scalable Big Data applications using Hadoop ecosystem tools.
- Work with HDFS, Hive, Spark, Pig, Sqoop, and related technologies for large-scale data processing.
- Write efficient MapReduce or Spark jobs for data transformation and analysis.
- Integrate structured, semi-structured, and unstructured data from multiple sources.
- Optimize queries and improve performance of data pipelines.
- Collaborate with data engineers, analysts, and business stakeholders to deliver end-to-end solutions.
- Ensure data quality, security, and reliability within the Hadoop environment.
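As a rough sketch of the data-integration and transformation responsibilities above, here is a small PySpark job joining a structured CSV extract with semi-structured JSON events; all paths, column names, and formats are assumptions, and a production pipeline would typically parameterize them and be scheduled via Oozie or a similar orchestrator:

```python
"""Sketch of a Spark job integrating structured and semi-structured data."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-events-join").getOrCreate()

# Structured source: CSV export with a header row (assumed path).
customers = spark.read.option("header", True).csv("/data/raw/customers.csv")

# Semi-structured source: newline-delimited JSON event logs (assumed path).
events = spark.read.json("/data/raw/events/")

# Join, derive a simple daily metric, and write partitioned Parquet to HDFS.
enriched = (
    events.join(customers, on="customer_id", how="left")
          .withColumn("event_date", F.to_date("event_timestamp"))
          .groupBy("event_date", "customer_segment")
          .agg(F.count("*").alias("event_count"))
)

enriched.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/curated/event_counts"
)

spark.stop()
```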
Required Skills & Qualifications:
- 2+ years of experience working with Hadoop ecosystem components (HDFS, Hive, Pig, Spark, Sqoop, Oozie, Flume).
- Strong knowledge of SQL and experience with relational databases.
- Proficiency in one or more programming languages: Java, Python, or Scala.
- Hands-on experience with Apache Spark for both batch and streaming data processing.
- Familiarity with data ingestion techniques and ETL pipelines.
- Solid understanding of distributed systems and cluster computing.
- Excellent problem-solving and debugging skills.
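To illustrate the streaming half of the batch-and-streaming requirement above, a minimal Spark Structured Streaming sketch reading from Kafka; broker addresses, topic, and checkpoint path are assumptions, and running it requires the spark-sql-kafka connector package matching the Spark version:

```python
"""Sketch of Spark Structured Streaming from an assumed Kafka topic."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clickstream-streaming").getOrCreate()

# Read a stream of raw events from Kafka.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # assumed brokers
    .option("subscribe", "clickstream")                  # assumed topic
    .load()
)

# Kafka delivers bytes; cast the value and count events per minute.
counts = (
    raw.selectExpr("CAST(value AS STRING) AS value", "timestamp")
       .groupBy(F.window("timestamp", "1 minute"))
       .count()
)

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/clickstream")  # assumed path
    .start()
)
query.awaitTermination()
```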