2,233 Big Data jobs in India
Big Data
Posted today
Job Description
Work Location: Pune (Work-from-Office mode)
Notice Period: Immediate to 30 Days
Experience: 2+ years
Must Have skills:
- Hadoop Administration (Cloudera CDP/CDH, Hortonworks HDP) – hands-on experience in setup, configuration, and support.
- Linux/Unix System Administration – strong knowledge of OS internals, shell/bash scripting, and automation.
- Security Implementation – Kerberos authentication, Ranger, SSL, encryption (data-at-rest and in-transit).
- Big Data Ecosystem Management – experience with YARN, HDFS, Hive, Spark, HBase, Zookeeper.
- Cloud Experience (AWS) – setting up and managing Hadoop platforms on AWS cloud-native services.
If you are a Hadoop Admin/Big Data Engineer with strong experience in Hadoop services and are available to join within 30 days, please apply.
Note:
Candidates who have applied for this position in the last 3 months will not be reconsidered, so kindly refrain from reapplying.
Big Data
Posted today
Job Description
Some careers shine brighter than others.
If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer
In this role, you will:
- Design and develop robust ETL processes using Pentaho Data Integration (PDI) for data warehouse and data lake population.
- Be ready to work UK shifts and provide on-call production support overnight or on weekends as needed.
- Ensure data accuracy, integrity, and completeness across the pipeline.
- Optimize data flows for performance, cost, and efficiency.
- Ensure deliveries are aligned with Agile methodologies (e.g. Scrum, Kanban).
Requirements
To be successful in this role, you should meet the following requirements:
- Big Data: Hive, HDFS
- ETL: Pentaho Data Integration (PDI)
- Scripting: Shell/Bash scripting
- Scheduling: Control-M
- Version Control: Git
- Soft Skills: Excellent communication, team collaboration, and problem-solving skills
HSBC is committed to building a culture where all employees are valued and respected and their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India
Big Data
Posted today
Job Description
Greetings from Teknikoz
Experience : 5-7 Years
Skills Required: Big Data and Hadoop Ecosystems, Python, PySpark
Specific activities required:
- Lead the implementation of infrastructure via code and provide strategic advice/recommendations for the development and advancement of Microsoft Azure technologies based on previous research on trends in public cloud environments.
- Integrate and automate the delivery of standardised Azure deployments, in conjunction with orchestration products such as Azure DevOps with Terraform, Azure ARM templates and other modern deployment technologies.
- Act as the escalation point for level three Azure-related issues, providing technical support and fault resolution, as well as guidance and mentoring of operational run teams, both locally and remotely throughout the organization.
- Ensure the appropriate gathering of business requirements and their translation into appropriate solutions.
- Maintain and deliver all related documentation for the design, development, build, and deployment methods used, ensuring the source of control of all applicable code is stored and managed properly.
- Provide guidance and assistance to all support teams.
- Provide complementary support and leadership in hardening and security testing.
KEY COMPETENCIES:
- Tertiary qualifications in a relevant discipline, with relevant certifications in Microsoft Azure.
- Experience as a data engineer on Azure Cloud.
- Good knowledge of PySpark and Azure Data Factory.
- Comprehensive knowledge of public cloud environments and industry trends.
- Significant experience supporting, designing, and developing public cloud solutions via Infrastructure as Code, including Terraform and ARM.
- Extensive DevOps experience.
- The ability to communicate effectively and work collaboratively with diverse team members.
- Demonstrated experience in security hardening and testing.
- Proven ability in creating and updating accurate documentation.
- Excellent verbal and written communication skills.
- Willingness and flexibility to work outside of standard office hours, and on weekends as required.
Big Data
Posted today
Job Description
Why Join Us?
Are you inspired to grow your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It's happening right here at Iris Software.
About Iris Software
At Iris Software, our vision is to be our client's most trusted technology partner, and the first choice for the industry's top professionals to realize their full potential. With over 4,300 associates across India, U.S.A, and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.
Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.
Working at Iris
Be valued, be inspired, be your best.
At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow.
Our employee value proposition (EVP) is about "Being Your Best" – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.
Job Description
Must have:
- 7+ years of dev experience
- Big data developer with exposure to Java
- Spark/Scala/Hive experience
- Good background in SQL
Nice To Have
- Familiarity/Experience with Cloud
Mandatory Competencies
Big Data - Hive
Big Data - Spark
Behavioral - Communication and collaboration
Database - Database Programming - SQL
Programming Language - Scala
Perks And Benefits For Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment.
Join us and experience the difference of working at a company that values its employees' success and happiness.
Big data
Posted today
Job Description
Role & Responsibilities
A day in the life of an Infoscion
•As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction.
•You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain.
•You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews.
•You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes.
•You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.
Additional Responsibilities:
•Knowledge of more than one technology
•Basics of Architecture and Design fundamentals
•Knowledge of Testing tools
•Knowledge of agile methodologies
•Understanding of Project life cycle activities on development and maintenance projects
•Understanding of one or more Estimation methodologies, Knowledge of Quality processes
•Basics of business domain to understand the business requirements
•Analytical abilities, Strong Technical Skills, Good communication skills
•Good understanding of the technology and domain
•Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
•Awareness of latest technologies and trends
•Excellent problem solving, analytical and debugging skills
Technical and Professional Requirements:
- Primary skills: Technology -> Functional Programming -> Scala (Big Data)
Preferred Skills:
Technology->Functional Programming->Scala
Preferred candidate profile
Big Data Engineer
Posted 7 days ago
Job Description
**Responsibilities:**
+ Design and develop Big Data applications/pipelines using Spark, Scala, SQL, PySpark, Python, and Java
+ Consult with users, clients, and other technology groups on issues, and recommend programming solutions, install, and support customer exposure systems
+ Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
**Qualifications:**
+ 4-8 years of experience in software development, building large scale distributed data processing systems or large-scale applications
+ Design and development of Big Data solutions, with at least one end-to-end implementation.
+ Strong hands-on experience with the following technical skills: Apache Spark, Scala/Java, XML/JSON/Parquet/Avro, SQL, Linux, Hadoop ecosystem (HDFS, Spark, Impala, Hive, HBase, etc.), Kafka.
+ Performance analysis, troubleshooting, and issue resolution, plus exposure to the latest Cloudera offerings such as Ozone and Iceberg.
+ Intermediate level experience in Applications Development role
+ Consistently demonstrates clear and concise written and verbal communication
+ Demonstrated problem-solving and decision-making skills
+ Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements
**Education:**
+ Bachelor's degree/University degree or equivalent experience
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.
Big Data Engineer

Posted 8 days ago
Job Description
**Responsibilities:**
+ Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas to identify and define necessary system enhancements
+ Identify and analyze issues, make recommendations, and implement solutions
+ Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
+ Analyze information and make evaluative judgements to recommend solutions and improvements
+ Conduct testing and debugging, utilize script tools, and write basic code for design specifications
+ Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
+ Develop working knowledge of Citi's information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
+ Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
**Qualifications:**
+ 3 to 5 years of relevant experience
+ Experience in programming/debugging used in business applications
+ Working knowledge of industry practice and standards
+ Comprehensive knowledge of specific business area for application development
+ Working knowledge of program languages
+ Consistently demonstrates clear and concise written and verbal communication
**Education:**
+ Bachelor's degree/University degree or equivalent experience
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Additional Job Description
We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.
Responsibilities
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities
- Implementing data wrangling, scraping, and cleaning using Java or Python
- Strong experience with data structures
Skills and Qualifications
- Proficient understanding of distributed computing principles
- Proficient in Java or Python, with some machine-learning experience
- Proficiency with Hadoop v2, MapReduce, HDFS, PySpark, Spark
- Experience with building stream-processing systems, using solutions such as Storm or Spark-Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with Spark
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Experience with Cloudera/MapR/Hortonworks
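Several of the skills above (MapReduce, distributed computing principles) follow one model: map emits key-value pairs, a shuffle groups them by key, and reduce folds each group. A minimal single-machine sketch in pure Python, with illustrative names and no Hadoop required:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Emit a (word, 1) pair for every word in one input record.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Group values by key, as the framework's shuffle stage would
    # do across the network in a real cluster.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Fold each group of values into a single result per key.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big pipelines", "data pipelines at scale"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(d) for d in docs)))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2, 'at': 1, 'scale': 1}
```

Real frameworks (Hadoop, Spark) distribute the same three phases across machines; the expensive part is the shuffle, which is why interviewers probe it.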
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
Big Data Developer
Posted today
Job Description
We are seeking a Big Data Engineer with strong hands-on experience in Spark and AWS technologies. The ideal candidate should demonstrate a deep understanding of big data concepts, programming fundamentals, and the ability to solve complex problems related to scalability, failure handling, and optimization.
Key Responsibilities:
- Design, develop, and optimize big data pipelines using Spark on AWS.
- Implement scalable and fault-tolerant data processing solutions.
- Troubleshoot and resolve performance bottlenecks in big data workflows.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Write clean, efficient, and well-documented code following core programming principles.
- Continuously improve existing data infrastructure for better reliability and performance.
Required Skills & Experience:
- Strong practical experience with Apache Spark and big data ecosystems.
- Hands-on experience with AWS services relevant to big data (e.g., EMR, S3, Lambda).
- Solid understanding of core programming fundamentals, including Object-Oriented Programming (OOP) concepts.
- Proven problem-solving skills related to scaling, failure handling, and performance optimization in big data environments.
- Ability to explain not just what technologies are used, but why and how they work.
- Familiarity with common big data terms and best practices.
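The "failure handling" expectation above can be illustrated with a framework-free sketch of task retries, the idea behind Spark recomputing a failed partition instead of failing the whole job. Pure Python; the flaky task and all names here are illustrative, not any framework's API:

```python
import time

def run_with_retries(task, partition, max_attempts=3, backoff_s=0.0):
    # Re-run a failed task on its partition, as a scheduler would
    # after a transient failure (e.g. a lost executor).
    for attempt in range(1, max_attempts + 1):
        try:
            return task(partition)
        except Exception:
            if attempt == max_attempts:
                raise  # give up: surfaces as a job failure
            time.sleep(backoff_s * attempt)  # backoff between retries

def flaky_sum(partition, _fail_once={"done": False}):
    # Simulated transient failure on the very first call only.
    if not _fail_once["done"]:
        _fail_once["done"] = True
        raise RuntimeError("simulated executor loss")
    return sum(partition)

partitions = [[1, 2, 3], [4, 5]]
totals = [run_with_retries(flaky_sum, p) for p in partitions]
print(sum(totals))  # 15
```

The design point worth articulating in an interview: retries are only safe because the task is deterministic and side-effect-free, which is the same property that makes Spark's lineage-based recomputation work.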
Big Data Developer
Posted today
Job Description
Job Title: Big Data Engineer
Location: Pune (Hybrid)
Experience: 5-7 years
Salary Range: ₹35–40 LPA
About the Role
We are looking for a Big Data Engineer to design, build, and scale high-performance data pipelines that power our products and insights. You’ll be working closely with product managers, architects, and engineering leads to define technical strategies and ensure the availability, reliability, and quality of data across the organization. This is a high-impact individual contributor role with ownership of critical data components and the opportunity to shape our evolving data platform.
Key Responsibilities
- Design, build, and maintain robust data pipelines (batch and streaming) from diverse data sources.
- Ensure high data quality, reliability, and availability throughout the pipeline lifecycle.
- Collaborate with cross-functional teams to define technical strategy and deliver data-driven solutions.
- Participate in code reviews, testing, and deployment to uphold engineering best practices.
- Own and manage smaller components of the data platform with end-to-end responsibility.
- Identify and resolve performance bottlenecks to optimize data pipelines.
- Proactively explore and adopt new technologies, contributing as a senior individual contributor across multiple products and features.
Required Qualifications
- 5–7 years of experience in Big Data or Data Engineering roles.
- Strong programming skills in Java or Scala (Python acceptable with solid Big Data experience).
- Hands-on experience with distributed processing and streaming frameworks such as Apache Spark, Kafka, and Flink.
- Experience with orchestration tools like Airflow (or equivalent).
- Familiarity with cloud platforms (AWS, GCP, Azure) and services such as S3, Glue, BigQuery, EMR.
- Solid understanding of data structures, algorithms, and object-oriented programming.
- Proven ability to write clean, efficient, and maintainable code.
Tooling & Ecosystem
- Proficiency with version control (e.g., Git) and CI/CD tools.
- Experience with data orchestration frameworks (Airflow, Dagster, etc.).
- Understanding of common file formats: Parquet, Avro, ORC, JSON.
- Basic exposure to containerization (Docker) and infrastructure-as-code (Terraform a plus).
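The file formats listed above differ mainly in layout: JSON lines store whole rows, while Parquet and ORC store columns, so a query reads only the fields it needs. A pure-Python illustration of that distinction, with made-up data (real formats add compression, encodings, and metadata):

```python
# Row-oriented records, as in JSON lines: one dict per record.
rows = [
    {"user": "a", "clicks": 3, "country": "IN"},
    {"user": "b", "clicks": 5, "country": "IN"},
    {"user": "c", "clicks": 2, "country": "US"},
]

# Pivot into a column-oriented layout: one list per field,
# the core idea behind Parquet/ORC.
columns = {key: [row[key] for row in rows] for key in rows[0]}

# Scanning one metric now touches a single contiguous list
# instead of every full record.
total_clicks = sum(columns["clicks"])
print(total_clicks)  # 10
```

Columnar layout also compresses better, since values of one type and similar range sit next to each other, which is why analytics pipelines favor Parquet/ORC over row formats.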