1372 Data Professionals jobs in Delhi
Data Engineer
Posted today
Job Description
We’re Hiring: Data Engineer
Work Type: 100% Remote
Experience: 6+ Years
Notice Period: Immediate joiners preferred; up to 15 days acceptable
Summary
Seeking a Data Engineer to design, build, and maintain scalable data pipelines that automate ingestion and transformation from multiple data sources, ensuring reliable data flows for analytics and reporting.
Key Responsibilities
- Design, develop, and operate automated data ingestion and transformation pipelines.
- Integrate multiple data sources and deliver data for downstream use.
- Implement pipeline observability, quality checks, and optimization to improve reliability and reduce manual effort.
- Collaborate with analytics teams to ensure pipelines support automated reporting.
- Recommend process automation and data best practices across the lifecycle.
Must Have Skills
- Proven experience with AWS Glue, Matillion, PySpark, Databricks, and SQL for pipeline design, development, and maintenance.
- Experience integrating data from diverse systems and services.
- Strong data modeling and performance tuning for scalable ETL/ELT workloads (aligned to automated ingestion/transformation goals).
Why Join Us?
- 100% remote work flexibility.
- Opportunity to work on cutting-edge data engineering projects.
- Collaborative and innovative work culture.
- Compensation: Competitive and commensurate with skills and experience — no limits for exceptional talent.
- Apply now and be part of a team that’s redefining data-driven solutions.
Interested candidates may share their resumes at
Data Engineer
Posted today
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realise your potential amongst cutting edge leaders, and organisations shaping the future of the region, and indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
As a member of the Operations Transformations team you will embark on an exciting and fulfilling journey with a group of intelligent and innovative globally aware individuals.
We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.
Your work profile
Job Title: Data Engineer
Experience: 3+ Years
Skills
- Design, develop, and maintain efficient and scalable ETL/ELT data pipelines using Python or PySpark.
- Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into technical solutions.
- Perform data cleansing, transformation, and validation to ensure data quality and integrity.
- Optimize and troubleshoot performance issues in data processing jobs.
- Implement data integration solutions for various sources including databases, APIs, and file systems.
- Participate in code reviews, testing, and deployment processes.
- Maintain proper documentation for data workflows, systems, and best practices.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3 to 5 years of hands-on experience as a Data Developer.
- Proficient in Python and/or PySpark for data processing.
- Experience working with big data platforms such as Hadoop, Spark, or Databricks.
- Strong understanding of relational databases and SQL.
- Familiarity with data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery) is a plus.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is an advantage.
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Engineer
Posted today
Job Description
About Kuoni Tumlare
At Kuoni Tumlare , we deliver truly inspiring and innovative solutions and experiences that create value both for our Partners and Society at large. Our wide portfolio of products and solutions is built on 100+ years of destination management experience.
Our solutions include series tours, technical visits, educational tours, Japan specialist travel consulting, as well as meetings, incentives, conferences, and exhibitions. Our product portfolio includes MyBus excursions at destinations as well as guaranteed departure tours devised and delivered by our Seat-in-Coach specialists, Europamundo (EMV) and MyBus Landcruise.
We cater to a wide range of customer needs in close collaboration with our trusted suppliers and powered by our team of destinations experts - enabling us to make a real difference to the world.
About the Business / Function
Proudly part of Kuoni Tumlare, TUMLARE SOFTWARE SERVICES (P) LTD. is a multinational technology support company that has served as a trusted technology partner for businesses since 1999. We also help established brands reimagine their business through digitalization.
We are looking for a Senior Data Engineer with expertise in data modelling, data warehousing, data integration, and data analytics. This position will include supporting and enhancing existing data integration processes. The role involves architecting how structured and unstructured data will be stored, consumed, integrated, and reported by different systems across internal and external data sources. The Senior Data Engineer will work towards building an enterprise model, a central dictionary of common business vocabulary, and defining the approach and principles for data quality management and data integration. Familiarity with Agile methodology and processes is also required.
Job Description:
- Collaborate with the team and product managers to design and build data solutions, identifying potential issues in a timely manner.
- Engineer and implement extract, transform, and load (ETL) processes using Talend and PL/SQL.
- Define the strategy and architecture required to integrate data across multiple systems and improve the existing data warehouse architecture.
- Implement and optimize database design to support performance, scaling, and security requirements.
- Use industry and domain best practices and methodologies to implement Enterprise Data warehouse and marts.
- Create and enforce technology-specific guidelines, standards, policies, and procedures.
- Work closely with application developers and data analysts to design and optimize data access, query, reporting, and analysis strategies.
- Migrate jobs from Talend Open Studio to Talend Cloud, troubleshooting and supporting issues during the migration.
- Maintain and troubleshoot existing jobs for performance.
- Write and maintain technical documentation around the jobs and standards.
Job Requirements:
- At least 5 years of experience developing ETL or ELT solutions.
- Strong expertise in data modelling, data warehousing, data integration, and data analytics.
- Proficiency in Talend and PL/SQL.
- Experience with Agile methodology and processes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
Preferred Certifications:
- Certified Analytics Professional (CAP)
- Certified Data Management Professional (CDMP)
- Google Data Analytics Professional Certificate
- Associate Certified Analytics Professional (aCAP)
- Candidate should be based in Delhi NCR
- Availability to join - 0 to 15 days
We Are Looking for a Person Who Is:
- A team player, willing to get involved in broader issues, with a key focus on solving the requirements.
- A collaborative self-starter with hands-on experience and a can-do attitude.
- Pragmatic, with the ability to address and solve challenges within a dynamic global environment.
- Focused on accuracy and detail while working towards multiple deadlines.
- Open-minded, with a positive attitude, yet willing to critically challenge existing processes and practices.
- A disciplined thinker and analytical problem solver who has the capacity to manage complex issues and develop effective solutions in a timely fashion.
What We Offer:
- Working in one of the world’s leading multinational companies.
- Probation period - only 3 months.
- Annual Bonus – as per company policy.
- Long Service Award.
- Paid leave for your Birthday and Wedding/Work Anniversary.
- Learning Opportunity through an online learning platform with rich training courses and resources.
- Company Sponsored IT Certification - as per company policy
- Following insurance from Date of Joining:
- Group Medical Insurance with Sum Insured of up to 5 Lakh
- Term life Insurance - 3 times of your CTC
- Accidental Insurance - 3 times of your CTC
- Employee Engagement Activities:
- Fun Friday per week
- Annual Off-Site Team Building
- End Year Party
- CSR programs
- Global Employee Engagement Events
If you match the requirements, are excited about what we offer, and are interested in a new challenge, we look forward to receiving your full application.
Job Location - Pitampura, Delhi. 5 days working.
Data Engineer
Posted today
Job Description
We're looking for a hands-on Data Engineer to manage and scale our data scraping pipelines across 60+ websites. The job involves handling OCR-processed PDFs, ensuring data quality, and building robust, self-healing workflows that fuel AI-driven insights.
You'll Work On:
Managing and optimizing Airflow scraping DAGs
Implementing validation checks, retry logic & error alerts
Cleaning and normalizing OCR text (Tesseract / AWS Textract)
Handling deduplication, formatting, and missing data
Maintaining MySQL/PostgreSQL data integrity
Collaborating with ML engineers on downstream pipelines
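The OCR cleanup and deduplication work listed above can be sketched in plain Python. This is a minimal illustration, not this team's actual pipeline; the function names and the `key` field are hypothetical, and real Tesseract/Textract output would need source-specific rules on top:

```python
import re
import unicodedata

def normalize_ocr_text(raw: str) -> str:
    """Clean a block of OCR output: unicode forms, hyphenation, whitespace."""
    # Normalize unicode (ligatures, full-width characters) to a consistent form
    text = unicodedata.normalize("NFKC", raw)
    # Join words hyphenated across line breaks, e.g. "pipe-\nline" -> "pipeline"
    text = re.sub(r"(\w)-\n(\w)", r"\1\2", text)
    # Collapse runs of whitespace (OCR often inserts stray newlines and spaces)
    text = re.sub(r"\s+", " ", text).strip()
    return text

def deduplicate_records(records: list[dict], key: str) -> list[dict]:
    """Drop records whose `key` field repeats, keeping the first occurrence."""
    seen, unique = set(), []
    for rec in records:
        value = normalize_ocr_text(str(rec.get(key, ""))).lower()
        if value and value not in seen:
            seen.add(value)
            unique.append(rec)
    return unique
```

Keeping normalization and deduplication as separate, pure functions makes each step easy to unit-test before wiring it into an Airflow task.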
What You Bring:
2–5 years of hands-on experience in Python data engineering
Experience with Airflow, Pandas, and OCR tools
Solid SQL skills and schema design (MySQL/PostgreSQL)
Comfort with CSVs and building ETL pipelines
Required:
Scrapy or Selenium experience
CAPTCHA handling
Experience in PyMuPDF, Regex
AWS S3
LangChain, LLM, Fast API
Streamlit
Matplotlib
Job Type: Full-time
Day shift
Work Location: In person
Pay: ₹70, ₹150,000.00 per month
Application Question(s):
Total years of experience in web scraping / data extraction
Have you worked with large-scale data pipelines?
Are you proficient in writing complex Regex patterns for data extraction and cleaning?
Have you implemented or managed data pipelines using tools like Apache Airflow?
Years of experience with PDF Parsing and using OCR tools (e.g., Tesseract, Google Document AI, AWS Textract, etc.)
Years of experience handling complex PDF tables with merged rows, rotated layouts, or inconsistent formatting
Are you willing to relocate to Delhi if selected?
Current CTC
Expected CTC
Data Engineer
Posted today
Job Description
Who is ERM?
ERM is a leading global sustainability consulting firm, committed for nearly 50 years to helping organizations navigate complex environmental, social, and governance (ESG) challenges. We bring together a diverse and inclusive community of experts across regions and disciplines, providing a truly multicultural environment that fosters collaboration, professional growth, and meaningful global exposure. As a people-first organization, ERM values well-being, career development, and the power of collective expertise to drive sustainable impact for our clients—and the planet.
Introducing our new Global Delivery Centre (GDC)
Our Global Delivery Centre (GDC) in India is a unified platform designed to deliver high-value services and solutions to ERM's global clientele. By centralizing key business and consulting functions, we streamline operations, optimize service delivery, and enable our teams to focus on what matters most—advising clients on sustainability challenges with agility and innovation. Through the GDC, you will collaborate with international teams, leverage emerging technologies, and further enhance ERM's commitment to excellence—amplifying our shared mission to make a lasting, positive impact.
Job Objective
The objective of the Data Engineer role is to help our consultants upskill in the use of new technology and tools, especially AI tools. The Data Engineer will work with project teams to demonstrate and showcase capabilities using ERM's technology and AI tools, and impart that knowledge to the project teams so they can undertake these tasks themselves. Additionally, the role involves creating tools and resources for project teams to independently achieve their goals. The overall objective is to facilitate the growth of capabilities within our consulting ranks.
Key Accountabilities & Responsibilities
- Work directly with project teams to solve their problems using available tools, ensuring that project teams can independently and autonomously solve these problems in the future.
- Identify generic forms of requests to build tools and applications that consultants can use off the shelf for similar purposes in the future.
- Understand requirements and own the execution of tasks while collaborating across the business.
- Demonstrate and showcase capabilities using ERM's technology and AI tools, such as Microsoft Fabric and Copilot.
- Create tools and resources to support project teams in achieving their goals independently.
Influence And Decision Making Authority
- Influence: The Data Engineer will have significant influence over the adoption and implementation of new technologies and AI tools within project teams. They will guide and mentor consultants, helping them to upskill and become proficient in using these tools independently.
- Decision Making Authority: The Data Engineer will have the authority to make decisions regarding the design and development of tools and resources that will be used by project teams. They will also be responsible for identifying common problems and creating reusable solutions that can be applied across different projects. Additionally, they will collaborate with various departments to ensure that the tools and solutions developed align with the overall business strategy and objectives.
Job Requirements & Capabilities
Qualifications:
- Bachelor's degree in science, engineering, or mathematics.
Job specific capabilities/skills:
- Skills in data engineering and working with databases.
- Experience in data science and coding in Python is favorable.
- Experience with Microsoft Fabric is highly regarded.
- Experience with large language models (LLMs) or other natural language processing tools.
- Ability to work with non-technical specialists to upskill or train them.
- Strong communication skills and the ability to articulate complex scenarios effectively.
- Ability to work in a complex, global, dynamic organization and be effective within matrixed reporting environments and multi-partner contexts.
- Problem-solving skills and the ability to make decisions by assessing situations and selecting appropriate courses of action.
Data Engineer
Posted today
Job Description
Responsibilities:
* Design, develop & maintain data pipelines using Azure Data Factory, Databricks & Synapse.
* Collaborate with cross-functional teams on ETL processes with Python, SQL & Spark.
Provident fund
Work from home
Data Engineer
Posted today
Job Description
Job Description:
We are looking for a skilled Data Engineer with strong expertise in data integration, ETL pipelines, and cloud infrastructure. The ideal candidate will be proficient in SQL, Python, and MongoDB, with hands-on experience in building scalable data pipelines and working across multiple databases. The role requires a platform-agnostic mindset with exposure to AWS services, messaging systems, and monitoring tools. The selected candidate will be working at our client site in Delhi, and this is a Work From Office (WFO) opportunity.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines and database schemas to support business and analytics needs.
- Work with multi-database architectures (SQL, NoSQL, MongoDB), ensuring scalability and efficiency.
- Deploy and manage AWS resources such as Lambda functions and EC2 instances.
- Integrate and optimize streaming/messaging frameworks such as Kafka and caching systems like Redis.
- Collaborate with cross-functional teams to ensure seamless data flow across platforms.
- Monitor infrastructure and system performance using tools such as Grafana, CloudWatch, or equivalent monitoring solutions.
- Ensure data quality, security, and compliance standards are consistently maintained.
Required Skills & Experience:
- Strong programming experience in SQL, Python, and MongoDB.
- Proven experience in building and managing ETL pipelines.
- Ability to work in a platform-agnostic environment.
- Hands-on experience with AWS services (Lambda, EC2).
- Exposure to Kafka / Redis.
- Experience with monitoring tools (Grafana, CloudWatch, etc.).
- Strong problem-solving skills and ability to work in a fast-paced environment.
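Several of the postings on this page, including this one, describe the same extract-transform-load shape. A minimal sketch: the `orders` table, its fields, and the in-memory SQLite target are hypothetical stand-ins for a production source and warehouse, not any employer's actual stack:

```python
import sqlite3

def extract() -> list[dict]:
    # Stand-in for reading from an API, queue, or source database
    return [
        {"order_id": 1, "amount": "250.00", "country": "in"},
        {"order_id": 2, "amount": "99.50", "country": "us"},
    ]

def transform(rows: list[dict]) -> list[tuple]:
    # Cast types and normalize values before loading
    return [(r["order_id"], float(r["amount"]), r["country"].upper()) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    # Idempotent upsert so re-running the pipeline does not duplicate rows
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

The `INSERT OR REPLACE` keyed on `order_id` means the pipeline can be re-run safely after a partial failure, which is the usual expectation for scheduled ETL jobs.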
Data Engineer
Posted today
Job Description
Company Description
Hidden Road is the global credit network for institutions, enabling seamless access to both traditional and digital markets. Our conflict-free model built on a modern technology stack helps eliminate complexity and reduce costs in prime brokerage, clearing, and financing. For more information, please email
Role Description
This is a full-time, on-site role for a Data Engineer located in New Delhi. The Data Engineer will be responsible for designing, developing, and maintaining data pipelines, data modeling, implementing ETL processes, and managing data warehousing solutions. The role involves working closely with data analytics teams to ensure data is accurate and available for analytical and operational uses.
Qualifications
- Experience in Data Engineering and Data Modeling
- Proficiency in Extract Transform Load (ETL) processes and Data Warehousing
- Strong skills in Data Analytics
- Ability to work well in a collaborative, team environment
- Excellent problem-solving and communication skills
- Experience with big data technologies is a plus
- Bachelor's degree in Computer Science, Engineering, or related field