What Jobs are available for Data Engineer in Delhi?

Showing 15 Data Engineer jobs in Delhi

Data Engineer

Delhi, Delhi Tata Consultancy Services

Posted 8 days ago


Job Description

Required Information

Role

Microsoft Azure Data Engineer

Required Technical Skill Set

SQL, ADF, ADB, ETL/Data background


Desired Experience Range - 4 years

Location of Requirement

India


Desired Competencies (Technical/Behavioral Competency)

Must-Have

(Ideally should not be more than 3-5)

Strong hands-on experience with Azure Data Factory (ADF), Azure Databricks, ADLS, SQL, and ETL/ELT pipelines – building, orchestrating, and optimizing data pipelines. DevOps and version control (Git).

Good-to-Have

Water industry domain knowledge


Responsibility of / Expectations from the Role

1. Deliver clean, reliable and scalable data pipelines (a minimal sketch follows this list)

2. Ensure data availability and quality

3. Excellent communication and documentation abilities

4. Strong analytical skills
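As a rough illustration of the must-have stack and pipeline responsibilities above, a minimal PySpark sketch of a batch pipeline from ADLS into a curated table; the storage account, container, column and table names are hypothetical, not part of this posting:

```python
# Minimal illustrative PySpark batch job: read raw CSVs from ADLS Gen2,
# apply a light transformation, and write a curated table.
# Storage account, container, column and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls_to_curated").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"

df = (
    spark.read
    .option("header", "true")
    .csv(raw_path)
    .withColumn("ingest_date", F.current_date())
    .dropDuplicates(["order_id"])        # hypothetical business key
)

# On Databricks the default table format is Delta; the schema name is assumed.
spark.sql("CREATE SCHEMA IF NOT EXISTS curated")
df.write.mode("overwrite").saveAsTable("curated.sales_orders")
```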


Principal Data Engineer

New Delhi, Delhi Autodesk

Posted 5 days ago


Job Description

**Job Requisition ID #**
25WD91911
**Position Overview**
Autodesk is seeking a Principal Data Engineer to lead the design and development of data architecture and pipelines for our data team. In this role, you will shape the future of our data ecosystem, driving innovation across data pipelines, architecture, and cloud platforms. You'll partner closely with analysts, data scientists, AI/ML engineers and product teams to deliver scalable solutions that power insights and decision-making across the company.
This is an exciting opportunity for a principal data engineer who thrives on solving complex problems, driving best practices, and mentoring high-performing teams.
**Responsibilities**
+ Lead and mentor a team of data engineers responsible for building and maintaining scalable data pipelines and infrastructure on AWS, Snowflake and Azure
+ Architect and implement end-to-end data pipeline solutions, ensuring high performance, resilience, and cost efficiency across both batch and real-time data flows
+ Define and drive the long-term vision for data engineering in alignment with Autodesk's data platform strategy and analytics roadmap
+ Collaborate with analysts, data scientists, FinOps engineers, and product/engineering teams to translate business needs into reliable, scalable data solutions
+ Establish and enforce standards for data quality, governance, observability, and operational excellence, defining "what good looks like" across the data lifecycle
+ Design and optimize data models, ELT/ETL processes, and data architectures to support analytics, BI, and machine learning workloads
+ Drive best practices in CI/CD, testing frameworks, and data pipeline deployment
+ Leverage modern data integration tools such as Fivetran, Nexla and Airflow to orchestrate batch ingestion and transformation workflows (see the sketch after this list)
+ Apply AI-driven approaches for anomaly detection, pipeline optimization, and automation
+ Stay current with emerging trends in data engineering and proactively evolve the team's capabilities and toolset
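By way of illustration of the orchestration responsibilities above (Airflow-driven batch ingestion and transformation), a minimal Airflow DAG sketch; the DAG id and task bodies are placeholders, not Autodesk's actual pipelines:

```python
# Minimal illustrative Airflow DAG: one ingestion task feeding one
# transformation task on a daily schedule. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_batch(**context):
    # e.g. pull the day's files from a landing bucket
    print("ingesting batch for", context["ds"])


def transform_batch(**context):
    # e.g. trigger a Spark or dbt job over the ingested data
    print("transforming batch for", context["ds"])


with DAG(
    dag_id="daily_batch_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule_interval` on older Airflow 2.x releases
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_batch)
    transform = PythonOperator(task_id="transform", python_callable=transform_batch)
    ingest >> transform
```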
**Minimum Qualifications**
+ 10+ years of experience in data engineering, with at least 3 years in a lead role
+ Demonstrated success in delivering large-scale, enterprise-grade data pipeline architectures and leading technical teams
+ Expertise with cloud data platforms; AWS and Azure experience is a strong plus
+ Proficiency in SQL, Python, and modern data modeling practices
+ Hands-on experience with batch and streaming frameworks (e.g., Spark, Kafka, Kinesis, Hadoop)
+ Proven track record of building and maintaining real-time and batch data pipelines at scale
+ Deep understanding of ETL and ELT paradigms, including traditional ETL and modern ELT tools
+ Experience with data integration tools (Fivetran, Nexla, etc.) and orchestration platforms
+ Familiarity with Data Lakehouse architectures, data mesh concepts, and hybrid/multi-cloud strategies
+ Strong communication, leadership, and stakeholder management skills
+ Ability to drive scalable architecture decisions through platform systems design and modern engineering patterns
#LI-NB1
**Learn More**
**About Autodesk**
Welcome to Autodesk! Amazing things are created every day with our software - from the greenest buildings and cleanest cars to the smartest factories and biggest hit movies. We help innovators turn their ideas into reality, transforming not only how things are made, but what can be made.
We take great pride in our culture here at Autodesk - it's at the core of everything we do. Our culture guides the way we work and treat each other, informs how we connect with customers and partners, and defines how we show up in the world.
When you're an Autodesker, you can do meaningful work that helps build a better world designed and made for all. Ready to shape the world and your future? Join us!
**Salary transparency**
Salary is one part of Autodesk's competitive compensation package. Offers are based on the candidate's experience and geographic location. In addition to base salaries, our compensation package may include annual cash bonuses, commissions for sales roles, stock grants, and a comprehensive benefits package.
**Diversity & Belonging**
We take pride in cultivating a culture of belonging where everyone can thrive. Learn more here.
**Are you an existing contractor or consultant with Autodesk?**
Please search for open jobs and apply internally (not on this external site).

AWS Data Engineer

Delhi, Delhi Tata Consultancy Services

Posted today


Job Description

Role - AWS Data Engineer

Technical Skill Set - AWS data engineering with strong experience in Python

Experience Range - 6 to 8 years


Technical/Behavioral Competency


1. Proficient in Python, with experience in deploying Python packages and OOP; experience in ingesting data from different data sources (APIs, web scraping, flat files, databases) – a small ingestion sketch follows this list.

2. Hands-on experience in designing ETL processes, utilizing various architectures, orchestration tools (e.g. Airflow), and data quality testing.

3. Experience with Snowflake, dbt and data warehousing; experience with AWS cloud services, including Lambda, S3, DynamoDB and Glue.

4. Python for data engineering and strong SQL development skills.

5. Proven track record of software development. Ability to maintain CI/CD processes and drive continuous improvements.
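As a rough illustration of items 1 and 3 above (Python ingestion from an API into AWS), a minimal sketch that lands API data in S3 with requests and boto3; the endpoint, bucket and key layout are hypothetical:

```python
# Minimal illustrative ingestion script: pull JSON from a REST API and land it
# in S3 as a date-partitioned object. Endpoint and bucket names are hypothetical.
import datetime
import json

import boto3
import requests

API_URL = "https://api.example.com/v1/orders"   # hypothetical source API
BUCKET = "example-raw-zone"                      # hypothetical landing bucket


def ingest() -> str:
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    today = datetime.date.today().isoformat()
    key = f"orders/dt={today}/orders.json"

    s3 = boto3.client("s3")
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(records).encode("utf-8"))
    return key


if __name__ == "__main__":
    print("wrote", ingest())
```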


Azure Data Engineer

Delhi, Delhi Tata Consultancy Services

Posted 8 days ago


Job Description

Role - Azure Data Engineer

Required Technical Skill Set

Azure Data Factory + Azure Databricks.

Desired Experience Range - 8 to 12 years

Location of Requirement

Mumbai, Bangalore, Hyderabad, Chennai, Gurgaon, Pune.


Desired Competencies (Technical/Behavioral Competency)

Must-Have

Azure Data Factory, Azure Databricks

Good-to-Have

Python/Pyspark


Responsibility of / Expectations from the Role

Development:

1. Implementing highly performant, scalable and re-usable data ingestion and transformation pipelines across Azure components and services

2. Strong experience using Logic Apps, Functions and Data Factory

3. Strong experience of development using Microsoft Azure, as well as experience of developing integrations for large, complex pieces of software

4. Strong understanding of the technical side of CI/CD

5. Strong understanding of Agile

Design:

6. Designing and implementing a data warehouse on Azure using Azure HDInsight, Azure Data Factory, ADLS, Databricks, SQL Server, SQL DWH, Analytics Service, Event Hubs, Key Vault and other Azure services

7. Designing, orchestrating and implementing highly performant, scalable and re-usable data ingestion and transformation pipelines across Azure components and services

8. Designing and implementing event-based and streaming data ingestion and processing using Azure PaaS services (see the streaming sketch after this list)

9. Data ingestion, data engineering and/or data curation using native Azure services or tools available in Azure Marketplace

10. Designing and implementing data governance, data cataloguing and data lineage solutions using tools such as Azure Data Catalog / Informatica Data Catalog

11. Developing physical data models in MongoDB, SQL DWH, etc.

12. Developing APIs

13. Have worked in agile delivery teams with DevOps ways of working to continuously deliver iterative deployments, with experience in using Jira, Git repositories, Confluence, or similar

14. Have had experience in data migration activities, including migration strategy and approach, source and target system discovery, analysis, mapping, development and reconciliation
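For the event-based and streaming design work above (item 8), a minimal Spark Structured Streaming sketch; it assumes an Event Hubs namespace exposing its Kafka-compatible endpoint and a Databricks/Delta environment, and the namespace, hub name, checkpoint path and table names are hypothetical:

```python
# Minimal illustrative Structured Streaming job reading from an Azure Event Hub
# via its Kafka-compatible endpoint and appending to a Delta table.
# Namespace, hub name, connection string and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("eventhub_stream").getOrCreate()

bootstrap = "example-namespace.servicebus.windows.net:9093"
conn_str = "<Event Hubs connection string>"   # store in Key Vault, not in code
jaas = (
    "org.apache.kafka.common.security.plain.PlainLoginModule required "
    f'username="$ConnectionString" password="{conn_str}";'
)

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", bootstrap)
    .option("subscribe", "telemetry")            # the event hub name acts as the topic
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.jaas.config", jaas)
    .load()
    .select(F.col("value").cast("string").alias("payload"),
            F.col("timestamp").alias("event_time"))
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")
    .outputMode("append")
    .toTable("curated.telemetry_events")
)
query.awaitTermination()
```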


Senior Data Engineer

Delhi, Delhi Baazi Games

Posted 8 days ago


Job Description

As a Data Engineer at Baazi Games, you will be focused on delivering data-driven insights to various functional teams enabling them to make strategic decisions that add value to the top or bottom line of the business.


What you will do


● Design, build and own all the components of a high-volume data hub.

● Build efficient data models using industry best practices and metadata for ad hoc and pre-built reporting.

● Interface with business customers, gathering requirements and delivering complete data solutions & reporting.

● Work on solutions owning the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.

● Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.

● Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources.

● Own the functional and non-functional scaling of software systems in your area.

● Provide input and recommendations on technical issues to BI Engineers, Business & Data Analysts, and Data Scientists.


What we are looking for

● 4-7 years of experience in data engineering.

● Strong understanding of ETL concepts and experience building them with large-scale, complex datasets using distributed computing technologies.

● Strong data modelling skills with solid knowledge of various industry standards such as dimensional modelling, star schemas etc.

● Extremely proficient in writing performant SQL working with large data volumes.

● Experience designing and operating very large Datalakes/Data Warehouses

● Experience with scripting for automation (e.g., UNIX Shell scripting, Python).

● Good to have experience working on the AWS stack

● Clear thinker with superb problem-solving skills to prioritize and stay focused on big needle movers.

● Curious, self-motivated & a self-starter with a ‘can-do attitude’. Comfortable working in a fast-paced dynamic environment.


Key technologies

● Must have excellent knowledge of Advanced SQL working with large data sets.

● Must have knowledge of Apache Spark.

● Should be proficient with any of the following languages: Java/Scala/Python.

● Must have knowledge of working with Apache Airflow or Nifi.

● Should be comfortable with any of the MPP querying engines like Impala, Presto or Athena.

● Good to have experience with AWS technologies including Redshift, RDS, S3, EMR, Glue, Athena etc.
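As a small illustration of the Spark plus advanced-SQL combination listed above, a PySpark sketch running a star-schema aggregation with Spark SQL; the fact and dimension table names and columns are hypothetical, not Baazi's actual schema:

```python
# Minimal illustrative star-schema query with Spark SQL: join a fact table to
# date and game dimensions and aggregate. All table/column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_report").getOrCreate()

daily_revenue = spark.sql("""
    SELECT d.calendar_date,
           g.game_name,
           SUM(f.amount)             AS gross_revenue,
           COUNT(DISTINCT f.user_id) AS paying_users
    FROM   fact_transactions f
    JOIN   dim_date d ON f.date_key = d.date_key
    JOIN   dim_game g ON f.game_key = g.game_key
    WHERE  d.calendar_date >= date_sub(current_date(), 30)
    GROUP  BY d.calendar_date, g.game_name
""")

# Assumes a `reporting` schema already exists in the metastore.
daily_revenue.write.mode("overwrite").saveAsTable("reporting.daily_game_revenue")
```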


GCP Data Engineer

Delhi, Delhi Tata Consultancy Services

Posted 8 days ago


Job Description

Job Title : GCP Data Engineer

Job Location – Chennai / Hyderabad / Bangalore / Pune / Gurgaon / Noida / NCR

Experience: 5 to 10 years in the IT industry planning, deploying, and configuring GCP-based solutions.

Requirement:

  • Mandatory to have knowledge of Big Data architecture patterns and experience in delivery of Big Data and Hadoop ecosystems.
  • Strong experience required in GCP; must have delivered multiple large projects with GCP BigQuery and ETL.
  • Experience working in GCP-based Big Data deployments (batch/real-time) leveraging components like BigQuery, Airflow, Google Cloud Storage, Data Fusion, Dataflow, Dataproc, etc.
  • Should have experience in SQL/data warehousing.
  • Expert in programming languages like Java and Scala, and in the Hadoop ecosystem.
  • Expert in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm or Flink.
  • Should have worked on orchestration tools – Oozie, Airflow, Control-M or similar – and Kubernetes.
  • Worked on performance tuning, optimization and data security.

Preferred Experience and Knowledge:

  • Excellent understanding of the data technologies landscape / ecosystem.
  • Good exposure to development with CI/CD pipelines. Knowledge of containerization, orchestration and Kubernetes Engine would be an added advantage.
  • Well versed with the pros and cons of various database technologies such as relational, BigQuery, columnar databases and NoSQL.
  • Exposure to data governance, catalog, lineage and associated tools would be an added advantage.
  • Well versed with SaaS, PaaS and IaaS concepts and able to drive clients to a decision.
  • Good skills in Python and PySpark.

Key word:

GCP, BigQuery, Python, PySpark
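A minimal sketch of the kind of BigQuery work these keywords imply, using the google-cloud-bigquery Python client; the project, dataset and table names are hypothetical:

```python
# Minimal illustrative BigQuery job: aggregate a source table and write the
# result to a destination table. Project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.raw.orders`
    GROUP BY order_date
"""

job_config = bigquery.QueryJobConfig(
    destination="example-project.curated.daily_order_totals",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

job = client.query(sql, job_config=job_config)
job.result()  # block until the query finishes
print("Query finished:", job.job_id)
```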


GCP Data Engineer

Delhi, Delhi LTIMindtree

Posted 8 days ago


ETL Data Engineer

Delhi, Delhi The Techgalore

Posted 26 days ago


Job Description

Remote

Please rate the candidate (from 1 to 5; 1 lowest, 5 highest) in these areas:

  1. Big Data
  2. PySpark
  3. AWS
  4. Redshift

Position Summary

We are looking for experienced ETL Developers and Data Engineers to ingest and analyze data from multiple enterprise sources into Adobe Experience Platform.

 Requirements 

  • About 4-6 years of professional technology experience, mostly focused on the following:
  • 4+ years of experience developing data ingestion pipelines using PySpark (batch and streaming).
  • 4+ years of experience with multiple data engineering related services on AWS, e.g. Glue, Athena, DynamoDB, Kinesis, Kafka, Lambda, Redshift, etc.
  • 1+ years of experience working with Redshift, especially the following:

    o Experience and knowledge of loading data from various sources, e.g. S3 buckets and on-prem data sources, into Redshift.

    o Experience optimizing data ingestion into Redshift.

    o Experience designing, developing and optimizing queries on Redshift using SQL or PySpark SQL.

    o Experience designing tables in Redshift (distribution keys, compression, vacuuming, etc.) – a small sketch follows this list.

  • Experience developing applications that consume services exposed as REST APIs.
  • Experience and ability to write and analyze complex, performant SQL.
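For the Redshift items above (table design with distribution and sort keys, loading from S3, vacuuming), a minimal sketch using psycopg2; the cluster endpoint, schema, bucket and IAM role are hypothetical:

```python
# Minimal illustrative Redshift setup: create a table with a distribution key
# and sort key, then COPY data from S3 and vacuum. Connection details, bucket
# and IAM role ARN below are hypothetical.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS analytics.orders (
    order_id     BIGINT,
    customer_id  BIGINT,
    order_date   DATE,
    amount       DECIMAL(12, 2)
)
DISTKEY (customer_id)
SORTKEY (order_date);
"""

COPY = """
COPY analytics.orders
FROM 's3://example-landing-bucket/orders/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="etl_user", password="***",
)
conn.autocommit = True          # VACUUM cannot run inside a transaction block
with conn.cursor() as cur:
    cur.execute(DDL)
    cur.execute(COPY)
    cur.execute("VACUUM analytics.orders;")   # re-sort and reclaim space after loads
conn.close()
```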

Special consideration given for:

  • 2 years of developing and supporting ETL pipelines using enterprise-grade ETL tools like Pentaho, Informatica, Talend
  • Good knowledge of data modelling (design patterns and best practices)
  • Experience with reporting technologies (e.g. Tableau, Power BI)

What you'll do

  • Analyze and understand customers' use cases and data sources; extract, transform and load data from a multitude of customers' enterprise sources and ingest it into Adobe Experience Platform.

  • Design and build data ingestion pipelines into the platform using PySpark.

  • Ensure ingestion is designed and implemented in a performant manner to support the throughput and latency needed.

  • Develop and test complex SQL to extract, analyze and report on the data ingested into the Adobe Experience Platform.

  • Ensure the SQL is implemented in compliance with best practices so that it is performant.

  • Migrate platform configurations, including the data ingestion pipelines and SQL, across various sandboxes.

  • Debug any issues reported on data ingestion, SQL or any other functionality of the platform and resolve them.

  • Support Data Architects in implementing the data model in the platform.

  • Contribute to the innovation charter and develop intellectual property for the organization.

  • Present on advanced features and complex use case implementations at multiple forums.

  • Attend regular scrum events or equivalent and provide updates on the deliverables.

  • Work independently across multiple engagements with no or minimal supervision.




Spark / Scala Data Engineer

Delhi, Delhi Tata Consultancy Services

Posted 8 days ago


Job Description

Role - Spark / Scala Data Engineer

Experience - 8 to 10 yrs

Location - Bangalore/Chennai/Hyderabad/Delhi/Pune


Must Have
- Big Data Hadoop - solid experience with Hive and Spark/Scala
- Advanced SQL knowledge - able to test changes and issues properly, replicating the code functionality in SQL
- Worked with code repositories such as Git, Maven, etc.
- DevOps knowledge (Jenkins, scripts, etc.) - tools used for deploying software into environments; use of Jira

Good to have
- Analyst skills - able to translate technical requirements for non-technical partners and deliver clear solutions; able to create test case scenarios
- Solid Control-M experience - able to create jobs and modify parameters
- Documentation - experience of carrying out data and process analysis to create specification documents
- Finance knowledge - experience working in a Financial Services / Banking organization, with an understanding of Retail, Business and Corporate Banking
- AWS knowledge
- Unix / Linux


Sr. Azure Data Engineer

Delhi, Delhi ALIQAN Technologies

Posted 12 days ago


Job Description

Remote, full-time
Job Overview:

We are urgently looking for a highly skilled and experienced Senior Azure Data Engineer to join our team. The ideal candidate must have a solid background in Azure Data Factory (ADF), Databricks, PySpark, and strong hands-on experience with modern data tools and platforms. This is a client-facing role requiring excellent communication skills, flexibility, and strong problem-solving capabilities.

Key Responsibilities:
  • Design, build, and maintain scalable data pipelines and ETL processes using Azure Data Factory and Databricks
  • Develop and optimize PySpark jobs for large-scale data processing
  • Integrate various data sources into a unified data lake/warehouse environment
  • Implement data governance practices using Unity Catalog and manage metadata effectively
  • Work closely with stakeholders to understand business requirements and translate them into technical solutions
  • Monitor, troubleshoot, and improve existing data workflows and performance
  • Collaborate with BI developers to support Power BI reporting and dashboards
  • Utilize Azure DevOps (ADO) for CI/CD pipelines and version control
Must-Have Skills:
  • Azure Data Factory (ADF) - Advanced knowledge
  • Azure Platform - End-to-end data engineering experience
  • Databricks & PySpark - Strong development experience
  • SQL - Excellent query optimization and data manipulation skills
  • Azure DevOps (ADO) - Familiarity with CI/CD practices
  • Power BI - Ability to support data modeling and integration
  • Unity Catalog - Hands-on experience in metadata and data governance (see the sketch after this list)
  • Excellent communication skills - Verbal and written; must be client-facing ready
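As a small illustration of the Databricks, PySpark and Unity Catalog skills above, a sketch that writes a managed Delta table under a three-level Unity Catalog namespace; it assumes a Databricks cluster with Unity Catalog enabled, and the catalog, schema and storage paths are hypothetical:

```python
# Minimal illustrative Databricks/Unity Catalog sketch: read raw files, write a
# managed Delta table under a catalog.schema.table namespace, and add a table
# comment for discoverability. Catalog, schema and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()   # provided automatically on Databricks

raw = (
    spark.read.format("json")
    .load("abfss://landing@examplelake.dfs.core.windows.net/customers/")
    .withColumn("load_ts", F.current_timestamp())
)

spark.sql("CREATE SCHEMA IF NOT EXISTS main.crm")
raw.write.mode("overwrite").saveAsTable("main.crm.customers")

# Table metadata lives in Unity Catalog, so comments and ownership are governed.
spark.sql("COMMENT ON TABLE main.crm.customers IS 'Raw CRM customers, loaded daily'")
```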
Preferred Qualifications:
  • Azure Data Engineer or Azure Architect Certification
  • Previous experience in Agile environments
  • Exposure to data security and compliance practices in Azure
  • Strong problem-solving and analytical thinking

