21 Data Pipelines jobs in Delhi

Data Integration & Modeling Specialist

New Delhi, Delhi ALIQAN Technologies

Posted today


Job Description

Job Title: Data Integration & Modeling Specialist

Job Type: Contract

Location: Remote

Duration: 6 Months



Job Summary:

We are seeking a highly skilled Data Integration & Modeling Specialist with hands-on experience in developing common metamodels, defining integration specifications, and working with semantic web technologies and various data formats. The ideal candidate will bring deep technical expertise and a collaborative mindset to support enterprise-level data integration and standardization initiatives.


Key Responsibilities:

Develop common metamodels by integrating requirements across diverse systems and organizations.


Define integration specifications, establish data standards, and develop logical and physical data models.


Collaborate with stakeholders to align data architectures with organizational needs and industry best practices.


Implement and govern semantic data solutions using RDF and SPARQL.
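
As a hedged illustration of this kind of work (not drawn from the employer's actual stack), the short Python sketch below uses the open-source rdflib library and a hypothetical example.org vocabulary to build a small RDF graph and answer a SPARQL query over it:

    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/")  # hypothetical vocabulary
    g = Graph()
    g.add((EX.alice, RDF.type, EX.Person))
    g.add((EX.alice, EX.worksFor, EX.acme))
    g.add((EX.alice, EX.name, Literal("Alice")))

    # SPARQL: find the names of people who work for acme.
    results = g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?name WHERE {
            ?p a ex:Person ;
               ex:worksFor ex:acme ;
               ex:name ?name .
        }
    """)
    for row in results:
        print(row.name)  # -> Alice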


Perform data transformations and scripting using TCL, Python, and Java.


Work with multiple data formats including FRL, VRL, HRL, XML, and JSON to support integration and processing pipelines.
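
Of these formats, fixed record layouts are the least self-describing, so a brief sketch may help. The field offsets below are hypothetical, but the slicing pattern is the standard way to turn an FRL record into JSON in Python:

    import json

    # Hypothetical FRL definition: (field name, start column, end column).
    LAYOUT = [("record_id", 0, 8), ("name", 8, 28), ("dept", 28, 32)]

    def parse_frl(line: str) -> dict:
        """Slice one fixed-width record into named, trimmed fields."""
        return {name: line[start:end].strip() for name, start, end in LAYOUT}

    record = "00000042Alice Example       DATA"
    print(json.dumps(parse_frl(record)))
    # {"record_id": "00000042", "name": "Alice Example", "dept": "DATA"}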


Document technical specifications and provide guidance on data standards and modeling best practices.


Required Qualifications:

3+ years of experience (within the last 8 years) in developing common metamodels, preferably using NIEM standards.


3+ years of experience (within the last 8 years) in:
  • Defining integration specifications
  • Developing data models
  • Governing data standards

2+ years of recent experience with:
  • Tool Command Language (TCL)
  • Python
  • Java

2+ years of experience with:
  • Resource Description Framework (RDF)
  • SPARQL Query Language

2+ years of experience working with:
  • Fixed Record Layout (FRL)
  • Variable Record Layout (VRL)
  • Hierarchical Record Layout (HRL)
  • XML
  • JSON



ETL Developer

Delhi, Delhi IntraEdge

Posted 17 days ago


Job Description

Job Title: ETL Developer – DataStage, AWS, Snowflake

Experience: 5–7 Years

Location: Remote

Job Type: Full-time

About the Role

We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will work on building scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue, and Snowflake. You will collaborate with architects, business analysts, and data modelers to ensure timely and accurate delivery of critical data assets supporting analytics and AI/ML use cases.


Key Responsibilities

  • Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources such as flat files, APIs, Oracle, and DB2.
  • Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic (a sketch of this Glue-to-Snowflake pattern follows this list).
  • Participate in code reviews, performance tuning, and defect triage sessions.
  • Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
  • Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
  • Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
  • Adhere to agile delivery practices, sprint planning, and documentation requirements.
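
A minimal sketch of the first two bullets, under stated assumptions: all bucket, table, and connection values are hypothetical placeholders, and the Snowflake Spark connector is assumed to be attached to the Glue job. This illustrates the general pattern, not the employer's actual job code:

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Ingest a CSV landing file from S3 as a DynamicFrame, then clean it.
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-landing-bucket/orders/"]},
        format="csv",
        format_options={"withHeader": True},
    )
    df = dyf.toDF().dropDuplicates(["order_id"])  # basic hygiene before loading

    # Append the curated rows into Snowflake through the Spark connector.
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "CURATED",
        "sfWarehouse": "LOAD_WH",
        "sfUser": "ETL_USER",
        "sfPassword": "***",  # in practice, resolve from AWS Secrets Manager
    }
    (df.write.format("net.snowflake.spark.snowflake")
       .options(**sf_options)
       .option("dbtable", "ORDERS_CURATED")
       .mode("append")
       .save())

    job.commit()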

Required Skills and Experience

  • 4+ years of experience in ETL development, with at least 1–2 years in IBM DataStage (preferably the CP4D version).
  • Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
  • Experience working with Snowflake: loading strategies, streams and tasks, zero-copy cloning, and performance tuning (illustrated in the sketch after this list).
  • Proficiency in SQL, Unix scripting, and basic Python for data handling and automation.
  • Familiarity with S3, version control systems (Git), and job orchestration tools.
  • Experience with data profiling, cleansing, and quality validation routines.
  • Understanding of data lake/data warehouse architectures and DevOps practices.
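
For the Snowflake bullet above, the sketch below exercises the three named features through the snowflake-connector-python driver; every object name and credential is a hypothetical placeholder:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",
        user="ETL_USER",
        password="***",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="CURATED",
    )
    cur = conn.cursor()

    # Zero-copy clone: an instant, storage-free copy for testing a load safely.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_DEV CLONE ORDERS_CURATED")

    # Stream: captures row-level changes on the source table between reads.
    cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE ORDERS_CURATED")

    # Task: a scheduled job that drains the stream into a downstream table.
    cur.execute("""
        CREATE OR REPLACE TASK MERGE_ORDERS_TASK
          WAREHOUSE = LOAD_WH
          SCHEDULE = '15 MINUTE'
        AS
          INSERT INTO ORDERS_HISTORY
          SELECT * FROM ORDERS_STREAM
    """)
    cur.execute("ALTER TASK MERGE_ORDERS_TASK RESUME")  # tasks start suspended
    conn.close()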

Good to Have

  • Experience with Collibra, BigID, or other metadata/governance tools
  • Exposure to Data Mesh/Data Domain models
  • Experience with agile/Scrum delivery and Jira/Confluence tools
  • AWS or Snowflake certification is a plus



ETL Developer

New Delhi, Delhi Pinnacle Group, Inc.

Posted 22 days ago


Job Description

About PTR Global

PTR Global is a leader in providing innovative workforce solutions, dedicated to optimizing talent acquisition and management processes. Our commitment to excellence has earned us the trust of businesses looking to enhance their talent strategies. We cultivate a dynamic and collaborative environment that empowers our employees to excel and contribute to our clients' success.


Job Summary

We are seeking a highly skilled ETL Developer to join our team in Chennai. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes, as well as data warehouse design and modeling, to support our data integration and business intelligence initiatives. This role requires proficiency in T-SQL, Azure Data Factory (ADF), and SSIS, along with excellent problem-solving and communication skills.


Responsibilities

  • Design, develop, and maintain ETL processes to support data integration and business intelligence initiatives.
  • Utilize T-SQL to write complex queries and stored procedures for data extraction and transformation (a sketch follows this list).
  • Implement and manage ETL processes using SSIS (SQL Server Integration Services).
  • Design and model data warehouses to support reporting and analytics needs.
  • Ensure data accuracy, quality, and integrity through effective testing and validation procedures.
  • Collaborate with business analysts and stakeholders to understand data requirements and deliver solutions that meet their needs.
  • Monitor and troubleshoot ETL processes to ensure optimal performance and resolve any issues promptly.
  • Document ETL processes, workflows, and data mappings to ensure clarity and maintainability.
  • Stay current with industry trends and best practices in ETL development, data integration, and data warehousing.
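
As a hedged sketch of the T-SQL bullet above (hypothetical server, database, and table names, not this employer's schema), a typical extract-and-transform upsert looks like this when driven from Python with pyodbc:

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=example-sql.internal;DATABASE=EDW;"
        "Trusted_Connection=yes;"
    )

    # Upsert staged customer rows into the target dimension with T-SQL MERGE.
    MERGE_SQL = """
    MERGE dbo.DimCustomer AS target
    USING staging.Customer AS source
        ON target.CustomerKey = source.CustomerKey
    WHEN MATCHED THEN
        UPDATE SET target.Name = source.Name,
                   target.City = source.City
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerKey, Name, City)
        VALUES (source.CustomerKey, source.Name, source.City);
    """

    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    conn.commit()
    cur.close()
    conn.close()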


Must Haves

  • 4+ years of experience as an ETL Developer or in a similar role.
  • Proficiency in T-SQL for writing complex queries and stored procedures.
  • Experience with SSIS (SQL Server Integration Services) for developing and managing ETL processes.
  • Knowledge of ADF (Azure Data Factory) and its application in ETL processes.
  • Experience in data warehouse design and modeling.
  • Knowledge of Microsoft's Azure cloud suite, including Data Factory, Data Storage, Blob Storage, Power BI, and Power Automate.
  • Strong problem-solving and analytical skills.
  • Excellent communication and interpersonal skills.
  • Strong attention to detail and commitment to data quality.
  • Bachelor's degree in Computer Science, Information Technology, or a related field is preferred.



Associate Architect - Data Engineering

New Delhi, Delhi Response Informatics

Posted 11 days ago


Job Description

About the Role:

We are seeking an experienced Data Architect to lead the transformation of enterprise data solutions, with a strong focus on migrating Alteryx workflows into Azure Databricks. The ideal candidate will have deep expertise in the Microsoft Azure ecosystem, including Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, and a strong background in data architecture, governance, and distributed computing. This role requires both strategic thinking and hands-on architectural leadership to ensure scalable, secure, and high-performance data solutions.


Key Responsibilities:

  • Define the overall migration strategy for transforming Alteryx workflows into scalable, cloud-native data solutions on Azure Databricks.
  • Architect end-to-end data frameworks leveraging Databricks, Delta Lake, Azure Data Lake, and Synapse.
  • Establish best practices, standards, and governance frameworks for pipeline design, orchestration, and data lifecycle management.
  • Guide engineering teams in re-engineering Alteryx workflows into distributed Spark-based architectures (a sketch of the target pattern follows this list).
  • Collaborate with business stakeholders to ensure solutions align with analytics, reporting, and advanced AI/ML initiatives.
  • Oversee data quality, lineage, and security compliance across the data ecosystem.
  • Drive CI/CD adoption, automation, and DevOps practices for Azure Databricks and related services.
  • Provide architectural leadership, design reviews, and mentorship to engineering and analytics teams.
  • Optimize solutions for performance, scalability, and cost-efficiency within Azure.
  • Participate in enterprise architecture forums and influence data strategy across the organization.
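
The re-engineering bullet above follows a common pattern. The sketch below, with hypothetical paths, table, and column names, shows one way an Alteryx input-transform-output workflow maps onto PySpark and Delta Lake on Databricks:

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

    # 1. Ingest: what an Alteryx Input Data tool did, now a Spark read.
    raw = (spark.read.option("header", True)
           .csv("abfss://landing@exampleadls.dfs.core.windows.net/sales/"))

    # 2. Transform: Alteryx Formula/Filter tools become DataFrame expressions.
    curated = (raw.filter(F.col("amount").cast("double") > 0)
                  .withColumn("load_date", F.current_date()))

    # 3. Load: an idempotent upsert into a Delta table instead of an overwrite.
    target = DeltaTable.forName(spark, "curated.sales")
    (target.alias("t")
       .merge(curated.alias("s"), "t.sale_id = s.sale_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())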


Required Skills and Qualifications:

  • 10+ years of experience in data architecture, engineering, or solution design.
  • Proven expertise in Alteryx workflows and their modernization into Azure Databricks (Spark, PySpark, SQL, Delta Lake).
  • Deep knowledge of the Microsoft Azure data ecosystem:
      - Azure Data Factory (ADF)
      - Azure Synapse Analytics
      - Microsoft Fabric
      - Azure Databricks
  • Strong background in data governance, lineage, security, and compliance frameworks.
  • Demonstrated experience in architecting data lakes, data warehouses, and analytics platforms.
  • Proficiency in Python, SQL, and Apache Spark for prototyping and design validation.
  • Excellent leadership, communication, and stakeholder management skills.


Preferred Qualifications:

  • Microsoft Azure certifications (e.g., Azure Solutions Architect Expert, Azure Data Engineer Associate).
  • Experience leading large-scale migration programs or modernization initiatives.
  • Familiarity with enterprise architecture frameworks (TOGAF, Zachman).
  • Exposure to machine learning enablement on Azure Databricks.
  • Strong understanding of Agile delivery and working in multi-disciplinary teams.


 
