62 Data Pipelines jobs in Delhi
APM Data Integration Analyst
Posted today
Job Description
Location - Remote
Timings - 3pm – 11pm IST
APM Data Integration Analyst
This role ensures data integrity, integration, and governance across the APM ecosystem. The Data Integration Analyst will manage the completeness and accuracy of application metadata, working across Orbus and ServiceNow and integrating with Informatica and other systems to maintain a single source of truth. They will also help design and enforce data quality rules and cross-references, enabling reliable reporting and decision-making.
Expectations:
- Ensure consistency and accuracy of application metadata in Orbus, keeping it consistent with ServiceNow.
- Design and implement data governance rules (validation, de-duplication, attribute mapping, normalization).
- Partner with integration teams to establish data pipelines and synchronization (Informatica, EDW, ServiceNow, Orbus, etc.).
- Map application attributes across multiple systems, ensuring cross-referencing accuracy.
- Maintain data lineage and traceability for reporting and audits.
- Gather application usage data and metrics.
- Provide proactive data quality monitoring and maintain dashboards that highlight gaps or issues.
- Document data models, attribute definitions, and mapping rules for transparency and onboarding.
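The governance rules above (validation, de-duplication, normalization) can be sketched as follows. This is an illustrative toy, not the actual Orbus/ServiceNow implementation; the field names and lifecycle values are invented for the example.

```python
# Hypothetical sketch of simple data governance rules -- normalization,
# validation, and de-duplication -- applied to application metadata
# records before they are synchronized between systems.

def normalize(record: dict) -> dict:
    """Normalize free-text attributes so cross-system matching is reliable."""
    return {
        "app_name": record.get("app_name", "").strip().lower(),
        "owner": record.get("owner", "").strip(),
        "lifecycle": record.get("lifecycle", "").strip().lower() or "unknown",
    }

def validate(record: dict) -> list:
    """Return a list of rule violations for one normalized record."""
    issues = []
    if not record["app_name"]:
        issues.append("missing app_name")
    if record["lifecycle"] not in {"plan", "active", "retiring", "retired", "unknown"}:
        issues.append("invalid lifecycle: " + record["lifecycle"])
    return issues

def deduplicate(records: list) -> list:
    """Keep the first record seen for each normalized app_name."""
    seen, unique = set(), []
    for r in records:
        if r["app_name"] not in seen:
            seen.add(r["app_name"])
            unique.append(r)
    return unique

raw = [
    {"app_name": " Payroll ", "owner": "HR", "lifecycle": "Active"},
    {"app_name": "payroll", "owner": "HR", "lifecycle": "active"},   # duplicate
    {"app_name": "", "owner": "IT", "lifecycle": "live"},            # two violations
]
clean = deduplicate([normalize(r) for r in raw])
print(len(clean))          # 2
print(validate(clean[1]))  # ['missing app_name', 'invalid lifecycle: live']
```

In practice rules like these would live in the integration layer (e.g. Informatica mappings) rather than ad-hoc scripts, but the shape is the same: normalize first, then validate and de-duplicate on the normalized values.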
Core Competencies:
- Highly detail-oriented with a passion for data quality.
- Proactive in identifying and resolving gaps and discrepancies.
- Able to balance technical accuracy with business usability.
- Comfortable in a complex, fragmented tool landscape.
- Problem Solving: Uses rigorous logic and methods to solve difficult problems with effective solutions; probes all fruitful sources for answers; can see hidden problems; is excellent at honest analysis; looks beyond the obvious and doesn't stop at the first answer.
Qualifications
- Strong hands-on Orbus experience (data modeling, meta model configuration, attribute management).
- Hands-on ServiceNow SAM/APM experience (application and contract data, integrations).
- Proven track record of Application Portfolio Management activities.
- Experience with data mapping across systems.
- Experience with Informatica, EDW integration, and Power BI reporting preferred.
- Knowledge of data governance frameworks (DAMA, ISO 8000 preferred).
- Strong documentation and requirements gathering skills.
Education:
Bachelor’s degree preferred, or equivalent experience.
Data Integration & LLM Engineer
Posted 13 days ago
Job Description
About the Role
We are seeking a highly motivated Software Engineer with a strong foundation in Java (Spring Boot), data integration, and growing expertise in Large Language Models (LLMs). This role is ideal for engineers who enjoy working at the intersection of scalable data systems and AI-driven applications, building robust pipelines while also exploring cutting-edge generative AI solutions.
Key Responsibilities
- Design and implement data integrations including APIs, SaaS connectors, and ETL/ELT pipelines to ensure reliable and scalable data flows.
- Build and maintain backend services and applications using Java (Spring Boot or equivalent frameworks).
- Develop Python-based workflows for AI/ML pipelines, experimentation, and automation scripting.
- Integrate and experiment with LLMs (OpenAI, Anthropic, LLaMA, Mistral, etc.) for use cases such as retrieval-augmented generation (RAG), summarization, and intelligent data insights.
- Implement vector search solutions using Pinecone, Weaviate, Milvus, or FAISS for LLM-backed applications.
- Collaborate with product, data, and ML teams to design end-to-end solutions that combine data engineering with AI capabilities.
- Ensure systems meet high standards of performance, scalability, security, and compliance.
Required Qualifications
- Strong programming experience in Java (Spring Boot or equivalent frameworks).
- Familiarity with Python, particularly for AI/ML workflows and scripting.
- Proven experience with data integrations: APIs, SaaS connectors, ETL/ELT pipelines.
- Exposure to LLMs (OpenAI, Anthropic, LLaMA, Mistral, etc.) and associated frameworks (LangChain, LlamaIndex, Hugging Face Transformers).
- Experience working with databases (SQL/NoSQL) and vector search technologies (Pinecone, Weaviate, Milvus, FAISS).
Preferred Skills
- Knowledge of cloud platforms (AWS, GCP, or Azure) for deploying scalable systems and ML workloads.
- Familiarity with containerization and orchestration (Docker, Kubernetes).
- Understanding of data governance, observability, and security best practices.
- Interest in generative AI advancements and a passion for building practical applications on top of them.
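The retrieval step behind the RAG use case mentioned above can be illustrated in a few lines. This is a toy sketch: in production a vector store such as FAISS or Pinecone performs this ranking at scale over real model embeddings, whereas the vectors below are tiny hand-made stand-ins.

```python
# Toy sketch of the retrieval step in a RAG pipeline: rank document
# embeddings by cosine similarity to a query embedding. The 3-dimensional
# "embeddings" here are invented for illustration only.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

docs = {
    "invoice policy": [0.9, 0.1, 0.0],
    "vacation policy": [0.1, 0.9, 0.1],
    "security policy": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "how do I submit an invoice?"

# Rank documents by similarity; the top hit would be passed to the LLM as context.
ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
print(ranked[0])  # invoice policy
```

The design point is that retrieval is just nearest-neighbour search in embedding space; dedicated vector databases add indexing (e.g. HNSW, IVF) so the search stays fast at millions of documents.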
Data Integration & Modeling Specialist
Posted today
Job Description
Job Title: Data Integration & Modeling Specialist
Job Type: Contract
Location: Remote
Duration: 6 Months
Job Summary:
We are seeking a highly skilled Data Integration & Modeling Specialist with hands-on experience in developing common metamodels, defining integration specifications, and working with semantic web technologies and various data formats. The ideal candidate will bring deep technical expertise and a collaborative mindset to support enterprise-level data integration and standardization initiatives.
Key Responsibilities:
Develop common metamodels by integrating requirements across diverse systems and organizations.
Define integration specifications, establish data standards, and develop logical and physical data models.
Collaborate with stakeholders to align data architectures with organizational needs and industry best practices.
Implement and govern semantic data solutions using RDF and SPARQL.
Perform data transformations and scripting using TCL, Python, and Java.
Work with multiple data formats including FRL, VRL, HRL, XML, and JSON to support integration and processing pipelines.
Document technical specifications and provide guidance on data standards and modeling best practices.
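The record-layout work above can be sketched with a minimal fixed-record-layout (FRL) parser. The field names and widths below are invented for illustration; real layouts come from the integration specification.

```python
# Hedged sketch: converting one Fixed Record Layout (FRL) line into JSON.
# LAYOUT is a hypothetical spec: each entry is (field name, column width).
import json

LAYOUT = [("id", 4), ("name", 10), ("status", 6)]

def parse_frl(line: str) -> dict:
    """Slice a fixed-width line into named, whitespace-trimmed fields."""
    record, pos = {}, 0
    for field, width in LAYOUT:
        record[field] = line[pos:pos + width].strip()
        pos += width
    return record

line = "0042Payroll   ACTIVE"
print(json.dumps(parse_frl(line)))
# {"id": "0042", "name": "Payroll", "status": "ACTIVE"}
```

Variable (VRL) and hierarchical (HRL) layouts follow the same idea but carry the structure in the data itself (delimiters or record-type codes) rather than in a fixed width table.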
Required Qualifications:
3+ years of experience (within the last 8 years) in developing common metamodels, preferably using NIEM standards.
3+ years of experience (within the last 8 years) in:
Defining integration specifications
Developing data models
Governing data standards
2+ years of recent experience with:
Tool Command Language (TCL)
Python
Java
2+ years of experience with:
Resource Description Framework (RDF)
SPARQL Query Language
2+ years of experience working with:
Fixed Record Layout (FRL)
Variable Record Layout (VRL)
Hierarchical Record Layout (HRL)
XML
JSON
ETL Developer
Posted today
Job Description
Job Title: ETL Developer
Experience Required: 6 to 8 years
Qualification: BCA / B.Tech / MCA / M.Tech
Key Responsibilities (KRA)
- Design, develop, and implement ETL solutions to exchange and process data using SSIS.
- Build, deploy, and maintain SSIS packages to support business and client requirements.
- Develop and maintain SSRS reports for business intelligence and analytics needs.
- Ensure data quality, integrity, and accuracy across ETL processes and reporting systems.
- Optimize SQL queries, stored procedures, and database structures for performance.
- Support operations by troubleshooting and enhancing existing ETL workflows and data pipelines.
- Collaborate with business stakeholders to gather requirements and translate them into technical solutions.
- Perform multiple levels of testing including unit, system, integration, and performance.
- Estimate effort, plan releases, and ensure timely delivery of ETL solutions.
- Maintain compliance with coding standards, best practices, and data governance policies.
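The SSIS work above is platform-specific, but the extract-transform-load shape it describes can be sketched with the standard library, with SQLite standing in for SQL Server. Table and column names are invented for illustration.

```python
# Minimal ETL sketch: load raw rows into a staging table, apply a data
# quality gate in SQL, and load the clean rows into the target table.
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract: raw feed lands in staging with amounts as untyped text.
conn.execute("CREATE TABLE staging (order_id INTEGER, amount TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [(1, "10.50"), (2, "bad"), (3, "7.25")])

# Transform + load: keep only rows whose amount looks numeric, cast to REAL.
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.execute("""
    INSERT INTO orders
    SELECT order_id, CAST(amount AS REAL)
    FROM staging
    WHERE amount GLOB '[0-9]*.[0-9]*'
""")

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 17.75 -- the malformed row was filtered out
```

In SSIS the same flow would be a Data Flow Task with a conditional split for the quality gate, but the staging-validate-load pattern is identical.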
Core Skills & Requirements
- Strong expertise in Microsoft SSIS for ETL development and deployment.
- Hands-on experience with SSRS report development and implementation.
- Solid understanding of Data Warehousing (DWH) concepts and methodologies.
- Proficiency in SQL/T-SQL/PL-SQL with experience writing complex queries, functions, and stored procedures.
- Experience with DBMS, particularly SQL Server, including performance tuning and troubleshooting.
- Good understanding of modern analytics tools and data integration processes.
- Strong verbal and written communication skills to interact with clients and business teams.
- Proven ability to work independently as well as in collaborative team environments.