132 Data Pipeline jobs in India
Data Pipeline IRC173004
Posted today
Job Description
- IRC173004
- Location: India - Nagpur
- Designation: Associate Consultant
- Experience: 5-10 years
- Function: Engineering
**Description**:
We are seeking an experienced data pipeline engineer for our data engineering organization. As a member of our team, you will develop highly performant, scalable, and available data pipelines for bp internal- and external-facing business domains. You must be able to understand business data requirements, be familiar with cloud-based data pipelining tools, and thrive in a fast-paced environment.
**Requirements**:
- 3+ years of hands-on experience in software development, data engineering, and systems architecture
- 2+ years of experience in designing, developing, and maintaining high-performance, low-latency, large-scale data pipelines
- 2+ years of experience in SQL optimization and performance tuning, and development experience in programming languages such as Python, PySpark, and Scala
- 2+ years of cloud data engineering experience in Azure and/or AWS
- Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
- Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines
- Experience with modern data technologies such as SQL and NoSQL databases, REST/GraphQL APIs, data event streaming technologies, and in-memory data storage.
- Experience with version control systems like GitHub and with deployment and CI/CD tools
- Excellent communication skills, both verbal and written
**Preferences**:
Data pipelines, ETL, SQL
**Responsibilities**:
- As a member of our team, you will design, develop, test and maintain highly performant, scalable, robust and available data pipelines for our business teams.
- Work closely with the data science and engineering team and the data platform team to implement automated data pipeline workflows in the Azure/AWS cloud platforms to enable data ingestion, processing, and distribution (a minimal sketch follows this list).
- Deploy scalable solutions typical of critical bp business environments.
- Build the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
- Build prototypes and workflows to try out new ideas in an iterative manner
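The responsibilities above revolve around automated ingestion and processing workflows on Azure/AWS, with Python and PySpark named among the expected languages. As a minimal, purely illustrative sketch (the actual bp pipelines, schemas, and storage locations are not described in this posting), the following shows the general shape of one batch step: read raw records, apply light cleansing, and write a partitioned output. All paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative sketch only: paths, columns, and rules are hypothetical and not
# taken from the job description. On Azure/AWS these paths would typically be
# abfss:// or s3:// locations.
spark = SparkSession.builder.appName("example-ingest").getOrCreate()

raw = spark.read.option("header", "true").csv("/data/raw/orders/")

cleaned = (
    raw
    .filter(F.col("order_id").isNotNull())                # drop rows missing the key
    .withColumn("order_ts", F.to_timestamp("order_ts"))   # cast string to timestamp
    .withColumn("order_date", F.to_date("order_ts"))      # derive a partition column
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/curated/orders/")
)
```

A production pipeline would add schema enforcement, data quality checks, and the monitoring metrics called out in the responsibilities; those details depend on the team's platform.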
**What We Offer**:
**Exciting Projects**: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
**Collaborative Environment**: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
**Work-Life Balance**: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
**Professional Development**: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), stress management programs, professional certifications, and technical and soft skills training.
**Excellent Benefits**: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
**Fun Perks**: We want you to love where you work, which is why we host sports events, cultural activities, and corporate parties, and offer food at subsidized rates. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants!
**About GlobalLogic**: GlobalLogic is a leader in digital engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise—we help our clients imagine what’s possible, and accelerate their transition into tomorrow’s digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries. GlobalLogic is a Hitachi Group Company operating under Hitachi, Ltd. (TSE: 6501) which contributes to a sustainable society with a higher quality of life by driving innovation through data and technology as the Social Innovation Business.
Cyber Data Pipeline Engineer
Posted 4 days ago
Job Description
Title: Cyber Data Pipeline Engineer
Location: Bengaluru
Experience: 7 to 14 years
Role Profile
A successful applicant will contribute to several important initiatives, including:
- Collaborate with Cyber teams to identify, onboard, and integrate new data sources into the platform.
- Design and implement data mapping, transformation, and routing processes to meet analytics and monitoring requirements.
- Develop automation tools that integrate with in-house developed configuration management frameworks and APIs
- Monitor the health and performance of the data pipeline infrastructure.
- Work as a top-level escalation point, performing complex troubleshooting and working with other infrastructure teams to resolve issues
- Create and maintain detailed documentation for pipeline architecture, processes, and integrations.
Required Skills
- Hands-on experience deploying and managing large-scale dataflow products such as Cribl, Logstash, or Apache NiFi
- Hands-on experience integrating data pipelines with cloud platforms (e.g., AWS, Azure, Google Cloud) and on-premises systems.
- Hands-on experience in developing and validating field extraction using regular expressions.
- A solid understanding of Operating Systems and Networking concepts: Linux/Unix system administration, HTTP and encryption.
- Good understanding of software version control, deployment & build tools using DevOps SDLC practices (Git, Jenkins, Jira)
- Strong analytical and troubleshooting skills
- Excellent verbal & written communication skills
- Appreciation of Agile methodologies, specifically Kanban
Desired Skills
- Enterprise experience with a distributed event streaming platform like Apache Kafka, AWS Kinesis, Google Pub/Sub, MQ
- Infrastructure automation and integration experience, ideally using Python and Ansible
- Familiarity with cybersecurity concepts, event types, and monitoring requirements.
- Experience parsing and normalizing data in Elasticsearch using the Elastic Common Schema (ECS); a minimal extraction-and-normalization sketch follows this list
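Two of the skills above, field extraction with regular expressions and normalization to the Elastic Common Schema, can be pictured with a short Python sketch. This is not the team's actual parser; the sample log line, the pattern, and the ECS field mapping are assumptions made purely for illustration.

```python
import re

# Hypothetical syslog-style authentication event; the real data sources and
# formats handled by the team are not described in this posting.
SAMPLE = "2024-05-01T10:15:32Z host01 sshd[2211]: Failed password for admin from 203.0.113.7 port 51514"

PATTERN = re.compile(
    r"(?P<timestamp>\S+)\s+"
    r"(?P<host>\S+)\s+"
    r"(?P<process>\w+)\[(?P<pid>\d+)\]:\s+"
    r"Failed password for (?P<user>\S+) from (?P<ip>\S+) port (?P<port>\d+)"
)

def extract_ecs_fields(line: str) -> dict:
    """Extract named groups from a log line and map them to ECS-style field names."""
    match = PATTERN.search(line)
    if not match:
        return {}
    g = match.groupdict()
    return {
        "@timestamp": g["timestamp"],
        "host.name": g["host"],
        "process.name": g["process"],
        "process.pid": int(g["pid"]),
        "user.name": g["user"],
        "source.ip": g["ip"],
        "source.port": int(g["port"]),
        "event.outcome": "failure",
    }

if __name__ == "__main__":
    print(extract_ecs_fields(SAMPLE))
```

In practice a dataflow product such as Cribl, Logstash, or NiFi performs this extraction inside the pipeline itself; the sketch only shows the underlying idea of named capture groups mapped onto ECS field names.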
Senior - Dev-Data-Pipeline
Posted today
Job Description
KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Context
KPMG entities in India are professional services firms affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects.
Job Description
Role Objective:
The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role requires a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, along with good exposure to the data catalog and data quality modules of a leading product (preferably Talend).
Location: Mumbai
Years of Experience: 3-5 years
Roles & Responsibilities:
Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
Arch/Design Documentation: Develop comprehensive architecture and design documentation for data landscape.
Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of Talend-based ETL solutions.
Technical Skills:
Core Tool exposure - Talend Data Integrator, Talend Data Catalog, Talend Data Quality, Relational Database (PostgreSQL, SQL Server, etc.)
Core Concepts - ETL, Data load strategy, Data Modelling, Data Governance and management, Query optimization and performance enhancement (a minimal load-strategy sketch follows this list)
Cloud exposure - Experience working with one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
SQL Skills - Extensive knowledge of and hands-on experience with SQL, query tuning, optimization, and best practices
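Among the core concepts listed above are ETL and data load strategy. As a tool-agnostic illustration (the actual KPMG Talend jobs, tables, and rules are not described in this posting), the sketch below shows a watermark-based incremental load, one common load strategy, using Python's built-in sqlite3 module; every table and column name is made up.

```python
import sqlite3

# Illustrative only: a watermark-based incremental load between two tables.
# Table and column names are hypothetical, not from the job description.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    CREATE TABLE source_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE target_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE load_watermark (table_name TEXT PRIMARY KEY, last_loaded TEXT);

    INSERT INTO source_orders VALUES
        (1, 10.0, '2024-01-01'),
        (2, 25.5, '2024-02-15'),
        (3, 40.0, '2024-03-10');
    INSERT INTO load_watermark VALUES ('target_orders', '2024-02-01');
""")

# Read the watermark, pull only newer rows, upsert them, then advance the watermark.
(last_loaded,) = cur.execute(
    "SELECT last_loaded FROM load_watermark WHERE table_name = 'target_orders'"
).fetchone()

new_rows = cur.execute(
    "SELECT id, amount, updated_at FROM source_orders WHERE updated_at > ?",
    (last_loaded,),
).fetchall()

cur.executemany(
    "INSERT OR REPLACE INTO target_orders (id, amount, updated_at) VALUES (?, ?, ?)",
    new_rows,
)
if new_rows:
    cur.execute(
        "UPDATE load_watermark SET last_loaded = ? WHERE table_name = 'target_orders'",
        (max(row[2] for row in new_rows),),
    )
conn.commit()
print(f"Loaded {len(new_rows)} new or changed rows")
```

A Talend job would typically express the same pattern graphically, with the watermark held in a context variable or a control table.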
Soft Skills-
Very good communication and presentation skills
Must be able to articulate ideas clearly and convince key stakeholders
Should be able to guide and upskill team members
Good to Have:
Programming Language: Knowledge and hands-on experience with languages like Python and R.
Relevant certifications related to the role
Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Bachelors
Data Pipeline Support Engineer
Posted today
Job Description
Data Pipeline Support Engineer
Location: India based, remote work.
Time: 10:30 AM – 6:30 PM IST (aligned to US Central Time)
Rate: $20 - $25/h
About Data Meaning
Data Meaning is a front-runner in Business Intelligence and Data Analytics consulting, renowned for our high-quality consulting services throughout the US and LATAM.
Our expertise lies in delivering tailored solutions in Business Intelligence, Data Warehousing, and Project Management.
We are dedicated to bringing premier services to our diverse clientele.
We have a global team of 95+ consultants, all working remotely, embodying a collaborative, inclusive, and innovative-driven work culture.
Position Summary: We are seeking a proactive Data Pipeline Support Engineer based in India to work the 12:00–8:00 AM Central Time (US) shift. The ideal candidate must have strong hands-on experience with Azure Data Factory, Alteryx (including reruns and macros), and dbt.
This role requires someone detail-oriented, capable of independently managing early-morning support for critical workflows, and comfortable collaborating across time zones in a fast-paced data operations environment.
Responsibilities:
Monitor and support data jobs in Alteryx, dbt, Snowflake, and ADF during the 12:00–8:00 AM CT shift.
Perform first-pass remediation (e.g., reruns, credential resets, basic troubleshooting).
Escalate unresolved or complex issues to nearshore Tier 2/3 support teams.
Log all incidents and resolutions in ticketing and audit systems (ServiceNow).
Collaborate with CT-based teams for smooth cross-timezone handoffs.
Contribute to automation improvements (e.g., Alteryx macros, retry logic); a minimal retry sketch follows this list.
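The remediation and retry-logic duties above can be pictured with a small Python sketch. It is deliberately generic: `trigger_rerun` and `get_run_status` are hypothetical stand-ins for whatever Alteryx, ADF, or dbt interface the team actually uses, and the status strings are assumptions.

```python
import time

# Illustrative first-pass remediation helper: re-trigger a failed job a few times
# and poll until it finishes. The callables and status values are hypothetical,
# not part of any specific Alteryx/ADF/dbt API named in this posting.

def rerun_with_retries(trigger_rerun, get_run_status, max_attempts=3, poll_seconds=30):
    """Re-trigger a failed job up to max_attempts times, polling each run to completion."""
    for attempt in range(1, max_attempts + 1):
        run_id = trigger_rerun()
        status = get_run_status(run_id)
        while status in ("Queued", "InProgress"):
            time.sleep(poll_seconds)
            status = get_run_status(run_id)
        if status == "Succeeded":
            return run_id
        print(f"Attempt {attempt} ended with status {status!r}; retrying")
    raise RuntimeError(f"Job still failing after {max_attempts} rerun attempts")
```

Failures that survive the helper would then be escalated to the Tier 2/3 teams and logged in ServiceNow, in line with the responsibilities above.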
Required Skills & Qualifications:
Strong, hands-on experience in Alteryx (monitoring, reruns, macros).
Working knowledge of dbt (Data Build Tool) and Snowflake (basic SQL, Snowpark, data validation).
Experience with Azure Data Factory (ADF) pipeline executions.
Familiarity with SAP BW, workflow chaining, and cron-based job scheduling.
Familiarity with data formats, languages, protocols, and architecture styles required to provide Azure-based integration solutions (for example, .NET, JSON, REST, and SOAP), including Azure Functions.
Excellent communication skills in English (written and verbal).
Ability to work independently and handle incident resolution with limited documentation.
Required Certifications:
Alteryx Designer Core Certification
SnowPro Core Certification
dbt Fundamentals Course Certificate
Microsoft Certified: Azure Data Engineer Associate
ITIL v4 Foundation (or equivalent)
Preferred Certifications:
Alteryx Designer Advanced Certification
Alteryx Server Administration
SnowPro Advanced: Data Engineer
dbt Analytics Engineering Certification
Microsoft Certified: Azure Fundamentals (AZ-900)
Executive - Dev-Data-Pipeline
Posted today
Job Description
KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Context
KPMG entities in India are professional services firms affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects.
Job Description
Role Objective:
The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role requires a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, along with good exposure to the data catalog and data quality modules of a leading product (preferably Talend).
Location: Mumbai
Years of Experience: 3-5 years
Roles & Responsibilities:
Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
Arch/Design Documentation: Develop comprehensive architecture and design documentation for data landscape.
Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of Talend-based ETL solutions.
Technical Skills:
Core Tool exposure - Talend Data Integrator, Talend Data Catalog, Talend Data Quality, Relational Database (PostgreSQL, SQL Server, etc.)
Core Concepts - ETL, Data load strategy, Data Modelling, Data Governance and management, Query optimization and performance enhancement
Cloud exposure - Experience working with one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
SQL Skills - Extensive knowledge of and hands-on experience with SQL, query tuning, optimization, and best practices
Soft Skills-
Very good communication and presentation skills
Must be able to articulate ideas clearly and convince key stakeholders
Should be able to guide and upskill team members
Good to Have:
Programming Language: Knowledge and hands-on experience with languages like Python and R.
Relevant certifications related to the role
Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Bachelors