1,729 Technical Integration jobs in India
BSI Technical Expert (Data Integration)
Posted 2 days ago
Job Description
**We are looking for a Technical Expert to be part of our Business Solution Integration team in the Analytics, Data and Integration stream.**
**Position Snapshot**
+ Location: Bengaluru
+ Type of Contract: Permanent
+ Stream: Analytics, Data and Integration
+ Type of work: Hybrid
+ Work Language: Fluent Business English
**The role**
The Integration Technical Expert will work in the Business Solution Integration team, focused on Product Engineering and Operations for the Data Integration, Digital Integration, and Process Integration products within the Business Solution Integration portfolio and on the initiatives where these products are used.
You will work together with the Product Manager and Product Owners, as well as various other counterparts, on the evolution of the DI, PI, and Digital products. You will work with architects to orchestrate the design of integration solutions, act as the first point of contact for project teams to manage demand, and help drive the transition from engineering to sustain as per BSI standards.
You will work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms.
**What you'll do**
+ Work with architects to understand and orchestrate the design choices between the different Data, Process and Digital Integration patterns for fulfilling the data needs.
+ Translate the various requirements into deliverables for the development and implementation of Process, Data and Digital Integration solutions, following up on requests to get the work done.
+ Design, develop, and implement integration solutions using **ADF, LTRS, Data Integration**, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent (an illustrative event-streaming sketch follows this list).
+ Work with the Operations Managers and Sustain teams to coordinate the resolution of performance and operational issues.
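Confluent in the list above refers to event streaming over Apache Kafka. Purely as an illustration of that integration pattern, and not as part of the role description, a minimal Python producer might look like the sketch below; the broker address, credentials, and topic name are hypothetical placeholders.

```python
# Illustrative sketch only: publish one integration event to a Kafka topic on Confluent.
# Broker address, credentials, and topic name are placeholders, not values from this posting.
import json

from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.region.provider.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",     # placeholder credential
    "sasl.password": "<api-secret>",  # placeholder credential
})

def delivery_report(err, msg):
    # Called once per message with the broker's acknowledgement (or an error).
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

event = {"orderId": "4711", "status": "CREATED"}  # example payload
producer.produce("orders-integration", value=json.dumps(event).encode("utf-8"),
                 callback=delivery_report)
producer.flush()  # block until outstanding messages are acknowledged
```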
**We offer you**
We offer more than just a job. We put people first and inspire you to become the best version of yourself.
+ **Great benefits** including competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc.
+ **Personal and professional growth** through ongoing training and constant career opportunities reflecting our conviction that people are our most important asset.
**Minimum qualifications:**
+ Minimum of 7 years industry experience in software delivery projects
+ Experience in project and product management, agile methodologies and solution delivery at scale.
+ Skilled and experienced Technical Integration Expert with experience across various integration platforms and tools, including **ADF, LTRS, Data Integration**, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent.
+ Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals.
+ Fluency in English with excellent oral and written communication skills.
+ Experience in working with cultural diversity: respect for various cultures and understanding how to work with a variety of cultures in the most effective way.
**Bonus Points If You Have:**
+ Experience with the Azure platform (especially with Data Factory)
+ Experience with Azure DevOps and with Service Now
+ Experience with Power Apps and Power BI
**About the IT Hub**
We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges, in the world's biggest health, nutrition and wellness company. We innovate every day through forward-looking technologies to create opportunities for addressing Nestlé's digital challenges with our consumers, our customers and in the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services that create tangible business value.
**About Nestlé**
We are Nestlé, the world's largest food and beverage company. We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at .
_We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability._
Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action - make it count.
**Join IT Hub Nestlé #beaforceforgood**
**How we will proceed:**
You send us your CV → We contact relevant applicants → Interviews → Feedback → Job Offer communication to the Finalist → First working day
Data Integration Engineer
Posted 4 days ago
Job Description
Key Responsibilities:
· Develop and maintain ETL workflows using Informatica.
· Design and implement data pipelines for ingestion, transformation, and loading.
· Work with SQL and Python to manipulate and analyse data.
· Integrate data across various systems and platforms, including GCP and BigQuery.
· Ensure data quality, consistency, and security across all integrations.
· Collaborate with data architects, analysts, and business stakeholders.
Required Skills:
· Strong experience with Informatica and ETL development.
· Proficiency in Python and SQL.
· Hands-on experience with Google Cloud Platform (GCP) and BigQuery (an illustrative sketch follows at the end of this listing).
· Solid understanding of data integration best practices and performance optimization.
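Purely to illustrate the Python, SQL, and BigQuery skills listed above, a minimal extract-and-load step might look like the sketch below; the project, dataset, and table names are hypothetical placeholders, and it assumes Application Default Credentials are configured.

```python
# Illustrative sketch only: run an aggregation in BigQuery and load the result
# into a target table. Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # uses Application Default Credentials

# Extract/transform: push the aggregation down to BigQuery SQL.
query = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`
    GROUP BY customer_id
"""
rows = client.query(query).result()  # waits for the query job to finish

# Load: stream the aggregated rows into a target table.
target_table = "example-project.analytics.customer_totals"
errors = client.insert_rows_json(
    target_table,
    [{"customer_id": row["customer_id"], "total_amount": float(row["total_amount"])} for row in rows],
)
if errors:
    raise RuntimeError(f"BigQuery insert errors: {errors}")
```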
SAP Data Integration
Posted 12 days ago
Job Description
This is a remote position.
Duration: 6 months
Location: Remote
Timings: Full Time (as per company timings)
Notice Period: Immediate Joiner Only
Experience: 6-9 Years
JD: We seek a Senior Data Integration Developer with deep expertise in SAP Data Intelligence to support a large-scale enterprise data program. You will be responsible for designing, building, and optimizing SAP DI pipelines for data ingestion, transformation, and integration across multiple systems.
Key Responsibilities:
· Design, develop, and deploy data integration pipelines in SAP Data Intelligence.
· Integrate SAP and non-SAP data sources, ensuring scalability and performance.
· Implement data quality checks, metadata management, and monitoring.
· Collaborate with MDM teams, functional consultants, and business analysts to meet integration requirements.
· Troubleshoot issues and optimize workflows for efficiency.
· Prepare technical documentation and handover materials.
Requirements:
· 6+ years of data integration experience, with at least 3 years in SAP Data Intelligence.
· Strong skills in SAP DI Graphs, Operators, and connectivity with SAP HANA, S/4HANA, and cloud platforms.
· Experience with data transformation, cleansing, and enrichment processes.
· Proficiency in Python, SQL, and integration protocols (REST, OData, JDBC); an illustrative sketch follows this listing.
· Strong problem-solving and debugging skills.
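To illustrate the REST/OData connectivity mentioned in the requirements above, a minimal Python client that pages through an OData collection might look like the sketch below; the service URL and entity set are hypothetical placeholders, not systems named in this posting.

```python
# Illustrative sketch only: page through an OData v4 collection over REST.
# The service URL and entity set are hypothetical placeholders.
import requests

BASE_URL = "https://example-gateway.local/odata/v4/MaterialService/Materials"  # placeholder

def fetch_all(url: str) -> list[dict]:
    """Follow @odata.nextLink until the server stops returning further pages."""
    records: list[dict] = []
    while url:
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        payload = response.json()
        records.extend(payload.get("value", []))  # OData puts the entities under "value"
        url = payload.get("@odata.nextLink")      # None on the last page
    return records

materials = fetch_all(BASE_URL)
print(f"Fetched {len(materials)} records")
```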
Data Integration Architect
Posted 9 days ago
Job Description
The Global Power Market is amidst a fundamental transition from a central (Predictable, vertically integrated, one way) to a distributed (Intermittent, horizontally networked, bidirectional) model with increasing penetration of Renewables playing a key role in this transition.
RIL's newly created Distributed Renewables (RE) business intends to accelerate this transition by providing safe, reliable, affordable, and accessible distributed green energy solutions to India's population, thereby improving quality of life.
Digital is the key enabler for the business to scale up through the three pillars of agility, delightful customer experience, and data-driven decision making.
Work Location : Navi Mumbai
Department: Digital, Distributed Renewable Energy
Reporting to: Head, Digital Initiatives, Distributed Renewables
Job Overview:
We are seeking a highly skilled and experienced Data and Integration Architect to join our team. This role is crucial for designing and implementing robust data and integration architectures that support our company's strategic goals. The ideal candidate will possess a deep understanding of data architecture, data modeling, integration patterns, and the latest technologies in data integration and management. This position requires a strategic thinker who can collaborate with various stakeholders to ensure our data and integration frameworks are scalable, secure, and aligned with business needs.
Key Responsibilities:
1. Data Architecture Design: Develop and maintain an enterprise data architecture strategy that supports business objectives and aligns with the company's technology roadmap.
2. Integration Architecture Development: Design and implement integration solutions that seamlessly connect disparate systems both internally and with external partners, ensuring data consistency and accessibility.
3. Data Governance and Compliance: Establish and enforce data governance policies and procedures to ensure data integrity, quality, security, and compliance with relevant regulations (a minimal validation sketch follows this list).
4. System Evaluation and Selection: Evaluate and recommend technologies and platforms for data integration, management, and analytics, ensuring they meet the organization's needs.
5. Collaboration with IT and Business Teams: Work closely with IT teams, business analysts, and external partners to understand data and integration requirements and translate them into architectural solutions.
6. Performance and Scalability: Ensure the data and integration architecture supports high performance and scalability, addressing future growth and technology evolution.
7. Best Practices and Standards: Advocate for and implement industry best practices and standards in data management, integration, and architecture design.
8. Troubleshooting and Optimization: Identify and address data and integration bottlenecks, performing regular system audits and optimizations to improve performance and efficiency.
9. Documentation and Training: Develop comprehensive documentation for the data and integration architectures. Provide training and mentorship to IT staff and stakeholders on best practices.
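To make responsibility 3 above slightly more concrete, the sketch below shows the kind of lightweight, rule-based data-quality check a governance framework might run before data is integrated downstream; the column names and rules are hypothetical, not taken from this posting.

```python
# Illustrative sketch only: simple rule-based data-quality checks before integration.
# Column names and rules are hypothetical placeholders.
import pandas as pd

def validate(batch: pd.DataFrame) -> list[str]:
    """Return a list of rule violations; an empty list means the batch passes."""
    issues = []
    if batch["customer_id"].isna().any():
        issues.append("customer_id contains nulls")
    if batch["customer_id"].duplicated().any():
        issues.append("customer_id is not unique")
    if (batch["amount"] < 0).any():
        issues.append("amount contains negative values")
    return issues

batch = pd.DataFrame({"customer_id": [1, 2, 2], "amount": [100.0, -5.0, 30.0]})
problems = validate(batch)
if problems:
    print("Rejected batch:", problems)  # e.g. route to a quarantine table for remediation
else:
    print("Batch accepted")
```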
Qualifications:
1. Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
2. Minimum of 7 years of experience in data architecture, integration, or a related field, with a proven track record of designing and implementing large-scale data and integration solutions.
3. Expert knowledge of data modeling, data warehousing, ETL processes, and integration patterns (APIs, microservices, messaging).
4. Experience with cloud-based data and integration platforms (e.g., AWS, Azure, Google Cloud Platform) and understanding of SaaS, PaaS, and IaaS models.
5. Strong understanding of data governance, data quality management, and compliance regulations (e.g., GDPR, HIPAA).
6. Proficient in SQL and NoSQL databases, data integration tools (e.g., Informatica, Talend, MuleSoft), and data visualization tools (e.g., Tableau, Power BI).
7. Excellent analytical, problem-solving, and project management skills.
8. Outstanding communication and interpersonal abilities, with the skill to articulate complex technical concepts to non-technical stakeholders.
What We Offer:
1. Opportunities for professional growth and advancement.
2. A dynamic and innovative work environment with a strong focus on collaboration and continuous learning.
3. The chance to work on cutting-edge projects, making a significant impact on the company's data strategy and operations.
This position offers an exciting opportunity for a seasoned Data and Integration Architect to play a key role in shaping the future of our data and integration strategies. If you are passionate about leveraging data to drive business success and thrive in a dynamic and collaborative environment, we encourage you to apply.
Data Engineer-Data Integration
Posted today
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing.
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
* Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
* Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
* Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
* Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your primary responsibilities include:
* Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools (a minimal batch-ETL sketch in Python follows this list).
* Liaise with business teams and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing and support UAT.
* Work with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.
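Informatica PowerCenter mappings are built in its Designer GUI rather than in code, so the sketch below is only a rough Python analogue of the batch extract-transform-load steps described above, using the standard library; the file and table names are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal batch extract-transform-load step in plain Python,
# as an analogue of a PowerCenter mapping. File and table names are placeholders.
import csv
import sqlite3

# Extract: read raw rows from a staging file (placeholder file name).
with open("staging_orders.csv", newline="") as fh:
    raw_rows = list(csv.DictReader(fh))

# Transform: drop rows with missing amounts and normalise types.
clean_rows = [
    (row["order_id"], row["customer_id"], float(row["amount"]))
    for row in raw_rows
    if row.get("amount") not in (None, "")
]

# Load: write into a relational target inside a single transaction.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean_rows)
conn.commit()
conn.close()
```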
**Required technical and professional expertise**
* Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter
* Knowledge of cloud platforms, Power BI, and cloud data migration
* Experience in Unix shell scripting and Python
* Experience with relational SQL, Big Data technologies, etc.
**Preferred technical and professional experience**
* Knowledge of MS-Azure Cloud
* Experience in Informatica PowerCenter
* Experience in Unix shell scripting and Python
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted today
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio.
* Data Modelling and Analysis: Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyse and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components (a rough pandas analogue follows this list).
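Ab Initio graphs are built in its own graphical environment rather than in open code, so the sketch below is only a rough pandas analogue of what the Rollup (group-and-aggregate) and Join components do; the column names and sample data are hypothetical.

```python
# Illustrative sketch only: a pandas analogue of Ab Initio's Rollup and Join components.
# Column names and sample data are hypothetical placeholders.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "amount": [100.0, 40.0, 75.0],
})
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "region": ["APAC", "EMEA"],
})

# Rollup: aggregate order amounts per customer.
rolled_up = orders.groupby("customer_id", as_index=False)["amount"].sum()

# Join: enrich the aggregates with customer attributes.
result = rolled_up.merge(customers, on="customer_id", how="inner")
print(result)
```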
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality and documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted 1 day ago
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing.
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
* Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
* Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
* Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
* Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your primary responsibilities include:
* Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools.
* Liaise with business teams and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing and support UAT.
* Work with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.
**Required technical and professional expertise**
* Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter
* Knowledge of cloud platforms, Power BI, and cloud data migration
* Experience in Unix shell scripting and Python
* Experience with relational SQL, Big Data technologies, etc.
**Preferred technical and professional experience**
* Knowledge of MS-Azure Cloud
* Experience in Informatica PowerCenter
* Experience in Unix shell scripting and Python
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted 2 days ago
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements.
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio.
* Data Modeling and Analysis: Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyze and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components.
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality and documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted 2 days ago
Job Description
A career in IBM Consulting is rooted by long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements.
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio.
* Data Modelling and Analysis: Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyse and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components.
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality and documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted 2 days ago
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
Your primary responsibilities include:
* Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
* Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
* Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
* Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.
* Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
* Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
* Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks (an illustrative Snowflake sketch follows this list).
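As an illustration of the Snowflake schema design and ingestion work described above, the sketch below creates a clustered table and bulk-loads staged files using the snowflake-connector-python package; the account, credentials, and object names are hypothetical placeholders.

```python
# Illustrative sketch only: create a clustered Snowflake table and bulk-load staged data.
# Account, credentials, and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # placeholder account locator
    user="ETL_USER",              # placeholder
    password="<secret>",          # placeholder; key-pair auth is preferable in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # Schema design: a clustering key helps Snowflake prune micro-partitions on large tables.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS orders (
            order_date  DATE,
            customer_id NUMBER,
            amount      NUMBER(12, 2)
        )
        CLUSTER BY (order_date)
    """)
    # Ingestion: bulk-load files already uploaded to a named stage.
    cur.execute("COPY INTO orders FROM @orders_stage FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)")
finally:
    cur.close()
    conn.close()
```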
**Required technical and professional expertise**
* Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
* Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
* Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
* Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements
**Preferred technical and professional experience**
* Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
* Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
* Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.