33 ETL Development jobs in India
Backend and Data Pipeline Engineer
Posted 5 days ago
Job Description
Job Role: Backend and Data Pipeline Engineer - Python
Location: Remote
Job Type: Full-time
** Only Immediate Joiners **
Job Summary:
The Team:
We’re investing in technology to develop new products that help our customers drive their growth and transformation agenda. These include new data integration, advanced analytics, and modern applications that address new customer needs and are highly visible and strategic within the organization. Do you love building products on platforms at scale while leveraging cutting edge technology? Do you want to deliver innovative solutions to complex problems? If so, be part of our mighty team of engineers and play a key role in driving our business strategies.
The Impact:
We stand at the crossroads of innovation through Data Products, bringing a competitive advantage to our business through the delivery of automotive forecasting solutions. Your work will contribute to the growth and success of our organization and provide valuable insights to our clients.
What’s in it for you:
We are looking for an innovative and mission-driven software/data engineer to make a significant impact by designing and developing AWS cloud-native solutions that enable analysts to forecast long- and short-term trends in the automotive industry. This role requires cutting-edge data and cloud-native technical expertise, as well as the ability to work independently in a fast-paced, collaborative, and dynamic work environment.
Responsibilities:
- Design, develop, and maintain scalable data pipelines including complex algorithms
- Build and maintain UI backend services using Python, C#, or similar, ensuring responsiveness and high performance
- Ensure data quality and integrity through robust validation processes
- Strong understanding of data integration and data modeling concepts
- Lead data integration projects and mentor junior engineers
- Collaborate with cross-functional teams to gather data requirements
- Collaborate with data scientists and analysts to optimize data flow and storage for advanced analytics
- Take ownership of the modules you work on, deliver on time and with quality, ensure software development best practices
- Utilize Redis for caching and data storage solutions to enhance application performance.
What We’re Looking For :
- Bachelor’s degree in Computer Science or a related field
- Strong analytical and problem-solving skills
- 7+ years of experience in Data Engineering/Advanced Analytics
- Proficiency in Python and experience with Flask for backend development.
- Strong knowledge of object-oriented programming.
- AWS proficiency is a big plus: ECR, containers
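The posting above pairs a Python backend with Redis for caching. The read-through caching pattern it implies can be sketched as below; note this is only an illustration, with a plain dict standing in for a real Redis client (which would use `get`/`setex`), and all names and TTL values are assumptions, not the employer's actual implementation.

```python
import time

# In-memory stand-in for a Redis client; a real service would call
# redis.Redis().get(key) / .setex(key, ttl, value) instead.
_cache: dict = {}

def cache_with_ttl(ttl_seconds: float):
    """Decorator that caches a function's result for ttl_seconds."""
    def decorator(fn):
        def wrapper(*args):
            key = (fn.__name__, args)
            hit = _cache.get(key)
            if hit is not None:
                value, expires_at = hit
                if time.monotonic() < expires_at:
                    return value          # cache hit: skip recomputation
            value = fn(*args)             # cache miss: compute and store
            _cache[key] = (value, time.monotonic() + ttl_seconds)
            return value
        return wrapper
    return decorator

calls = 0

@cache_with_ttl(ttl_seconds=60)
def forecast_trend(segment: str) -> str:
    # Hypothetical expensive computation; "segment" is a made-up parameter.
    global calls
    calls += 1                            # count real computations
    return f"trend-for-{segment}"

forecast_trend("suv")
forecast_trend("suv")                     # second call is served from cache
```

Swapping the dict for an actual Redis connection keeps the decorator unchanged, which is the usual appeal of this pattern.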
Data Scientist Manager - Finance Transformation
Posted today
Job Description
About the Job:
Greetings from Teamware Solutions a division of Quantum Leap Consulting Pvt Ltd
Job Description:
Role: Financial Data Scientist ( Finance Transformation Practice)
Experience: 10-15 years
Shift: 2:00 PM - 11:00 PM
Work Mode: Onsite / Bangalore (EGL)
Notice Period: Immediate to 20 Days preferred
The Opportunity: Manager - Finance Transformation
Our Finance Transformation practice helps our clients’ finance functions improve their performance across a broad spectrum. As the role of finance evolves within an organization, balancing the need to be both a strategist and a steward, finance organizations will need to transform their processes, people, and analytic tools in order to manage these changes effectively. These changes include streamlining finance operations by increasing the efficiency of the finance organization and improving the effectiveness of finance by integrating performance management and analytic applications, such as performance score carding, reporting, budgeting, and forecasting capabilities.
Responsibilities:
Engagement Delivery
- Lead and work on engagements with multidisciplinary teams, primarily focused on leading/supporting ERP implementations
- Serve as a key resource to bring technology and Finance processes together during ERP system implementations
- Finance transformation (including within an ERP or Planning and Budgeting tool implementation)
- Conduct organizational diagnostics across all finance processes
- Conduct analyses in support of improvement initiatives (e.g., costing, cost-benefit, benchmarking)
- Analyze financial and operational data
- Establish credibility with existing and new clients by demonstrating subject matter expertise and knowledge in finance, strategy, and EPM
- Vendor and software selection (including business requirements gathering)
Engagement Management
- Support the engagement director/partner in managing engagement risk and project economics, including planning and budgeting
- Prepare and review deliverable content
- Ensure the quality of deliverables meets Company and client expectations
Business Development
- Participate in and lead aspects of the proposal development process
- Contribute to proposal writing
- Team with Company colleagues in other lines of service in support of client needs for Finance Consulting services
- Support the development of "thought leadership" and "point-of-view" documents
People Development:
- Perform the role of coach, and actively participate in staff recruitment and retention activities
- Share knowledge across the global Company network
- Provide input and guidance into the firm’s staffing/resource management process
- Actively participate in staff recruitment
- Be aware of and up to date with all technology and market developments
Requirements:
- 10-15 years of professional experience, including more than four years in consulting/professional services
- Big 4 consulting background and/or boutique consulting experience is a plus
- 10+ years of professional experience in financial services
- Experience managing complex projects within a professional services environment and a global delivery model is ideal
- Experience in handling and building financial data models (or finance information models) to support finance and risk data
- Understanding of dimensional data modeling; able to design and maintain conceptual, logical, and physical data models
- Collaborate with stakeholders to ensure data models align with business rules and data governance standards
- Understanding of enterprise cloud platform (AWS, GCP, or MS Azure) capabilities relating to data staging, normalization, and transformation
- Experience interfacing modern ERP (SAP S/4HANA, Workday Financials, Oracle FSDF or AFCS, Microsoft F&O) and/or EPM (OneStream, Anaplan, Adaptive) tools with enterprise cloud platforms
In addition, the candidate(s) must have experience in one or more of the following fields:
- Understanding of the core Finance areas
- Experience leading both external and internal teams on medium- or large-scale finance transformations
- High level of client service orientation, such as building solid relationships with clients and vendors, approaching clients and vendors in an organized manner, and demonstrating flexibility in prioritizing and completing tasks
- Excellent team player with strong leadership skills and an understanding of personal and team roles; contributes to a positive work environment and proactively seeks guidance and feedback
- Superior project management skills
- Excellent communication skills: oral, written, and presentation
- Strong problem-solving orientation and analytical skills
Project Manager - Data Warehousing and Data Visualization
Posted 1 day ago
Job Description
Project Manager - Data Warehousing and Data Visualization
GormalOne LLP | Mumbai
Work Location : Whitefield Bangalore
GormalOne is on a mission to make dairy farming highly profitable, especially for the smallest farmers living in the most neglected geographies. We are a dairy-focused technology solution provider with a vision to resolve the pain points of everyone in the dairy ecosystem. We are building a comprehensive platform for cattle management where farmers, AITs (Artificial Insemination Technicians), para vets, veterinarians, consultants, and corporates can collaborate and benefit each other using data. Nitara offers an easy-to-use artificial intelligence-enabled herd management system for farmers, veterinary professionals, paraprofessionals, and AITs.
About the Role :
We are seeking a highly experienced Project Manager / Senior Project Manager with a strong background in Data Engineering, Data Warehousing / Data Lakes, Data Pipelines, Data Visualization, database technologies such as MongoDB and Postgres, Big Data, and Analytics to drive data-centric product initiatives for our DairyTech ecosystem. This role will focus on building scalable data platforms, enabling advanced analytics, and delivering actionable insights through intuitive dashboards and visualization tools to empower decision-making across supply chain, operations, sales, and customer engagement.
Key Responsibilities:
Product Strategy & Roadmap
Define and own the product strategy for the Data & Analytics platform to support business growth in the DairyTech domain.
Partner with business stakeholders to identify data-driven opportunities, ensuring alignment with organizational goals.
Prioritize features, enhancements, and integrations based on impact, feasibility, and scalability.
Data Platform & Engineering
Oversee design and development of data warehouse, data lakes, and streaming platforms leveraging Big Data technologies.
Work closely with engineering teams to implement Kafka, ETL frameworks, and scalable architecture for real-time and batch processing.
Ensure data quality, governance, and compliance across all layers of the ecosystem.
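The data-quality responsibility above typically reduces to rule-based checks applied to each record before it lands in the warehouse. A minimal sketch follows; the field names ("farm_id", "milk_yield_l") and rules are hypothetical illustrations, not Nitara's actual schema.

```python
from typing import Callable

# Each rule pairs a human-readable name with a predicate over a record.
Rule = tuple[str, Callable[[dict], bool]]

RULES: list[Rule] = [
    ("farm_id is present", lambda r: bool(r.get("farm_id"))),
    ("milk_yield_l is non-negative", lambda r: r.get("milk_yield_l", 0) >= 0),
]

def validate(record: dict) -> list[str]:
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES if not check(record)]

good = {"farm_id": "F001", "milk_yield_l": 12.5}
bad = {"farm_id": "", "milk_yield_l": -1}
```

In a real pipeline the violation list would be routed to a quarantine table or alerting channel rather than silently dropped, which is where the governance layer comes in.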
Analytics & Visualization
Drive the development of BI dashboards and reporting solutions using Power BI, Tableau, and other visualization tools.
Translate raw data into actionable insights for operations, supply chain optimization, herd management, sales forecasting, and customer experience.
Collaborate with data science teams to support predictive and prescriptive analytics initiatives.
Exposure to AI/ML is nice to have.
Stakeholder Management & Leadership
Act as the bridge between business teams, engineering, and leadership, ensuring effective communication and collaboration.
Mentor and guide cross-functional teams, instilling best practices in data-driven product management.
Partner with external vendors and technology providers to enhance the data ecosystem.
Qualifications & Skills :
Education: Bachelor’s/Master’s in Computer Science, Engineering, Data Science, or related field. MBA preferred.
Experience: 15+ years in Project Management, Data Engineering, or Analytics, with proven leadership in building and scaling data-driven platforms in Product or IT services firms, managing huge data warehouses and reporting teams.
Technical Expertise:
Strong knowledge of Data Warehousing, Big Data frameworks, Kafka, Cloud Data Platforms (AWS/Azure/GCP).
Hands-on exposure to BI tools (Power BI, Tableau, Qlik) and dashboard design.
Experience with ETL processes, data modeling, and governance frameworks.
Understanding of machine learning and AI to provide actionable insights from data.
Domain Knowledge: Prior experience in the DairyTech, AgriTech, FMCG, food & beverage, or retail industries is nice to have.
Soft Skills : Excellent stakeholder management, strong problem-solving, analytical mindset, and ability to lead diverse teams.
Data Integration Engineer
Posted 4 days ago
Job Description
Key Responsibilities:
· Develop and maintain ETL workflows using Informatica.
· Design and implement data pipelines for ingestion, transformation, and loading.
· Work with SQL and Python to manipulate and analyse data.
· Integrate data across various systems and platforms, including GCP and BigQuery.
· Ensure data quality, consistency, and security across all integrations.
· Collaborate with data architects, analysts, and business stakeholders.
Required Skills:
· Strong experience with Informatica and ETL development.
· Proficiency in Python and SQL.
· Hands-on experience with Google Cloud Platform (GCP) and BigQuery.
· Solid understanding of data integration best practices and performance optimization.
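As a rough illustration of the ingestion, transformation, and loading flow this role describes, the sketch below uses Python's built-in sqlite3 purely as a stand-in for the actual Informatica/BigQuery stack; the table and column names are made up.

```python
import sqlite3

# Extract: stage raw rows in a source table (in-memory DB for the sketch;
# the real load target would be BigQuery via its client library).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1999), (2, 500), (3, None)])

# Transform: drop rows with null amounts, convert cents to currency units.
rows = conn.execute(
    "SELECT id, amount_cents / 100.0 FROM raw_orders "
    "WHERE amount_cents IS NOT NULL"
).fetchall()

# Load: write the cleaned rows into the target table.
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
loaded = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

The same extract/transform/load shape carries over whether the implementation is an Informatica mapping or a BigQuery SQL job; only the connectors change.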
SAP Data Integration
Posted 12 days ago
Job Description
This is a remote position.
Duration: 6 months
Location: Remote
Timings: Full Time (as per company timings)
Notice Period: Immediate joiners only
Experience: 6-9 years
JD: We seek a Senior Data Integration Developer with deep expertise in SAP Data Intelligence to support a large-scale enterprise data program. You will be responsible for designing, building, and optimizing SAP DI pipelines for data ingestion, transformation, and integration across multiple systems.
Key Responsibilities:
- Design, develop, and deploy data integration pipelines in SAP Data Intelligence.
- Integrate SAP and non-SAP data sources, ensuring scalability and performance.
- Implement data quality checks, metadata management, and monitoring.
- Collaborate with MDM teams, functional consultants, and business analysts to meet integration requirements.
- Troubleshoot issues and optimize workflows for efficiency.
- Prepare technical documentation and handover materials.
Requirements:
- 6+ years of data integration experience, with at least 3 years in SAP Data Intelligence.
- Strong skills in SAP DI Graphs, Operators, and connectivity with SAP HANA, S/4HANA, and cloud platforms.
- Experience with data transformation, cleansing, and enrichment processes.
- Proficiency in Python, SQL, and integration protocols (REST, OData, JDBC).
- Strong problem-solving and debugging skills.
Data Integration Architect
Posted 9 days ago
Job Description
The global power market is amidst a fundamental transition from a central (predictable, vertically integrated, one-way) to a distributed (intermittent, horizontally networked, bidirectional) model, with increasing penetration of renewables playing a key role in this transition.
RIL's newly created Distributed Renewables (RE) business intends to accelerate this transition by providing safe, reliable, affordable, and accessible distributed green energy solutions to India's population, thereby improving quality of life.
Digital is the key enabler for the business to scale-up through the 3 pillars of agility, delightful customer experience and data driven decision making.
Work Location : Navi Mumbai
Department: Digital, Distributed Renewable Energy
Reporting to: Head, Digital Initiatives, Distributed Renewables
Job Overview:
We are seeking a highly skilled and experienced Data and Integration Architect to join our team. This role is crucial for designing and implementing robust data and integration architectures that support our company's strategic goals. The ideal candidate will possess a deep understanding of data architecture, data modeling, integration patterns, and the latest technologies in data integration and management. This position requires a strategic thinker who can collaborate with various stakeholders to ensure our data and integration frameworks are scalable, secure, and aligned with business needs.
Key Responsibilities:
1. Data Architecture Design: Develop and maintain an enterprise data architecture strategy that supports business objectives and aligns with the company's technology roadmap.
2. Integration Architecture Development: Design and implement integration solutions that seamlessly connect disparate systems both internally and with external partners, ensuring data consistency and accessibility.
3. Data Governance and Compliance: Establish and enforce data governance policies and procedures to ensure data integrity, quality, security, and compliance with relevant regulations.
4. System Evaluation and Selection: Evaluate and recommend technologies and platforms for data integration, management, and analytics, ensuring they meet the organization's needs.
5. Collaboration with IT and Business Teams: Work closely with IT teams, business analysts, and external partners to understand data and integration requirements and translate them into architectural solutions.
6. Performance and Scalability: Ensure the data and integration architecture supports high performance and scalability, addressing future growth and technology evolution.
7. Best Practices and Standards: Advocate for and implement industry best practices and standards in data management, integration, and architecture design.
8. Troubleshooting and Optimization: Identify and address data and integration bottlenecks, performing regular system audits and optimizations to improve performance and efficiency.
9. Documentation and Training: Develop comprehensive documentation for the data and integration architectures. Provide training and mentorship to IT staff and stakeholders on best practices.
Qualifications:
1. Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
2. Minimum of 7 years of experience in data architecture, integration, or a related field, with a proven track record of designing and implementing large-scale data and integration solutions.
3. Expert knowledge of data modeling, data warehousing, ETL processes, and integration patterns (APIs, microservices, messaging).
4. Experience with cloud-based data and integration platforms (e.g., AWS, Azure, Google Cloud Platform) and understanding of SaaS, PaaS, and IaaS models.
5. Strong understanding of data governance, data quality management, and compliance regulations (e.g., GDPR, HIPAA).
6. Proficient in SQL and NoSQL databases, data integration tools (e.g., Informatica, Talend, MuleSoft), and data visualization tools (e.g., Tableau, Power BI).
7. Excellent analytical, problem-solving, and project management skills.
8. Outstanding communication and interpersonal abilities, with the skill to articulate complex technical concepts to non-technical stakeholders.
What We Offer:
1. Opportunities for professional growth and advancement.
2. A dynamic and innovative work environment with a strong focus on collaboration and continuous learning.
3. The chance to work on cutting-edge projects, making a significant impact on the company's data strategy and operations.
This position offers an exciting opportunity for a seasoned Data and Integration Architect to play a key role in shaping the future of our data and integration strategies. If you are passionate about leveraging data to drive business success and thrive in a dynamic and collaborative environment, we encourage you to apply.
Data Engineer-Data Integration
Posted today
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing.
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
* Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
* Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
* Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
* Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your primary responsibilities include:
* Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools.
* Liaise with business teams and technical leads; gather requirements, identify data sources and data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT.
* Work with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.
**Required technical and professional expertise**
* Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
* Knowledge of cloud platforms, Power BI, and cloud data migration
* Experience in Unix shell scripting and Python
* Experience with relational SQL, Big Data, etc.
**Preferred technical and professional experience**
* Knowledge of MS-Azure Cloud
* Experience in Informatica PowerCenter
* Experience in Unix shell scripting and Python
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted today
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio.
* Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyse and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components.
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted 1 day ago
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing.
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
* Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
* Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
* Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
* Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your primary responsibilities include:
* Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools.
* Liaise with business teams and technical leads; gather requirements, identify data sources and data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT.
* Work with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.
**Required technical and professional expertise**
* Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
* Knowledge of cloud platforms, Power BI, and cloud data migration
* Experience in Unix shell scripting and Python
* Experience with relational SQL, Big Data, etc.
**Preferred technical and professional experience**
* Knowledge of MS-Azure Cloud
* Experience in Informatica PowerCenter
* Experience in Unix shell scripting and Python
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer-Data Integration
Posted 2 days ago
Job Description
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements.
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio.
* Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyze and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components.
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.