3,174 ETL Engineer jobs in India
Senior ETL Engineer
Posted today
Job Description
We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.
Key Responsibilities :
ETL Development and Maintenance :
- Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals.
- Maintain existing ETL processes, ensuring data accuracy and adequate process performance.
Data Warehouse Design & Development :
- Develop and maintain essential database objects, including tables, views, and stored procedures to support data analysis and reporting functions.
- Proficiently utilize SQL queries to retrieve and manipulate data as required.
Data Quality and Analysis :
- Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality.
- Implement data quality improvement strategies to ensure the accuracy and reliability of data (see the validation sketch below).
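For illustration only, here is a minimal sketch of the kind of automated data-quality probe this bullet describes: row counts, null rates, and duplicate business keys checked via SQL from Python. The SQLAlchemy DSN, table, and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal data-quality probe: row counts, null rates, and duplicate keys.
# Assumes a SQLAlchemy engine pointed at the warehouse; the DSN, table,
# and column names below are hypothetical placeholders.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@warehouse-host/dw")  # placeholder DSN

CHECKS = {
    "row_count":      "SELECT COUNT(*) FROM sales_fact",
    "null_customer":  "SELECT COUNT(*) FROM sales_fact WHERE customer_id IS NULL",
    "duplicate_keys": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM sales_fact GROUP BY order_id HAVING COUNT(*) > 1
        ) d
    """,
}

def run_checks() -> dict:
    # Execute each probe and collect a single scalar result per check.
    results = {}
    with engine.connect() as conn:
        for name, sql in CHECKS.items():
            results[name] = conn.execute(text(sql)).scalar_one()
    return results

if __name__ == "__main__":
    for name, value in run_checks().items():
        ok = value > 0 if name == "row_count" else value == 0
        print(f"{name:15s} {value:>10d}  {'OK' if ok else 'FAIL'}")
```

In practice a report like this would be scheduled after each load and wired into alerting, so gaps and inconsistencies surface before business users see them.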
Performance Optimization :
- Investigate and resolve database and query performance issues to ensure optimal system functionality.
- Continuously monitor system performance and make recommendations for improvements.
Business Collaboration :
- Collaborate with business users to gather comprehensive data and reporting requirements.
- Facilitate user-acceptance testing in conjunction with business, resolving any issues that arise.
Qualifications :
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
- Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
- Proficiency with SQL and experience in optimizing queries for performance.
- Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
- Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
- Detail-oriented with strong problem-solving capabilities.
Skills Required
SQL, Data Modeling, ETL Tools, Python, Data Warehousing, Apache Spark, Cloud Technologies, Data Integration
Data ETL Engineer
Posted today
Job Description
Responsibilities:
Requirements:
Pluses:
Senior ETL Engineer
Posted today
Job Description
Company Description
Strategy (Nasdaq: MSTR) is at the forefront of transforming organizations into intelligent enterprises through data-driven innovation. We don't just follow trends; we set them and drive change. As a market leader in enterprise analytics and mobility software, we've pioneered the BI and analytics space, empowering people to make better decisions and revolutionizing how businesses operate.
But that's not all. Strategy is also leading a groundbreaking shift in how companies approach their treasury reserve strategy, boldly adopting Bitcoin as a key asset. This visionary move is reshaping the financial landscape and solidifying our position as a forward-thinking, innovative force in the market. Four years after adopting the Bitcoin Standard, Strategy's stock has outperformed every company in the S&P 500.
Our people are the core of our success. At Strategy, you'll join a team of smart, creative minds working on dynamic projects with cutting-edge technologies. We thrive on curiosity, innovation, and a relentless pursuit of excellence.
Our corporate values—bold, agile, engaged, impactful, and united—are the foundation of our culture. As we lead the charge into the new era of AI and financial innovation, we foster an environment where every employee's contributions are recognized and valued.
Join us and be part of an organization that lives and breathes innovation every day. At Strategy, you're not just another employee; you're a crucial part of a mission to push the boundaries of analytics and redefine financial investment.
Position Overview :
Job Location: Full time, working from the Strategy Pune office.
We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.
Key Responsibilities :
ETL Development and Maintenance :
Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals.
Maintain existing ETL processes, ensuring data accuracy and adequate process performance.
Data Warehouse Design & Development :
Develop and maintain essential database objects, including tables, views, and stored procedures to support data analysis and reporting functions.
Proficiently utilize SQL queries to retrieve and manipulate data as required.
Data Quality and Analysis :
Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality.
Implement data quality improvement strategies to ensure the accuracy and reliability of data.
Performance Optimization :
Investigate and resolve database and query performance issues to ensure optimal system functionality (see the sketch below).
Continuously monitor system performance and make recommendations for improvements.
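As a rough illustration of the performance-investigation duty above, the sketch below pulls the execution plan for a suspect reporting query and flags sequential scans. It assumes a PostgreSQL-style EXPLAIN and a SQLAlchemy connection; the DSN, query, and table names are hypothetical and not taken from the posting.

```python
# Sketch: capture the execution plan for a slow query so that full-scan or
# missing-index problems show up before they reach production dashboards.
# EXPLAIN output differs per database; this assumes PostgreSQL-style plans.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@warehouse-host/dw")  # placeholder DSN

SLOW_QUERY = """
    SELECT c.region, SUM(f.amount)
    FROM sales_fact f
    JOIN customer_dim c ON c.customer_id = f.customer_id
    WHERE f.order_date >= :start_date
    GROUP BY c.region
"""

def explain(query: str, **params) -> str:
    # Return the textual plan, one node per line.
    with engine.connect() as conn:
        rows = conn.execute(text("EXPLAIN " + query), params).fetchall()
    return "\n".join(r[0] for r in rows)

if __name__ == "__main__":
    plan = explain(SLOW_QUERY, start_date="2024-01-01")
    print(plan)
    if "Seq Scan" in plan:
        print("warning: sequential scan on a large table -- consider an index")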
Business Collaboration :
Collaborate with business users to gather comprehensive data and reporting requirements.
Facilitate user-acceptance testing in conjunction with business, resolving any issues that arise.
Qualifications :
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
Proficiency with SQL and experience in optimizing queries for performance.
Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
Detail-oriented with strong problem-solving capabilities.
Additional Information
The recruitment process includes online assessments (English, logic, business) as a first step; we send them via e-mail, so please also check your spam folder.
Platform/ETL Engineer
Posted today
Job Description
Senior Consultant - Platform Engineer
Responsibilities
• Design, build, and optimize platform solutions in Azure Databricks to support scalable, high-performance, and secure data engineering workflows.
• Implement and maintain platform automation for infrastructure, monitoring, and deployment pipelines using modern DevOps practices.
• Align with best practices for platform architecture, governance, and security within Databricks.
• Collaborate with data engineers, modellers, and testers to ensure the platform meets technical and business requirements.
• Proactively monitor, troubleshoot, and resolve issues to maintain platform stability and performance.
Role Requirements
• Strong platform engineering experience with Azure Databricks and related Azure services.
• Familiarity with DevOps principles and CI/CD pipeline tools (e.g., Azure DevOps, Jenkins, GitHub Actions).
• Background in Azure system architecture, networking, Windows and Linux configuration, and DevOps or SRE roles.
• Ability to design scalable solutions using Databricks features such as Delta Lake, Unity Catalog, and cluster policies (see the PySpark sketch after this list).
• Write and deploy Infrastructure as Code (IaC) / Configuration as Code (CaC) (Azure Resource Manager, Puppet, Ansible, Terraform, or similar).
• Strong problem-solving skills, with the ability to think strategically and avoid tactical or short-term solutions where possible.
• Experience working in Agile/Scrum teams.
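To make the Delta Lake bullet above concrete, here is a minimal PySpark sketch of landing a cleansed dataset as a partitioned Delta table and running routine compaction. It assumes a Databricks cluster where the Delta runtime is available; the landing path and the Unity Catalog table name are hypothetical placeholders.

```python
# Sketch: land a cleansed dataset as a partitioned Delta table and compact it.
# Assumes a Databricks cluster (Delta runtime available); names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()        # returns the existing session on Databricks

raw = spark.read.json("/mnt/landing/orders/")     # hypothetical landing path

clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("main.sales.orders_clean"))     # hypothetical Unity Catalog three-part name

# Routine maintenance: compact small files and prune old snapshots.
spark.sql("OPTIMIZE main.sales.orders_clean")
spark.sql("VACUUM main.sales.orders_clean RETAIN 168 HOURS")
```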
Experience/Skillset
• 8+ years of experience in platform engineering or a related field.
• Strong knowledge of Databricks, including Spark optimization, cluster management, and governance frameworks.
• Good knowledge of Azure DevOps-based toolchains.
• Proficiency in Python, SQL, and PySpark, with the ability to implement robust and efficient solutions in Databricks.
• Expertise in Azure services such as Azure Data Factory, Azure Storage, and Azure Key Vault.
• Strong knowledge of SDLC and Agile methodologies.
• Excellent verbal and written communication skills.
• Proven ability to manage multiple projects and priorities effectively while delivering to tight deadlines.
Skills Required
Machine Learning, LLM
Cloud & ETL Engineer Lead Sr
Posted today
Job Description
• Experience in Informatica (PowerCenter/PowerMart/PowerExchange).
• Experience in SQL Server with CUBE.
• Extensive experience with ETL Informatica transformations (Lookup, Filter, Expression, Router, Normalizer, Joiner, Update, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator) and in creating complex mappings.
• Experience in implementing the entire project life cycle.
• Experience with Unix shell scripts for automating the ETL process (see the automation sketch after this list).
• Should be able to provide the end-to-end architecture of the ETL process for loading the staging/landing zone and the audit control process.
• Experience in creating detailed technical design documents, including source-to-target mapping documents.
• Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Oracle, Teradata, and PostgreSQL databases.
• Expertise in data modeling techniques such as dimensional modeling, star schema modeling, snowflake modeling, and fact and dimension tables.
• Experienced in mapping performance optimization in Informatica.
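As an illustration of the ETL-automation bullet, here is a short Python wrapper around Informatica's pmcmd command-line client that starts a workflow and checks its return code. The integration service, domain, folder, and workflow names are placeholders, credentials are assumed to come from environment variables for the sake of the sketch, and the exact pmcmd flags should be confirmed against your PowerCenter version.

```python
# Sketch: kick off an Informatica PowerCenter workflow and fail loudly if it
# does not complete. Wraps the pmcmd command-line client; service, domain,
# folder, and workflow names are placeholders, and credentials would normally
# come from a secrets store rather than environment variables.
import os
import subprocess
import sys

def start_workflow(folder: str, workflow: str) -> int:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_PROD",                  # integration service (placeholder)
        "-d", "Domain_ETL",                # domain (placeholder)
        "-u", os.environ["INFA_USER"],
        "-p", os.environ["INFA_PASSWORD"],
        "-f", folder,
        "-wait",                           # block until the workflow finishes
        workflow,
    ]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    rc = start_workflow("FIN_DW", "wf_load_staging")
    if rc != 0:
        sys.exit(f"workflow failed with pmcmd return code {rc}")
    print("staging load completed")
```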
Senior ETL Engineer/Consultant Specialist
Posted today
Job Description
Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist
In this role, you will:
To be successful in this role, you should meet the following requirements:
ETL Automation Engineer
Posted 3 days ago
Job Description
As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.
Responsibilities
- Develop and Implement Automation Frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
- Test Strategy and Execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
- Data Validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic (see the test sketch after this list).
- Performance Testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load testing strategies.
- Collaborate with Teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
- Continuous Integration: Integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
- Defect Tracking and Reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
- Test Data Management: Develop strategies for handling large volumes of test data while maintaining data security and privacy.
- Tool and Technology Evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.
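A minimal sketch of the automated data-validation idea above, written as pytest-style tests that compare source and target row counts and check a business key. The connection strings and table names are hypothetical placeholders; in a real framework they would come from configuration.

```python
# Sketch: pytest-style regression checks for an ETL load -- row-count parity
# between source and target, plus a not-null check on the business key.
# DSNs and table names are hypothetical placeholders.
from sqlalchemy import create_engine, text

SOURCE = create_engine("postgresql://user:pass@source-host/app")      # placeholder
TARGET = create_engine("postgresql://user:pass@warehouse-host/dw")    # placeholder

def scalar(engine, sql: str) -> int:
    # Run a single-value query and return the result.
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar_one()

def test_row_count_parity():
    src = scalar(SOURCE, "SELECT COUNT(*) FROM orders")
    tgt = scalar(TARGET, "SELECT COUNT(*) FROM stg_orders")
    assert src == tgt, f"row count mismatch: source={src}, target={tgt}"

def test_business_key_not_null():
    nulls = scalar(TARGET, "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL")
    assert nulls == 0, f"{nulls} rows landed without an order_id"
```

Hooked into a CI/CD pipeline, checks like these run on every deployment, which is the continuous-integration point in the list above.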
Job Qualifications:
Requirements and skills
- At least 4+ years of experience.
- Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.).
- Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing.
- Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations.
- Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes.
- Performance testing experience.
- Experience with version control systems like Git.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.
- Strong communication and collaboration skills.
- Attention to detail and a passion for delivering high-quality solutions.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Enthusiastic about learning new technologies and frameworks.
Experience with the following tools and technologies is desired:
- QLIK Replicate
- Matillion ETL
- Snowflake
- Data Vault Warehouse Design
- Power BI
- Azure Cloud, including Logic Apps, Azure Functions, and ADF
ETL Informatica Engineer
Posted today
Job Description
Req ID: 314638
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking an ETL Informatica Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
"Job Duties: As a Data Engineer, you will be a member of the Controls Engineering, Measurement and Analytics (CEMA)-Technology Risk Solutions development team, with specific focus on sourcing data and developing data solutions for Legal, Compliance, Audit, Risk Management functions within Morgan Stanley. In this role you will be primarily responsible for the development of data workflows, views, and stored procedures, in addition to performing data analysis, and monitoring and tuning queries and data loads. You will be working closely with data providers, data scientists, data developers, and data analytics teams to facilitate the implementation of client-specific business requirements and requests.
YOUR KEY RESPONSIBILITIES:
• To develop ETLs, stored procedures, triggers, and views on our DB2-based Data Warehouse
• To perform data profiling and analysis on source system data to ensure that it can be integrated and represented properly in our models (see the profiling sketch after this list)
• To monitor the performance of queries and data loads and perform tuning as necessary
• To provide assistance and guidance during the QA & UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues
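For the data-profiling responsibility above, here is a small illustrative Python sketch that summarizes a source extract before it is mapped into the warehouse. The file name and columns are hypothetical, and in the environment described the same checks would more likely run as SQL against the DB2 source tables.

```python
# Sketch: lightweight profiling of a source extract -- data types, null rates,
# distinct counts, and min/max per column -- before mapping it into the warehouse.
# The extract path and column set are hypothetical.
import pandas as pd

df = pd.read_csv("source_extract.csv")   # hypothetical source-system extract

profile = pd.DataFrame({
    "dtype":    df.dtypes.astype(str),
    "null_pct": df.isna().mean().round(4) * 100,
    "distinct": df.nunique(),
    "min":      df.min(numeric_only=True),
    "max":      df.max(numeric_only=True),
})

print(profile.to_string())
```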
SKILLS / QUALIFICATIONS
• Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field required
• At least 5+ years of experience in data development and solutions in highly complex data environments with large data volumes
• At least 5+ years of experience developing complex ETLs with Informatica PowerCenter
• At least 5+ years of SQL / PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis
• At least 5+ years of experience developing complex stored procedures, triggers, MQTs and views on IBM DB2 (experience with v10.5 a plus)
• Experience with performance tuning DB2 tables, queries, and stored procedures
• Experience with scripting languages like Perl and Python
• Experience with data ingestion into Hadoop a plus
• Experience with Autosys or Airflow a plus
• An understanding of E-R data models (conceptual, logical and physical)
• Strong understanding of advanced data warehouse concepts (factless fact tables, temporal and bi-temporal models, etc.)
• Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions
• Experience with both Waterfall and Agile development methodologies
• Strong communication skills, both verbal and written. Capable of collaborating effectively across a variety of IT and business groups, regions, and roles, and able to interact effectively with all levels.
• Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision; able to manage a complex, ever-changing priority list and resolve conflicts between competing priorities.
• Strong problem-solving skills; able to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.
Minimum Skills Required: Must have strong hands-on Informatica experience:
• At least 5+ years of experience developing complex ETLs with Informatica PowerCenter
• At least 5+ years of experience in data development and solutions in highly complex data environments with large data volumes
• At least 5+ years of SQL / PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis
• At least 5+ years of experience developing complex stored procedures, triggers, MQTs and views on IBM DB2 (experience with v10.5 a plus)
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.