1,324 ETL jobs in India
ETL Data Integration Specialist
Posted today
Job Description
PTR Global is a leader in providing innovative workforce solutions, dedicated to optimizing talent acquisition and management processes.
We cultivate a dynamic and collaborative environment that empowers our employees to excel and contribute to our clients' success.
Job Summary
We are seeking a highly skilled ETL Developer with expertise in designing, developing, and maintaining ETL processes, as well as data warehouse design and modeling, to support our data integration and business intelligence initiatives.
The ideal candidate will possess proficiency in T-SQL, Azure Data Factory (ADF), and SSIS, along with excellent problem-solving and communication skills.
Key Responsibilities:
- Data Integration Specialist: Design, develop, and maintain ETL processes to support data integration and business intelligence initiatives.
- T-SQL Expert: Utilize T-SQL to write complex queries and stored procedures for data extraction and transformation.
- SSIS Developer: Implement and manage ETL processes using SSIS (SQL Server Integration Services).
- Data Warehouse Designer: Design and model data warehouses to support reporting and analytics needs.
- Quality Assurance Specialist: Ensure data accuracy, quality, and integrity through effective testing and validation procedures.
- Collaborator: Collaborate with business analysts and stakeholders to understand data requirements and deliver solutions that meet their needs.
- Process Monitor: Monitor and troubleshoot ETL processes to ensure optimal performance and resolve any issues promptly.
- Documentation Specialist: Document ETL processes, workflows, and data mappings to ensure clarity and maintainability.
Requirements:
- Experience: Minimum of 4 years as an ETL Developer or in a similar role.
- T-SQL Proficiency: Writing complex queries and stored procedures.
- SSIS Experience: Developing and managing ETL processes.
- ADF Knowledge: Azure Data Factory and its application in ETL processes.
- Data Warehouse Design: Experience in data warehouse design and modeling.
- Azure Cloud Suite: Knowledge of Microsoft's Azure cloud suite, including Data Factory, Data Storage, Blob Storage, Power BI, and Power Automate.
- Problem-Solving Skills: Strong problem-solving and analytical skills.
- Communication Skills: Excellent communication and interpersonal skills.
- Attention to Detail: Strong attention to detail and commitment to data quality.
- Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field is preferred.
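The staging-to-dimension load pattern these responsibilities describe can be sketched in a few lines. Below is a minimal illustration using Python's built-in sqlite3 in place of SQL Server; table and column names are hypothetical, and in T-SQL the final step would typically be a MERGE statement:

```python
import sqlite3

# Staging-to-dimension upsert: the core pattern behind most warehouse ETL loads.
# sqlite3 stands in for SQL Server here; column/table names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_customer (customer_id INTEGER, name TEXT, city TEXT);
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
""")

# Extract: rows land in staging exactly as received from the source system,
# including a later update for customer 1.
cur.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)",
                [(1, "Asha", "Chennai"), (2, "Ravi", "Pune"), (1, "Asha", "Mumbai")])

# Transform + load: keep only the latest staging row per business key,
# then upsert it into the dimension (a MERGE in T-SQL).
cur.execute("""
    INSERT OR REPLACE INTO dim_customer (customer_id, name, city)
    SELECT customer_id, name, city FROM stg_customer
    WHERE rowid IN (SELECT MAX(rowid) FROM stg_customer GROUP BY customer_id)
""")
conn.commit()

rows = cur.execute(
    "SELECT customer_id, city FROM dim_customer ORDER BY customer_id").fetchall()
print(rows)  # latest record per customer wins
```

The same deduplicate-then-upsert shape appears whether the target is SQL Server, Snowflake, or Redshift; only the upsert syntax changes.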
ETL Data Integration Specialist
Posted today
Job Description
We are seeking a highly skilled ETL data integration specialist to join our team in Chennai. The ideal candidate will be responsible for designing, developing, and maintaining data extraction, transformation, and loading (ETL) processes, as well as data warehouse design and modeling, to support our data integration and business intelligence initiatives.
- Key Responsibilities:
- Design and develop ETL processes using T-SQL, Azure Data Factory (ADF), and SQL Server Integration Services (SSIS).
- Implement and manage data warehousing and reporting solutions to meet the needs of stakeholders.
- Collaborate with business analysts and data scientists to understand data requirements and deliver solutions that meet their needs.
- Maintain and improve existing ETL processes to ensure optimal performance and data quality.
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 4 years of experience as an ETL developer or in a similar role.
- Proficiency in T-SQL, ADF, SSIS, and data warehouse design and modeling.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
ETL
Posted today
Job Description
- Data testing lead activities
- Experience working in an agile development environment; familiarity with JIRA
- Able to write SQL (experience with PostgreSQL preferred)
- Knowledge of automation and scripting using Python
- Familiarity with Selenium test automation via Python
- Familiarity with key financial industry concepts, types of issuers, and securities (preferred)
- Experience with Alteryx (preferred)
- Experience with Q-Test (preferred)
- Analyze and debug any PROD failures during batches and coordinate with corresponding teams for resolution
- Status reporting
- Escalate any issues to the Infosys/CGC management team
- Takes bottom-line responsibility for timely, high-quality test planning and execution
- Coordinates meetings with stakeholders
- Test planning, tracking, and status reporting
- Participates in test process/methodology preparation, requirements reviews, and overall project meetings
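Several of the items above (writing SQL, Python scripting, data testing lead activities) boil down to reconciling a source against a target after a load. A minimal sketch of such a check, using sqlite3 as a stand-in database; the table names are illustrative:

```python
import sqlite3
import hashlib

def reconcile(conn, source_table, target_table, key_col):
    """Row-count and content-hash reconciliation between two tables.

    Counts catch dropped or duplicated rows; ordered per-row hashes
    catch silently corrupted values.
    """
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]

    def table_hash(table):
        h = hashlib.sha256()
        # Order by the business key so load order does not affect the hash.
        for row in cur.execute(f"SELECT * FROM {table} ORDER BY {key_col}"):
            h.update(repr(row).encode())
        return h.hexdigest()

    return {"counts_match": src_count == tgt_count,
            "hashes_match": table_hash(source_table) == table_hash(target_table)}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5);
    INSERT INTO tgt VALUES (2, 20.5), (1, 10.0);
""")
result = reconcile(conn, "src", "tgt", "id")
print(result)  # both checks pass: same data, different load order
```

In a real test suite this function would run against the actual source and warehouse connections, with the assertions wired into the automation framework.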
ETL
Posted today
Job Description
Required Experience & Technical Stack
- Data Engineer with 4+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- Expertise in building robust APIs.
- Experience with Prelim or similar no-code applications required.
- Experience building complex SQL in Snowflake or a similar RDBMS.
- Experience building complex ETL workflows in Snowflake/Matillion or any similar ETL tool.
- 2+ years of hands-on Power BI experience.
- 2+ years of hands-on Python experience.
- Experience in Matillion is preferred.
- Good experience building Power BI dashboards or working with similar BI tools.
- Good problem-solving skills, i.e., backtracking data and API issues.
- Sound knowledge of data warehousing and solution design.
- Hands-on experience with ETL and big data tools (Hadoop, Spark, Kafka, etc.).
- Working experience with Snowflake or Matillion.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Hands-on experience with stream-processing systems (Storm, Spark Streaming, etc.).
- Object-oriented/object-function scripting languages (Python, Java, C++, Scala, etc.).
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
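The last item, dependency and workload management, usually means executing transforms in dependency order. A minimal sketch using Python's standard-library graphlib; the job names are hypothetical:

```python
from graphlib import TopologicalSorter

# Each ETL step declares its upstream dependencies; the runner executes
# them in a valid topological order (extracts before transforms, etc.).
jobs = {
    "extract_orders": [],
    "extract_customers": [],
    "transform_join": ["extract_orders", "extract_customers"],
    "load_warehouse": ["transform_join"],
}

def run_pipeline(jobs):
    executed = []
    for step in TopologicalSorter(jobs).static_order():
        executed.append(step)  # a real runner would invoke the step here
    return executed

order = run_pipeline(jobs)
print(order)  # both extracts precede transform_join; load_warehouse is last
```

Tools like Matillion or Airflow provide the same DAG semantics with scheduling, retries, and monitoring layered on top.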
- Experience: 3-5 Years
- Salary: 8 Lac to 15 Lac P.A.
- Industry: IT Software - Database/Data Warehousing
- Qualification: Other Bachelor Degree, M.C.A.
- Key Skills: Power BI, Python, ETL, Snowflake, big data, Data Engineer, Matillion, PostgreSQL, Cassandra
About Company
- Company Name: Hyderabad-based IT company
- About Company: We are an innovative technology firm whose leadership has over 75 years of combined experience. With our 24/7 customer service, our award-winning cloud solutions, blueprints, and processes can help your business achieve the ROI you are looking for while reducing risk.
- Contact Person: Mittal Shah
- Mobile:
ETL
Posted today
Job Description
Date Opened
**05/10/2023**
Job Type
**Permanent**
RSD NO
**6530**
Industry
**IT Services**
City
**Chennai**
State/Province
**Tamil Nadu**
Country
**India**
Zip/Postal Code
**000**
- Role Title: ETL Tester
Job Description:
7+ years of experience in ETL Testing is a must
Educational Qualification: Degree
Work Location: Chennai
Work Mode: Hybrid
Roles & Responsibilities:
- Experience in reporting tool validation
- Expertise in SQL queries to retrieve and validate data from multiple sources
- Retail domain experience
- Experience in Snowflake or other cloud data platforms
Mandatory Skill: ETL Testing
ETL Developer

Posted today
Job Description
We are looking for an experienced Data ETL Developer / BI Engineer who loves solving complex problems across a full spectrum of data & technologies. You will lead the building effort of GBT's new BI platform and manage the legacy platform to seamlessly support our business function around data and analytics. You will create dashboards, databases, and other platforms that allow for the efficient collection and evaluation of BI data.
**What You'll Do on a Typical Day:**
+ Design, implement, and maintain systems that collect and analyze business intelligence data.
+ Design and architect an analytical data store or cluster for the enterprise and implement data pipelines that extract, transform, and load data into an information product that helps the organization reach strategic goals.
+ Create physical and logical data models to store and share data that can be easily consumed for different BI needs.
+ Develop Tableau dashboards and features.
+ Create scalable, high-performance data load and management processes to make data available in near real-time to support on-demand analytics and insights.
+ Translate complex technical and functional requirements into detailed designs.
+ Investigate and analyze alternative solutions to data storing, processing, etc., to ensure the most streamlined approaches are implemented.
+ Serve as a mentor to junior staff by conducting technical training sessions and reviewing project outputs
+ Design, develop, and maintain data models and the ETL processes that populate them.
+ Manage and maintain the database, warehouse, & cluster with other dependent infrastructure.
+ Work closely with data, product, and other teams to implement data analytic solutions.
+ Support production application and Incident management.
+ Help define data governance policies and support data versioning processes
+ Maintain security and data privacy by working closely with the Data Protection Officer internally.
+ Analyze a vast number of data stores and uncover insights
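The near-real-time load responsibility above is commonly implemented as a high-watermark incremental extract: each run pulls only rows newer than the last timestamp seen. A minimal sketch with sqlite3 as a stand-in store; the table and column names are illustrative:

```python
import sqlite3

# High-watermark incremental load: only rows newer than the last-seen
# timestamp are pulled on each run, keeping the target near real-time.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_events (id INTEGER, payload TEXT, updated_at INTEGER);
    CREATE TABLE target_events (id INTEGER, payload TEXT, updated_at INTEGER);
    INSERT INTO source_events VALUES (1, 'a', 100), (2, 'b', 200);
""")

def incremental_load(conn, watermark):
    cur = conn.cursor()
    rows = cur.execute(
        "SELECT id, payload, updated_at FROM source_events WHERE updated_at > ?",
        (watermark,)).fetchall()
    cur.executemany("INSERT INTO target_events VALUES (?, ?, ?)", rows)
    conn.commit()
    # New watermark = max timestamp seen so far (unchanged if nothing new).
    return max([r[2] for r in rows], default=watermark)

wm = incremental_load(conn, 0)        # first run loads everything
conn.execute("INSERT INTO source_events VALUES (3, 'c', 300)")
wm = incremental_load(conn, wm)       # second run picks up only row 3
count = conn.execute("SELECT COUNT(*) FROM target_events").fetchone()[0]
print(wm, count)  # 300 3
```

On Redshift or similar warehouses the same pattern runs on a short schedule, with the watermark persisted in a control table between runs.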
**What We're Looking For:**
+ Degree in computer sciences or engineering
+ Overall 3-5 years of experience in data warehousing, ETL, and data modeling.
+ 2+ years of experience working and managing large data stores, complex data pipelines, and BI solutions.
+ Strong experience in SQL and writing complex queries.
+ Hands-on experience with Tableau development.
+ Hands-on experience with Redshift, data modeling, data warehousing, ETL tools, Python, and shell scripting.
+ Understanding of data warehousing and data modeling techniques
+ Strong data engineering skills on the AWS Cloud Platform are essential.
+ Knowledge of Linux, SQL, and any scripting language
+ Good interpersonal skills and a positive attitude
+ Experience in travel data would be a plus.
**Location**
Gurgaon, India
**The #TeamGBT Experience**
Work and life: Find your happy medium at Amex GBT.
+ **Flexible benefits** are tailored to each country and start the day you do. These include health and welfare insurance plans, retirement programs, parental leave, adoption assistance, and wellbeing resources to support you and your immediate family.
+ **Travel perks:** get a choice of deals each week from major travel providers on everything from flights to hotels to cruises and car rentals.
+ **Develop the skills you want** when the time is right for you, with access to over 20,000 courses on our learning platform, leadership courses, and new job openings available to internal candidates first.
+ **We strive to champion Inclusion** in every aspect of our business at Amex GBT. You can connect with colleagues through our global INclusion Groups, centered around common identities or initiatives, to discuss challenges, obstacles, achievements, and drive company awareness and action.
+ And much more!
All applicants will receive equal consideration for employment without regard to age, sex, gender (and characteristics related to sex and gender), pregnancy (and related medical conditions), race, color, citizenship, religion, disability, or any other class or characteristic protected by law.
Furthermore, we are committed to providing reasonable accommodation to qualified individuals with disabilities. Please let your recruiter know if you need an accommodation at any point during the hiring process. For details regarding how we protect your data, please consult the Amex GBT Recruitment Privacy Statement.
**What if I don't meet every requirement?** If you're passionate about our mission and believe you'd be a phenomenal addition to our team, don't worry about "checking every box;" please apply anyway. You may be exactly the person we're looking for!
ETL Developer

Posted today
Job Description
About Allegis Global Solutions
We are founded on a culture that is passionate about transforming the way the world acquires talent by delivering client-focused solutions that make a difference for businesses worldwide. From refining how businesses manage their contingent workforce to strengthening employer brands to recruit top talent, our integrated solutions drive business results. As an industry leader, we draw upon decades of experience to design innovative tools, products, and processes. We develop competitive practices that position organizations for growth and we deliver the insight needed to succeed in today's global marketplace. As a workplace, we focus on relationships - with each other, our clients and our candidates. In fact, serving others is one of our core values. We support open communication and recognize that giving constructive criticism can be even harder than receiving it. We appreciate the fearless and the passionate, who force us to be better. Everything we do sits on a pillar of diversity - diverse perspectives, backgrounds, and ideas drive innovation and make us successful. See what it's like to work at AGS by searching #LifeAtAGS on any social network.
FOR U.S. ONLY
AGS is an Equal Opportunity/Affirmative Action Employer (M/F/Disability/Veterans). We will consider all applications without regard to race, gender, sexual orientation, gender identity, age, color, religion, national origin, veteran status, disability, genetic information or any other status protected by applicable law. If you would like to request a reasonable accommodation, such as the modification or adjustment of the job application process or interviewing process due to disability, please call or email
Job Description
Job Summary
The Developer role is responsible for developing and supporting integration and automation solutions using Informatica, Unix scripting, PowerCenter mappings and workflows, IDQ mappings and mapplets, UDO transformations, and creation of complex XSDs. Serve as Informatica subject matter expert for team members and clients.
Responsibilities
+ Responsible for establishing, developing, and updating internal Enterprise software development standards and best practices for ETL development using Informatica and extensions (including but not limited to PowerCenter mappings & workflows, IDQ mappings & mapplets, UDO transformations, and creation of complex XSDs), Matillion, Snowflake, UNIX, Java, shell scripting, Salesforce, Azure, and Tableau
+ Analyze, plan, design, develop, test and implement innovative methods for advancing ETL processes.
+ Responsible for partnering with Technical Manager to follow established Enterprise Software Development Life Cycle (SDLC) standards and maintaining the successful adherence to these standards across entire Technology Solutions team
+ Monitors technology advancements impacting Integration Center related tools (Informatica, Tableau, Azure, etc.) and makes recommendations to drive efficiencies and ROI.
+ Utilize skills of communication, presentation, time management, organization and planning to successfully achieve team goals and objectives
+ Responsible for updating Integration Center documentation and best practices library, ensuring this is kept up-to-date for reference and training material
Qualifications
Qualifications, Skills, and Experience
+ Minimum Bachelor's degree in Computer Science or Information Systems
+ 7+ years of software development experience
+ 7+ years of experience using a relational database
+ 7+ years of ETL experience (analyze, plan, design, develop, test, implement, and change)
+ Expert-level experience with Oracle SQL, PL/SQL, Informatica Stack (including but not limited to Power Center mappings & workflow, IDQ mappings & mapplets, UDO transformations, creation of complex XSD), UNIX, Java, Shell scripting, Salesforce, and Tableau
+ 7+ years of experience in data analysis and troubleshooting
+ Hands-on experience with Snowflake databases and creating ELT jobs in Matillion
+ Ability to manage a team with a high-volume workload, providing varying levels of support when required.
+ Expert knowledge of MS Office tools, including MS Excel, and MS Access
+ Experience with data warehouses and sound understanding of data warehousing principles
+ VBA development experience in Excel a plus
+ Must have strong technical, organizational and communication skills (both written and verbal)
+ Must have robust problem-solving capabilities and strong analytical skills, and be flexible and able to handle multiple tasks concurrently
+ Aptitude for learning new technologies and learning on the fly
Additional Information
As a workplace, we focus on relationships - with each other, our clients and our candidates - in fact serving others is one of our core values. We support open communication and recognize that giving constructive criticism can be even harder than receiving it. We appreciate the fearless and the passionate, who force us to be better. Everything we do sits on a pillar of diversity - diverse perspectives, backgrounds and ideas drive innovation and make us successful.
See what it's like to work at AGS by searching #LifeAtAGS on any social network.
ETL Developer
Posted 3 days ago
Job Description
Job Title: Genio OpenText ETL Developer
Location: India
Job Type: Full-Time
Experience: 4 to 8 Years
Job Summary:
We are looking for a talented Genio OpenText ETL Developer to join our team. The ideal candidate will have extensive experience in Extract, Transform, Load (ETL) processes using OpenText Genio. This role involves designing, developing, and maintaining ETL workflows to support data integration and migration projects.
Key Responsibilities:
· Design, develop, and maintain ETL processes using OpenText Genio.
· Collaborate with business analysts and data architects to understand data requirements and translate them into technical specifications.
· Implement data extraction, transformation, and loading processes to integrate data from various sources.
· Optimize ETL workflows for performance and scalability.
· Perform data quality checks and ensure data integrity throughout the ETL process.
· Troubleshoot and resolve ETL-related issues.
· Document ETL processes and maintain technical documentation.
· Provide support and guidance to junior ETL developers.
Qualifications:
· Bachelor’s degree in Computer Science, Information Technology, or a related field.
· Proven experience as an ETL Developer, with a focus on OpenText Genio.
· Strong knowledge of ETL concepts, data integration, and data warehousing.
· Proficiency in SQL and experience with database management systems.
· Familiarity with data modeling and data mapping techniques.
· Excellent problem-solving skills and attention to detail.
· Strong communication and teamwork abilities.
Preferred Qualifications:
· Experience with other ETL tools and technologies.
· Knowledge of Agile development methodologies.
Please share your latest CV, along with the following details:
Full Name:
Email ID:
Phone Number:
Alternative Contact No:
Current Location:
Preferred Location:
Current Company:
Availability time for the call:
Notice Period:
Total years of Experience:
Years of Experience in ETL Development:
Years of Experience in OpenText:
Years of Experience in Genio:
ETL Developer
Posted 3 days ago
Job Description
Job Title: ETL Developer – DataStage, AWS, Snowflake
Experience: 5–7 Years
Location: Remote
Job Type: Full-Time
Job Summary:
We are seeking a skilled ETL Developer with 5–7 years of experience in data integration and transformation using IBM DataStage, AWS services, and Snowflake. The ideal candidate should have a strong background in designing, developing, and deploying scalable ETL solutions. Experience with IBM Cloud Pak for Data (CP4D) is a plus.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines using IBM DataStage.
- Integrate and manage data workflows across AWS and Snowflake environments.
- Collaborate with data architects, analysts, and business users to understand data requirements.
- Optimize data loading and transformation processes for performance and scalability.
- Monitor and troubleshoot data jobs to ensure data integrity and availability.
- Contribute to data modeling and architecture discussions as needed.
- Maintain documentation of ETL processes, configurations, and data mappings.
Required Skills & Qualifications:
- 5–7 years of hands-on ETL development experience.
- Strong expertise in IBM DataStage (v11.7 or later preferred).
- Working knowledge of AWS services like S3, Lambda, Glue, Redshift, or equivalent.
- Experience with Snowflake data warehousing.
- Proficient in writing complex SQL queries and performance tuning.
- Familiarity with data quality, data governance, and best practices in data integration.
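Data-quality checks of the kind listed here are typically rule-based gates run before loading. A minimal, hypothetical sketch in Python; the rules and thresholds are illustrative, not from the posting:

```python
# Rule-based data-quality checks an ETL pipeline might run before loading:
# null-key check, duplicate-key check, and a range check on amounts.
def run_quality_checks(rows):
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("id") is None:
            failures.append((i, "id is null"))
        elif row["id"] in seen_ids:
            failures.append((i, "duplicate id"))
        else:
            seen_ids.add(row["id"])
        # Hypothetical business rule: amounts must lie in [0, 1,000,000].
        if not (0 <= row.get("amount", -1) <= 1_000_000):
            failures.append((i, "amount out of range"))
    return failures

rows = [
    {"id": 1, "amount": 250.0},
    {"id": 1, "amount": 99.0},        # duplicate key
    {"id": None, "amount": -5.0},     # null key, bad amount
]
failures = run_quality_checks(rows)
print(failures)
```

In production, failures like these would be routed to a rejects table or quarantine area rather than silently dropped, which is what the governance items above usually require.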
ETL Developer
Posted 3 days ago
Job Description
Job Title: ETL Developer (AWS, Snowflake, IBM DataStage)
Experience: 5 to 7 Years
Location: India (Remote – Work from Home)
Education: B.Tech in Computer Science or related field
Employment Type: Full-Time
Job Summary:
We are seeking a skilled and experienced ETL Developer with 5–7 years of hands-on experience in building and managing ETL pipelines using AWS, Snowflake, and IBM DataStage. Experience with IBM Cloud Pak for Data (CP4D) is a plus. The ideal candidate should have a solid background in data integration, transformation, and cloud data platforms, with a focus on performance, scalability, and security.
Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines using IBM DataStage, AWS Glue, and Snowflake.
- Build scalable data integration solutions to support enterprise-level data warehousing and analytics initiatives.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver effective solutions.
- Optimize and monitor ETL workflows for performance and reliability.
- Develop and maintain data mappings, transformation rules, and data quality checks.
- Ensure secure and efficient data movement across cloud and on-premises environments.
- Document data processes, pipeline structures, and design patterns for future reference.
- Work with version control tools and CI/CD processes for code deployment.
- Engage in troubleshooting and debugging of ETL jobs and data pipeline issues.
Required Skills:
- 5–7 years of professional experience in ETL/Data Engineering roles.
- Strong experience with IBM DataStage development and deployment.
- Hands-on experience with AWS services (S3, Glue, Lambda, Redshift, etc.).
- Proficient in working with Snowflake – development, performance tuning, and data loading.
- Solid understanding of SQL, stored procedures, and data warehousing concepts.
- Experience with scheduling and orchestration tools (e.g., Control-M, Apache Airflow).
- Familiarity with Git or other version control systems.
- Strong analytical and problem-solving skills.
Good to Have:
- Experience with IBM Cloud Pak for Data (CP4D) platform.
- Knowledge of DevOps practices and CI/CD pipelines for data projects.
- Understanding of data governance, security, and compliance practices.
Benefits:
- Flexible remote work environment.
- Opportunity to work with cutting-edge data technologies.
- Collaborative and growth-oriented team culture.