ETL Developer
Posted 2 days ago
Job Description
An ETL (Extract, Transform, Load) Developer designs, develops, and maintains data pipelines and data warehouses, extracting data from various sources, transforming it into a usable format, and loading it into target systems. Key responsibilities include collaborating with business teams to understand data needs, writing complex SQL queries, troubleshooting data issues, optimizing performance, ensuring data accuracy, and documenting processes to support analytics and reporting.
Core Responsibilities
- Data Extraction: Retrieving data from diverse sources like databases, applications, APIs, and files.
- Data Transformation: Converting raw data into a standardized, clean, and consistent format suitable for the target system and business analysis.
- Data Loading: Loading the transformed data into data warehouses or other target systems where it can be accessed for reporting and analytics.
- Designing and Developing ETL Processes: Building and implementing the entire ETL pipeline, including data flows, mappings, and workflows.
- Data Quality and Integrity: Ensuring the accuracy, consistency, and reliability of data throughout the ETL process.
- Performance Optimization: Tuning SQL queries, identifying and resolving performance bottlenecks, and optimizing data loading times.
- Troubleshooting and Debugging: Investigating and resolving any issues that arise within ETL processes or databases.
- Documentation: Creating comprehensive documentation for ETL designs, processes, and architectures for future reference.
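The extract, transform, and load steps described above can be sketched in a few lines of Python. This is a minimal illustration only: the field names are invented, and an in-memory SQLite table stands in for a real data warehouse.

```python
import sqlite3

# Hypothetical raw records "extracted" from a source system.
raw_rows = [
    {"id": 1, "name": " Alice ", "amount": "100.50"},
    {"id": 2, "name": "bob",     "amount": "75.25"},
]

def transform(row):
    # Standardize: trim and title-case names, cast amounts to float.
    return (row["id"], row["name"].strip().title(), float(row["amount"]))

# Load the cleaned rows into a target table (SQLite stands in for a warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 (transform(r) for r in raw_rows))
loaded = conn.execute("SELECT name, amount FROM sales ORDER BY id").fetchall()
print(loaded)  # [('Alice', 100.5), ('Bob', 75.25)]
```

Real pipelines add error handling, incremental loads, and logging around the same three stages, but the shape is the same.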
Collaboration and Communication
- Working with Stakeholders: Collaborating with business analysts, data analysts, and other stakeholders to understand data requirements and business goals.
- Cross-functional Teams: Working closely with technical and business teams to translate requirements into effective data solutions.
System Maintenance and Improvement
- Maintaining ETL Workflows: Ensuring the ongoing health and performance of existing ETL jobs and processes.
- Implementing New Software: Staying updated with new technologies and incorporating them to improve ETL capabilities and data processing.
- Providing Training: Facilitating and training staff on ETL processes and best practices.
ETL Developer
Posted 2 days ago
Job Description
Position Summary:
We are seeking a highly skilled ETL Developer with 5–8 years of experience in data integration, transformation, and pipeline optimization. This role is a key part of our Data Engineering function within the Business Intelligence team, responsible for enabling robust data flows that power enterprise dashboards, analytics, and machine learning models. The ideal candidate has strong SQL and scripting skills, hands-on experience with cloud ETL tools, and a passion for building scalable data infrastructure.
Education Qualification:
- B.Tech (Computer Science, Electronics), MCA, or higher.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines that move and transform data across internal and external systems.
- Collaborate with data analysts, BI developers, and data scientists to support reporting, modeling, and insight generation.
- Build and optimize data models and data marts to support business KPIs and self-service BI.
- Ensure data quality, lineage, and consistency across multiple source systems.
- Monitor and tune performance of ETL workflows, troubleshoot bottlenecks and failures.
- Support the migration of on-premise ETL workloads to cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
- Implement and enforce data governance, documentation, and operational best practices.
- Work with DevOps/DataOps teams to implement CI/CD for data pipelines.
Required Qualifications:
- 5–8 years of hands-on experience in ETL development or data engineering roles.
- Advanced SQL skills and experience with data wrangling on large datasets.
- Proficient with at least one ETL tool (e.g., Informatica, Talend, AWS Glue, SSIS, Apache Airflow, or Domo Magic ETL).
- Familiarity with data modeling techniques (star/snowflake schemas, dimensional models).
- Experience working with cloud data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data warehouse concepts, performance optimization, and data partitioning.
- Experience with Python or scripting languages for data manipulation and automation.
Preferred Qualifications:
- Exposure to BI platforms like Domo, Power BI, or Tableau.
- Knowledge of CI/CD practices in a data engineering context (e.g., Git, Jenkins, dbt).
- Experience working in Agile/Scrum environments.
- Familiarity with data security and compliance standards (GDPR, HIPAA, etc.).
- Experience with API integrations and external data ingestion.
ETL Testing

Posted 1 day ago
Job Description
**Experience:** 6 to 9 years
**Location:** Bhubaneswar
**Job Summary:** We are seeking a detail-oriented and experienced ETL Testing Engineer to join our team. The ETL Testing Engineer will be responsible for designing, executing, and automating test cases to ensure the quality and integrity of data extracted, transformed, and loaded into data warehouses and other data repositories. This role involves working closely with developers, data engineers, and analysts to validate data transformations, business rules, and performance across the ETL process.
**Key Responsibilities**
**ETL Testing:**
+ Design, develop, and execute test cases for ETL processes to verify data extraction, transformation, and loading.
+ Validate data at different stages (source, transformation, and destination) to ensure the accuracy and completeness of data.
+ Identify and report data discrepancies, errors, and anomalies during the ETL process, ensuring that they are addressed promptly.
+ Verify data quality rules, business logic, and mappings used in the ETL processes.
**Data Validation**
+ Perform data validation and reconciliation between source and target systems to ensure data integrity.
+ Create SQL queries to validate data transformations and conduct data profiling to identify data quality issues.
+ Validate ETL performance, including the efficiency of data loads and transformations, and ensure that SLAs are met.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
ETL Developer Analyst

Posted 1 day ago
Job Description
The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for ETL development in Ab Initio or Talend. The overall objective of this role is to design, develop, and optimize ETL workflows and data integration solutions using Ab Initio or Talend. The role involves working closely with business and technology teams to ensure seamless data processing and transformation.
**Responsibilities:**
+ Design, develop, and implement ETL (Extract, Transform, Load) pipelines using Ab Initio or Talend.
+ Work with structured, semi-structured, and unstructured data from multiple sources.
+ Optimize data processing and transformation workflows for efficiency and scalability.
+ Troubleshoot and resolve performance issues in ETL processes.
+ Collaborate with data architects, analysts, and business teams to define data requirements.
+ Ensure data quality, integrity, and governance standards are met.
+ Develop and maintain metadata and documentation for ETL processes.
+ Implement and manage job scheduling and automation tools.
**Preferred Qualifications:**
+ Certifications in Ab Initio, Talend, or cloud technologies are a plus.
+ Experience with CI/CD pipelines for ETL deployment.
**Qualifications:**
+ 4-6 years of relevant experience working with Ab Initio (GDE, Express>IT, Conduct>IT) or Talend (Data Fabric, Open Studio, etc.). Strong knowledge of SQL, PL/SQL, and database systems (Oracle, SQL Server, PostgreSQL, etc.).
+ Experience with ETL optimization, debugging, and performance tuning.
+ Experience in API integration, web services, and cloud platforms (AWS, Azure, GCP) is a plus. Strong understanding of data warehousing concepts and ETL best practices. Hands-on experience with version control tools (Git, SVN, etc.).
+ Strong analytical and problem-solving skills. Excellent communication and teamwork skills.
+ Consistently demonstrates clear and concise written and verbal communication
+ Demonstrated problem-solving and decision-making skills
+ Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements
**Education:**
+ Bachelor's degree/University degree or equivalent experience
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.
Python+AWS+ETL

Posted 1 day ago
Job Description
**Role:** Python+AWS+ETL
**Experience:** 6 to 9 Years
**Location:** AIA Group - Bhubaneswar
We are looking for a Software Lead with the following skill set:
· Strong hands-on experience with Python.
· Strong hands-on experience with data engineering and ETL/ELT.
· Good understanding of big data engineering principles.
· Good understanding of core networking principles as well as cloud infrastructure models.
· Good understanding of how to use the AWS SDK for Python to stitch together different workloads.
· Good understanding of Apache Airflow.
· Good understanding of Snowflake.
· Experience with AWS integration.
· Experience with Amazon Managed Workflows for Apache Airflow (MWAA) would be a plus.
· Git, Jira.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
Subject Matter Expert – ETL Developer
Posted 8 days ago
Job Description
Job Title: Subject Matter Expert – ETL Developer
Mode: Hybrid (3 Days Office + 2 Days Remote)
Job Type: Full-Time / Contract
Location:
- Washermenpet, Chennai
- Chinna Chokkikulam, Madurai
Shift Time: 1:00 PM to 10:00 PM IST
Experience Required: 5 to 9 Years
Joining: Immediate Joiners Preferred (Max 15 Days)
Payroll Company: Smiligence
Budget: Based on Experience
Holidays: As per US Calendar
Contact:
Job Overview
Smiligence is hiring a Senior ETL Developer with strong hands-on experience in Talend, PostgreSQL, AWS, and Linux. The ideal candidate will take complete ownership of data engineering projects, mentor team members, and drive best practices in ETL development and cloud data workflows.
Key Responsibilities
Core Functional Responsibilities
- Lead the design and development of scalable ETL workflows.
- Take ownership of project execution from requirement gathering to delivery.
- Conduct technical interviews and mentor junior developers.
- Create and test proof-of-concepts for data integration solutions.
- Assist in proposal preparation and client requirement analysis.
Technical Responsibilities
- Build ETL pipelines using Talend and PostgreSQL.
- Integrate structured and unstructured data from multiple sources.
- Develop scripts using Python or Shell in a Linux environment.
- Work with AWS services: S3, Glue, RDS, Redshift.
- Implement data versioning using tools like Quilt and Git.
- Schedule jobs via Apache Airflow, Cron, and Jenkins.
- Troubleshoot and optimize data pipelines for performance and reliability.
- Promote coding best practices and participate in peer reviews.
Technical Skill Requirements
ETL & Integration Tools
- Must Have: Talend (Open Studio / DI / Big Data)
- Good to Have: SSIS, SSRS, SAS
- Bonus: Apache NiFi, Informatica
Databases
- Required: PostgreSQL (3+ years)
- Bonus: Oracle, SQL Server, MySQL
Cloud Platforms
- Required: AWS (S3, Glue, RDS, Redshift)
- Bonus: Azure Data Factory, GCP
- Certifications: AWS / Azure (Good to Have)
Operating Systems & Scripting
- Required: Linux, Shell scripting
- Preferred: Python scripting
Data Versioning & Source Control
- Required: Quilt, Git (GitHub/Bitbucket)
- Bonus: DVC, LakeFS, Git LFS
Scheduling & Automation
- Tools: Apache Airflow, Cron, Jenkins, Talend Job Server
Other Tools (Bonus)
- REST APIs, JSON/XML, Spark, Hive, Hadoop
Visualization (Nice to Have)
- Power BI / Tableau
Soft Skills
- Strong verbal and written communication.
- Proven leadership and mentoring experience.
- Independent project execution skills.
- Quick learning ability and willingness to teach.
- Flexible to work in a hybrid setup from Chennai or Madurai.
Data Engineering
Posted 2 days ago
Job Description
Responsibilities:
- Work with stakeholders to understand the data requirements to design, develop, and maintain complex ETL processes.
- Create the data integration and data diagram documentation.
- Lead the data validation, UAT and regression test for new data asset creation.
- Create and maintain data models, including schema design and optimization.
- Create and manage data pipelines that automate the flow of data, ensuring data quality and consistency.
Qualifications and Skills:
- Strong knowledge of Python and PySpark, with the ability to write PySpark scripts for developing data workflows.
- Strong knowledge of SQL, Hadoop, Hive, Azure, Databricks, and Greenplum, with the ability to write SQL to query metadata and tables from data management systems such as Oracle, Hive, Databricks, and Greenplum.
- Familiarity with big data technologies like Hadoop, Spark, and distributed computing frameworks.
- Ability to use Hue to run Hive SQL queries and to schedule Apache Oozie jobs that automate data workflows.
- Experience communicating with stakeholders and collaborating effectively with business teams on data testing.
- Strong problem-solving and troubleshooting skills.
- Ability to establish comprehensive data quality test cases and procedures and to implement automated data validation processes.
- Degree in Data Science, Statistics, Computer Science or other related fields or an equivalent combination of education and experience.
- 3-7 years of experience as a Data Engineer.
- Proficiency in programming languages commonly used in data engineering, such as Python, PySpark, and SQL.
- Experience in Azure cloud computing platform, such as developing ETL processes using Azure Data Factory, big data processing and analytics with Azure Databricks.
- Strong communication, problem solving and analytical skills with the ability to do time management and multi-tasking with attention to detail and accuracy.
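Automated data validation of the kind listed above can be sketched with plain Python. The record fields and quality rules here are hypothetical examples; production pipelines would typically run checks like these inside PySpark or a testing framework before loading.

```python
# Hypothetical extracted records to validate before loading.
records = [
    {"member_id": "M1", "claim_amount": 120.0},
    {"member_id": "M2", "claim_amount": 95.5},
    {"member_id": "M2", "claim_amount": 95.5},   # exact duplicate row
    {"member_id": None, "claim_amount": 40.0},   # missing required key
]

def run_quality_checks(rows, required_key):
    # Rule 1: the required key must be present and non-null.
    null_count = sum(1 for r in rows if r.get(required_key) is None)
    # Rule 2: flag fully duplicated records via a sorted-items fingerprint.
    seen, dup_count = set(), 0
    for r in rows:
        fingerprint = tuple(sorted(r.items()))
        if fingerprint in seen:
            dup_count += 1
        seen.add(fingerprint)
    return {"null_keys": null_count, "duplicates": dup_count}

report = run_quality_checks(records, "member_id")
print(report)  # {'null_keys': 1, 'duplicates': 1}
```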
Talend ETL Developer (Talend Open Studio)
Posted 2 days ago
Job Description
Position: Talend ETL Developer (Talend Open Studio | BigQuery | PostgreSQL | Python | GCP)
Experience: 3–7 Years
Locations: Chennai | Madurai | Coimbatore | Hybrid
Work Timings: 7:00 PM – 4:00 AM IST
About the Role
We are looking for an experienced Talend ETL Developer with strong expertise in Talend Open Studio, BigQuery, PostgreSQL, and Python to join our team. The ideal candidate will be responsible for designing, developing, and optimizing complex ETL pipelines and ensuring seamless integration across cloud data platforms.
Key Responsibilities
- Talend Data Integration (Talend Open Studio) – Design, build, and optimize complex ETL pipelines using Talend Open Studio and Talend Management Console (TMC).
- BigQuery – Work extensively on Google BigQuery for data modelling, partitioning, clustering, and advanced SQL queries (CTEs, arrays, and window functions).
- Python – Develop and automate ETL scripts using Python for data transformation and integration.
- PostgreSQL – Optimize database performance and manage stored procedures, triggers, and materialized views in PostgreSQL.
- Cloud (GCP) – Implement and manage workflows on Google Cloud Platform (GCP), including Pub/Sub, Dataflow, Dataprep, and Cloud Storage.
- Version Control – Collaborate with DevOps teams to manage Git/GitHub version control, branching, and CI/CD pipelines using Jenkins.
- Nice to Have:
- Experience with BigQuery ML (regression, classification, forecasting).
- Exposure to advanced performance optimization for BigQuery slots and reservations.
- Familiarity with offline-first data solutions and advanced integration strategies.
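The window-function pattern mentioned above (`RANK() OVER (PARTITION BY ... ORDER BY ...)`) can be tried locally with SQLite standing in for BigQuery; the table and data below are invented purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('east', 100), ('east', 300), ('west', 50), ('west', 150);
""")

# Rank orders by amount within each region -- the same analytic pattern
# used in BigQuery SQL for top-N-per-group reporting.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY region, rnk
""").fetchall()
print(rows)
```

The same query text runs unchanged on BigQuery apart from dataset-qualified table names; SQLite needs version 3.25+ for window-function support.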
Data Engineering Consultant

Posted 1 day ago
Job Description
**Primary Responsibilities:**
+ Responsible for managing the monthly data refreshes and custom processes for clients, including extraction, loading, and data validation
+ Work closely with engineering, implementation, and downstream teams as the client data is refreshed, to answer questions and resolve data issues that arise
+ Investigate data anomalies to determine root cause, specify appropriate changes and work with engineering and downstream teams as the change is implemented and tested
+ Research client questions on data results by identifying underlying data elements leveraged and providing descriptions of data transformations involved
+ Participate in the ongoing invention, testing and use of tools used by the team to improve processes
+ Be innovative in finding opportunities to improve the process either through process improvement or automation
+ Partner with infrastructure team on migration activities and infrastructure changes related to the product or process
+ Leverage the latest technologies and analyze large volumes of data to solve complex problems facing the health care industry.
+ Build and improve standard operation procedures and troubleshooting documents
+ Report on metrics to surface meaningful results and identify areas for efficiency gain
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 6+ years of experience working with data, analyzing data and understanding data
+ 6+ years of experience working with relational databases (SQL, Oracle)
+ 4+ years of experience working with Provider and Payer data
+ 2+ years of experience with AWS
+ Understanding of relational databases and their principles of operation
+ Intermediate skills using Microsoft Excel and Microsoft Word
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.