16 ETL Processes jobs in Chennai

Data Engineering

Chennai, Tamil Nadu EXL

Posted 2 days ago


Job Description

Responsibilities:

  • Work with stakeholders to understand the data requirements to design, develop, and maintain complex ETL processes.
  • Create the data integration and data diagram documentation.
  • Lead the data validation, UAT and regression test for new data asset creation.
  • Create and maintain data models, including schema design and optimization.
  • Create and manage data pipelines that automate the flow of data, ensuring data quality and consistency.

Qualifications and Skills:

  • Strong knowledge of Python and PySpark
  • Ability to write PySpark scripts for developing data workflows.
  • Strong knowledge of SQL, Hadoop, Hive, Azure, Databricks, and Greenplum
  • Ability to write SQL to query metadata and tables from different data management systems, such as Oracle, Hive, Databricks, and Greenplum.
  • Familiarity with big data technologies like Hadoop, Spark, and distributed computing frameworks.
  • Ability to use Hue to run Hive SQL queries and to schedule Apache Oozie jobs that automate data workflows.
  • Good working experience communicating with stakeholders and collaborating effectively with the business team on data testing.
  • Strong problem-solving and troubleshooting skills.
  • Ability to establish comprehensive data quality test cases and procedures, and to implement automated data validation processes.
  • Degree in Data Science, Statistics, Computer Science, or another related field, or an equivalent combination of education and experience.
  • 3-7 years of experience as a Data Engineer.
  • Proficiency in programming languages commonly used in data engineering, such as Python, PySpark, and SQL.
  • Experience in Azure cloud computing platform, such as developing ETL processes using Azure Data Factory, big data processing and analytics with Azure Databricks.
  • Strong communication, problem solving and analytical skills with the ability to do time management and multi-tasking with attention to detail and accuracy.
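The SQL expectations above (querying metadata and tables across systems, plus validation checks) can be sketched in miniature. This is an illustrative stand-in, not EXL's actual stack: an in-memory SQLite database plays the role of Hive or Greenplum, and the `orders` table and its columns are invented for the example.

```python
import sqlite3

# In-memory SQLite stands in here for Hive/Greenplum/Oracle; the table
# and column names are illustrative, not taken from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT);
    INSERT INTO orders VALUES (1, 120.0, 'south'), (2, 80.0, 'north'),
                              (3, 200.0, 'south');
""")

# Query the engine's metadata catalog (sqlite_master plays the role of
# Hive's metastore or Greenplum's pg_catalog).
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

# A typical validation query: row counts per region.
counts = conn.execute(
    "SELECT region, COUNT(*) FROM orders GROUP BY region ORDER BY region"
).fetchall()

print(tables)  # ['orders']
print(counts)  # [('north', 1), ('south', 2)]
```

The same two-step pattern, list what exists in the catalog, then profile it with aggregates, carries over directly to Hive SQL run through Hue or to a PySpark `spark.sql(...)` call.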
This advertiser has chosen not to accept applicants from your region.

Data Engineering Consultant

Chennai, Tamil Nadu UnitedHealth Group

Posted 1 day ago


Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
**Primary Responsibilities:**
+ Responsible for managing the monthly data refreshes and custom processes for clients, including extraction, loading, and data validation
+ Work closely with engineering, Implementation and downstream teams as the client data is refreshed, to answer questions and resolve data issues that arise
+ Investigate data anomalies to determine root cause, specify appropriate changes and work with engineering and downstream teams as the change is implemented and tested
+ Research client questions on data results by identifying underlying data elements leveraged and providing descriptions of data transformations involved
+ Participate in the ongoing invention, testing and use of tools used by the team to improve processes
+ Be innovative in finding opportunities to improve the process either through process improvement or automation
+ Partner with infrastructure team on migration activities and infrastructure changes related to the product or process
+ Leverage the latest technologies and analyze large volumes of data to solve complex problems facing the health care industry.
+ Build and improve standard operation procedures and troubleshooting documents
+ Report on metrics to surface meaningful results and identify areas for efficiency gain
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 6+ years of experience working with data, analyzing data and understanding data
+ 6+ years of experience working with Relational database (SQL, Oracle)
+ 4+ years of experience working with Provider and Payer data
+ 2+ years of experience with AWS
+ Understanding of relational databases and their principles of operation
+ Intermediate skills using Microsoft Excel and Microsoft Word
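The anomaly-investigation work described above often starts with an anti-join: find the rows that were in the source but went missing after a refresh. A minimal sketch, using an in-memory SQLite database and hypothetical `source_members`/`target_members` tables (these names are invented for illustration, not from the posting):

```python
import sqlite3

# Hypothetical source snapshot vs. refreshed target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_members (member_id INTEGER PRIMARY KEY);
    CREATE TABLE target_members (member_id INTEGER PRIMARY KEY);
    INSERT INTO source_members VALUES (1), (2), (3), (4);
    INSERT INTO target_members VALUES (1), (2), (4);
""")

# Anti-join: rows present in the source but missing after the refresh.
# These are the records to trace back through the transformations.
missing = [r[0] for r in conn.execute("""
    SELECT s.member_id
    FROM source_members s
    LEFT JOIN target_members t ON t.member_id = s.member_id
    WHERE t.member_id IS NULL
""")]
print(missing)  # [3]
```

Running the same query in both directions (source-minus-target and target-minus-source) narrows a data discrepancy to dropped rows, duplicated rows, or a key-mapping change before any engineering fix is specified.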
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
#NIC #NJP

Director Data Engineering

Chennai, Tamil Nadu UnitedHealth Group

Posted today


Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
We are seeking a visionary and technically adept Senior Data Engineering Leader to architect, scale, and optimize our data infrastructure. This role will drive the design and implementation of robust, cost-efficient, and observable data pipelines that power analytics, AI/ML, and operational systems across the enterprise. The ideal candidate will be a strategic thinker who can influence senior leadership and lead high-performing engineering teams.
**Primary Responsibilities:**
+ Data Pipeline Architecture: Design and implement scalable, high-performance data pipelines that support batch and real-time processing across diverse data domains
+ Total Cost of Ownership (TCO): Architect solutions with a focus on long-term sustainability, balancing performance, scalability, and cost efficiency
+ Operational Observability: Establish proactive monitoring, alerting, and logging frameworks to ensure system health, data quality, and SLA adherence
+ CI/CD & Automation: Champion automated testing, deployment, and release processes using modern DevOps practices. Ensure robust version control and rollback strategies
+ Blue-Green Deployments: Implement blue-green or canary deployment strategies to minimize downtime and risk during releases
+ Strategic Communication: Translate complex architectural decisions into business value. Confidently present and defend architectural choices to senior technology and business leaders
+ Leadership & Mentorship: Lead and mentor a team of data engineers, fostering a culture of innovation, accountability, and continuous improvement
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
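The blue-green release pattern named in the responsibilities can be sketched for a data platform: consumers query a stable view name, while two physical copies of the table alternate behind it, so a release is a single view repoint and rollback is just as cheap. A minimal illustration with SQLite (table and view names are hypothetical):

```python
import sqlite3

# Two "colors" of the same table; a view is the stable name consumers
# query. Releasing means repointing the view, so rollback is one DDL away.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_blue  (total REAL);
    CREATE TABLE sales_green (total REAL);
    INSERT INTO sales_blue  VALUES (100.0);
    INSERT INTO sales_green VALUES (250.0);
    CREATE VIEW sales AS SELECT * FROM sales_blue;
""")
assert conn.execute("SELECT total FROM sales").fetchone()[0] == 100.0

# Cut over to green once validation passes; blue stays intact for rollback.
conn.executescript("""
    DROP VIEW sales;
    CREATE VIEW sales AS SELECT * FROM sales_green;
""")
assert conn.execute("SELECT total FROM sales").fetchone()[0] == 250.0
```

A canary variant of the same idea routes only a fraction of consumers to green first; the key design choice in both is that downstream readers never reference the colored tables directly.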
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ Proven experience leading enterprise-scale data engineering initiatives
+ Hands-on experience with CI/CD pipelines, infrastructure-as-code (e.g., Terraform), and containerization (e.g., Docker, Kubernetes)
+ Experience with data governance, privacy, and compliance frameworks
+ Experience with AI/ML data pipelines and MLOps practices
+ Experience in healthcare, finance, or any other regulated industries
+ Deep expertise in data architecture, distributed systems, and cloud-native technologies (e.g., AWS, GCP, Azure)
+ Solid command of data modeling, ETL/ELT, orchestration tools (e.g., Airflow, dbt), and streaming platforms (e.g., Kafka)
+ Demonstrated success in implementing observability frameworks (e.g., Prometheus, Grafana, Datadog)
+ Proven excellent communication and stakeholder management skills
#Exetech
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._

Senior Data Engineering

Chennai, Tamil Nadu Logitech

Posted 1 day ago


Job Description

Logitech is the Sweet Spot for people who want their actions to have a positive global impact while having the flexibility to do it in their own way.
**The Role:**
**We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW). The Media and Marketing Data Engineer role will include participating in the loading and extraction of data, including from external sources through APIs, storage buckets (S3, Blob Storage), and marketing-specific data integrations. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.**
**Your Contribution:**
**Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you'll need for success at Logitech. In this role you will:**
+ **Design, develop, document, and test ETL solutions using industry-standard tools.**
+ **Design physical and reporting data models for seamless cross-functional and cross-system data reporting.**
+ **Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.**
+ **Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.**
+ **Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.**
+ **Work closely across our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau**
+ **Collaborate with data architects, analysts, and business stakeholders to gather requirements and translate them into data solutions.**
+ **Participate in the design discussion with enterprise architects and recommend design improvements**
+ **Develop and maintain conceptual, logical, and physical data models with their corresponding metadata.**
+ **Work closely with cross-functional teams to integrate data solutions.**
+ **Create and maintain clear documentation for data processes, data models, and pipelines.**
+ **Integrate Snowflake with various data sources and third-party tools.**
+ **Manage code versioning and deployment of Snowflake objects using CI/CD practices**
**Key Qualifications:**
**For consideration, you must bring the following** **minimum** **skills and behaviors to our team:**
+ **A total of 6 to 10 years of experience in ETL design, development, and populating data warehouses. This includes experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.**
+ **At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.**
+ **Practical experience working with cloud-based Data Warehouses such as Snowflake and Redshift.**
+ **Significant hands-on experience with Snowflake utilities, including SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, Snowflake AI/ML and stored procedures.**
+ **Worked on API based integrations and marketing data**
+ **Design and develop complex data pipelines and ETL workflows in Snowflake using advanced SQL, UDFs, UDTFs, and stored procedures (JavaScript/SQL).**
+ **Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.**
+ **Experience in working with complex SQL functions and transformation of data on large data sets.**
+ **Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMS, and flat files.**
+ **Exposure to standard support ticket management tools.**
+ **A strong understanding of Business Intelligence and Data warehousing concepts and methodologies.**
+ **Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical thinking capabilities.**
+ **A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.**
+ **A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.**
+ **Familiarity with Snowflake's unique features, such as its multi-cluster architecture and shareable data capabilities.**
+ **Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.**
+ **The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.**
+ **Strong communication skills are essential for effective collaboration with both technical and non-technical teams to ensure a clear understanding of data engineering requirements.**
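The Snowflake Streams/Tasks experience listed above centers on one recurring pattern: apply a batch of changes to a dimension with a MERGE-style upsert. The sketch below uses an in-memory SQLite database as a stand-in for Snowflake (the `dim_product` table and the change batch are invented for illustration), with SQLite's `INSERT ... ON CONFLICT` playing the role of Snowflake's `MERGE`:

```python
import sqlite3

# Hypothetical dimension table; in-memory SQLite stands in for Snowflake,
# and the change batch plays the role of rows surfaced by a Stream.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (sku TEXT PRIMARY KEY, price REAL);
    INSERT INTO dim_product VALUES ('A', 10.0), ('B', 20.0);
""")

changes = [('B', 25.0), ('C', 30.0)]  # one update, one new row

# SQLite's upsert syntax approximates Snowflake's MERGE semantics:
# insert new keys, update the price on existing keys.
conn.executemany("""
    INSERT INTO dim_product (sku, price) VALUES (?, ?)
    ON CONFLICT (sku) DO UPDATE SET price = excluded.price
""", changes)

rows = conn.execute(
    "SELECT sku, price FROM dim_product ORDER BY sku").fetchall()
print(rows)  # [('A', 10.0), ('B', 25.0), ('C', 30.0)]
```

In Snowflake itself, a Task would run this merge on a schedule, consuming the Stream so each change batch is processed exactly once.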
**In addition,** **preferable** **skills and behaviors include:**
+ **Exposure to an Oracle ERP environment**
+ **Basic understanding of reporting tools like OBIEE and Tableau**
+ **Exposure to marketing data platforms like Adverity, Fivetran, etc.**
+ **Exposure to Customer data platform**
**Education:**
+ **BS/BTech/MCA/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.**
+ **Fluency in English.**
**Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we're small and flexible enough for every person to take initiative and make things happen. But we're big enough in our portfolio, and reach, for those actions to have a global impact. That's a pretty sweet spot to be in and we're always striving to keep it that way.**
**"** **_All qualified applicants will receive consideration for employment_** **_without regard to race, sex, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability."_**
Across Logitech we empower collaboration and foster play. We help teams collaborate/learn from anywhere, without compromising on productivity or continuity so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises. Within this structure, you may have teams or departments split between working remotely and working in-house.
Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences. Don't meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you!
We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and help you to care for yourself and your loved ones, now and in the future. We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual and social wellbeing so we all can create, achieve and enjoy more and support our families. We can't wait to tell you more about them being that there are too many to list here and they vary based on location.
All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability.
If you require an accommodation to complete any part of the application process, are limited in the ability, are unable to access or use this online application process and need an alternative method for applying, you may contact us toll free at for assistance and we will get back to you as soon as possible.

Data Engineering Lead (Vice President)

Chennai, Tamil Nadu Citigroup

Posted 1 day ago


Job Description

The Applications Development Technology Lead Analyst is a senior level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.
**Responsibilities:**
+ Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements
+ Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
+ Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
+ Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
+ Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
+ Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
+ Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
+ Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
**Qualifications:**
+ 12+ years of relevant experience in Apps Development or systems analysis role
+ Extensive experience in systems analysis and in programming of software applications
+ Experience in managing and implementing successful projects
+ Subject Matter Expert (SME) in at least one area of Applications Development
+ Ability to adjust priorities quickly as circumstances dictate
+ Demonstrated leadership and project management skills
+ Consistently demonstrates clear and concise written and verbal communication
**Education:**
+ Bachelor's degree/University degree or equivalent experience
+ Master's degree preferred
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
**Required Skills (Essential)**
- Programming skills - including concurrent, parallel and distributed systems programming
- Expert level knowledge of **Java**
- Expert level experience with **HTTP, ReSTful web services** and **API design**
- Messaging technologies ( **Kafka** )
- Experience with **Reactive Streams**
**Desirable Skills:**
- Messaging technologies
- Experience with Big Data technologies: **Hadoop, Apache Spark, Python, PySpark**
- Familiarity with **Hadoop SQL** interfaces like Hive, Spark SQL, etc.
- Experience with **Kubernetes**
- Good understanding of the **Linux OS**
- Experience with Gradle or Maven would be beneficial
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.

Senior Manager Software Engineering-Data Engineering

Chennai, Tamil Nadu Caterpillar, Inc.

Posted today


Job Description

**Career Area:**
Technology, Digital and Data
**Job Description:**
**Your Work Shapes the World at Caterpillar Inc.**
When you join Caterpillar, you're joining a global team who cares not just about the work we do - but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here - we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.
We are seeking a **highly skilled and visionary Software Engineering Manager** to lead a team of engineers in building Caterpillar's next-generation **Digital Manufacturing Data Platform**. This platform is central to our Future of Manufacturing initiative, designed to unify and operationalize data across design, engineering, production, and supply chain operations.
The ideal candidate will possess deep expertise in **Big Data, Data Warehousing, real-time data movement** , and **Snowflake-based architectures** . You will architect and deliver scalable, secure, and intelligent data platforms that enable advanced analytics, AI, and digital twin capabilities across global manufacturing ecosystems.
**Key Responsibilities**
**Team Leadership & Management**
+ Lead, mentor, and manage a team of data engineers and platform developers.
+ Foster a culture of technical excellence, collaboration, and continuous learning.
+ Drive Agile practices and ensure timely delivery of high-quality solutions.
**Technical Strategy & Architecture**
+ Architect and oversee the development of scalable, secure, and resilient data platforms.
+ Design and implement near real-time data movement and streaming architectures using tools like Kafka, Spark, and cloud-native services.
+ Establish best practices in data modeling, ETL/ELT, data governance, and metadata management.
**Data Engineering & Snowflake Expertise**
+ Lead the development of robust data pipelines for ingestion, transformation, and delivery using Snowflake, dbt, and cloud-native tools.
+ Optimize data storage, retrieval, and processing for performance, reliability, and cost-efficiency.
+ Implement data quality frameworks, lineage tracking, and schema evolution strategies.
**Big Data & Data Warehousing**
+ Build and maintain large-scale data lakes and data warehouses for structured and unstructured data.
+ Design scalable data architectures to support manufacturing analytics, predictive maintenance, and supply chain optimization.
**Cloud & Platform Engineering**
+ Leverage Azure and AWS services for data ingestion, transformation, and analytics.
+ Deploy software using CI/CD tools (Azure DevOps preferred, Jenkins, AWS CloudFormation).
+ Ensure platform scalability, security, and operational readiness across global deployments.
**AI & Advanced Analytics Enablement**
+ Collaborate with Data Science and AI teams to operationalize ML models and analytics workflows.
+ Promote integration of AI capabilities into data engineering pipelines (e.g., GenAI, MCP, ATA).
+ Support real-time analytics and edge AI use cases in manufacturing environments.
**Stakeholder Engagement**
+ Partner with product managers, manufacturing SMEs, and business leaders to understand requirements and deliver impactful data solutions.
+ Communicate technical concepts to non-technical audiences and influence strategic decisions.
**Must-Have Skills**
+ Proven experience in Big Data processing and Data Warehousing.
+ Expertise in building end-to-end near real-time data pipelines for OLTP & OLAP.
+ Strong architecture exposure for building robust, scalable Data Platforms.
+ Deep expertise in Snowflake, SQL, NoSQL, and distributed data systems.
+ Experience with data transformation tools (dbt, Apache Spark, Azure Data Factory).
+ Strong analytical skills and solid knowledge of computer science fundamentals.
+ Deep exposure to Azure and AWS cloud platforms.
+ Good understanding of AI concepts and latest developments (Gen AI, MCP, ATA, etc.).
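The near real-time pipelines in the must-have list usually reduce to windowed aggregation over an event stream. A minimal sketch of a tumbling-window maximum in plain Python; the sensor events, machine names, and the one-minute window are all invented for illustration (a Kafka/Spark job would run the same aggregation continuously rather than over a list):

```python
from collections import defaultdict

# Hypothetical sensor readings: (epoch_seconds, machine_id, temperature).
events = [
    (0,   "press-1", 70.0),
    (12,  "press-1", 72.0),
    (61,  "press-1", 90.0),
    (65,  "press-2", 40.0),
    (130, "press-1", 95.0),
]

WINDOW = 60  # one-minute tumbling windows

# Bucket each event into its window and keep the per-machine maximum,
# the aggregation a streaming job would maintain incrementally.
maxima = defaultdict(float)
for ts, machine, temp in events:
    window_start = (ts // WINDOW) * WINDOW
    maxima[(window_start, machine)] = max(maxima[(window_start, machine)], temp)

results = sorted(maxima.items())
print(results)
# [((0, 'press-1'), 72.0), ((60, 'press-1'), 90.0),
#  ((60, 'press-2'), 40.0), ((120, 'press-1'), 95.0)]
```

Because `(ts // WINDOW) * WINDOW` assigns every event to exactly one bucket, windows never overlap, which is what makes the tumbling variant cheap to compute and easy to load into an OLAP store.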
**Nice-to-Have Skills**
+ Knowledge of the NVIDIA ecosystem and its applications in data and AI.
+ Experience building production-ready AI solutions and integrating with MLOps workflows.
+ Familiarity with modern data visualization and BI tools (e.g., Power BI, Tableau, Looker).
**Qualifications**
+ Bachelor's or Master's degree in Computer Science, Engineering, or related field.
+ 15+ years of experience in data engineering, with at least 5+ years in a leadership role.
+ Demonstrated success in managing engineering teams and delivering complex data solutions.
+ Excellent communication, leadership, and stakeholder management skills.
Relocation is available for this position.
**Posting Dates:**
October 22, 2025 - October 30, 2025
Caterpillar is an Equal Opportunity Employer. Qualified applicants of any age are encouraged to apply
Not ready to apply? Join our Talent Community.

ETL Developer

Chennai, Tamil Nadu Bright & Smart HR Solutions

Posted 2 days ago


Job Description

An ETL (Extract, Transform, Load) Developer designs, develops, and maintains data pipelines and data warehouses, extracting data from various sources, transforming it into a usable format, and loading it into target systems. Key responsibilities include collaborating with business teams to understand data needs, writing complex SQL queries, troubleshooting data issues, optimizing performance, ensuring data accuracy, and documenting processes to support analytics and reporting.  


Core Responsibilities

  • Data Extraction:  Retrieving data from diverse sources like databases, applications, APIs, and files. 
  • Data Transformation:  Converting raw data into a standardized, clean, and consistent format suitable for the target system and business analysis. 
  • Data Loading:  Loading the transformed data into data warehouses or other target systems where it can be accessed for reporting and analytics. 
  • Designing and Developing ETL Processes:  Building and implementing the entire ETL pipeline, including data flows, mappings, and workflows. 
  • Data Quality and Integrity:  Ensuring the accuracy, consistency, and reliability of data throughout the ETL process. 
  • Performance Optimization:  Tuning SQL queries, identifying and resolving performance bottlenecks, and optimizing data loading times. 
  • Troubleshooting and Debugging:  Investigating and resolving any issues that arise within ETL processes or databases. 
  • Documentation:  Creating comprehensive documentation for ETL designs, processes, and architectures for future reference. 
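The extract, transform, and load steps above fit in one small end-to-end sketch. This is a toy stand-in, not any particular tool: an in-memory CSV plays the source system, SQLite plays the target warehouse, and the column names are invented for the example.

```python
import csv
import io
import sqlite3

# Extract: an in-memory CSV stands in for the source system.
raw = io.StringIO("order_id,amount,currency\n1, 10.50 ,usd\n2, 3.00 ,USD\n")
rows = list(csv.DictReader(raw))

# Transform: trim whitespace, normalise currency codes, cast types.
clean = [(int(r["order_id"]),
          float(r["amount"].strip()),
          r["currency"].strip().upper()) for r in rows]

# Load: write the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

totals = conn.execute("SELECT SUM(amount), COUNT(*) FROM orders").fetchone()
print(totals)  # (13.5, 2)
```

The final aggregate doubles as the data-quality check named above: comparing the loaded sum and row count against the source catches silent truncation or type-coercion losses.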

Collaboration and Communication

  • Working with Stakeholders:  Collaborating with business analysts, data analysts, and other stakeholders to understand data requirements and business goals. 
  • Cross-functional Teams:  Working closely with technical and business teams to translate requirements into effective data solutions. 

System Maintenance and Improvement

  • Maintaining ETL Workflows:  Ensuring the ongoing health and performance of existing ETL jobs and processes. 
  • Implementing New Software:  Staying updated with new technologies and incorporating them to improve ETL capabilities and data processing. 
  • Providing Training:  Facilitating and training staff on ETL processes and best practices. 



ETL Developer

Chennai, Tamil Nadu Tag

Posted 2 days ago


Job Description

Position Summary:

We are seeking a highly skilled ETL Developer with 5–8 years of experience in data integration, transformation, and pipeline optimization. This role is a key part of our Data Engineering function within the Business Intelligence team, responsible for enabling robust data flows that power enterprise dashboards, analytics, and machine learning models. The ideal candidate has strong SQL and scripting skills, hands-on experience with cloud ETL tools, and a passion for building scalable data infrastructure.


Education Qualification:

  • B. Tech (CS, Elec), MCA or higher.


Key Responsibilities:

  • Design, develop, and maintain ETL pipelines that move and transform data across internal and external systems.
  • Collaborate with data analysts, BI developers, and data scientists to support reporting, modeling, and insight generation.
  • Build and optimize data models and data marts to support business KPIs and self-service BI.
  • Ensure data quality, lineage, and consistency across multiple source systems.
  • Monitor and tune performance of ETL workflows, troubleshoot bottlenecks and failures.
  • Support the migration of on-premise ETL workloads to cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
  • Implement and enforce data governance, documentation, and operational best practices.
  • Work with DevOps/DataOps teams to implement CI/CD for data pipelines.
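The pipeline-building responsibilities above follow the classic extract-transform-load shape. As a minimal sketch only, here is that shape in plain Python, using `sqlite3` as a stand-in for a warehouse connection; the `orders` table, CSV fields, and sample values are hypothetical, not from the posting:

```python
import csv
import io
import sqlite3

# Hypothetical raw extract standing in for an upstream source system.
RAW_CSV = """order_id,amount,region
1,120.50,south
2,99.00,north
3,310.75,south
"""

def extract(raw):
    """Extract: parse CSV rows from the source system."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and normalise the region code."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["region"].upper())
        for r in rows
    ]

def load(rows, conn):
    """Load: write transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

In a production pipeline each stage would be a separately monitored, retryable task in an orchestrator such as Airflow rather than three in-process function calls.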


Required Qualifications:

  • 5–8 years of hands-on experience in ETL development or data engineering roles.
  • Advanced SQL skills and experience with data wrangling on large datasets.
  • Proficient with at least one ETL tool (e.g., Informatica, Talend, AWS Glue, SSIS, Apache Airflow, or Domo Magic ETL).
  • Familiarity with data modeling techniques (star/snowflake schemas, dimensional models).
  • Experience working with cloud data platforms (e.g., AWS, Azure, GCP).
  • Strong understanding of data warehouse concepts, performance optimization, and data partitioning.
  • Experience with Python or scripting languages for data manipulation and automation.
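"Advanced SQL skills" in roles like this typically means analytic constructs such as window functions. As an illustrative sketch (the `sales` table and values are invented), ranking each region's rows by amount via `RANK() OVER (PARTITION BY ...)`, run here against SQLite:

```python
import sqlite3

# Hypothetical sales table for demonstrating a window-function query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES
  ('north', 100), ('north', 250), ('south', 75), ('south', 300);
""")

# Top sale per region: rank within each region, keep rank 1.
top_per_region = conn.execute("""
    SELECT region, amount FROM (
        SELECT region, amount,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rk
        FROM sales
    ) WHERE rk = 1
    ORDER BY region
""").fetchall()
```

The same query pattern carries over to warehouse engines such as Snowflake, Redshift, or BigQuery with little or no change.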


Preferred Qualifications:

  • Exposure to BI platforms like Domo, Power BI, or Tableau.
  • Knowledge of CI/CD practices in a data engineering context (e.g., Git, Jenkins, dbt).
  • Experience working in Agile/Scrum environments.
  • Familiarity with data security and compliance standards (GDPR, HIPAA, etc.).
  • Experience with API integrations and external data ingestion.

ETL Developer Analyst

Chennai, Tamil Nadu Citigroup

Posted 1 day ago


Job Description

The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for ETL development in Ab Initio or Talend. The overall objective of this role is to design, develop, and optimize ETL workflows and data integration solutions using Ab Initio or Talend. The role involves working closely with business and technology teams to ensure seamless data processing and transformation.
**Responsibilities:**
+ Design, develop, and implement ETL (Extract, Transform, Load) pipelines using Ab Initio or Talend.
+ Work with structured, semi-structured, and unstructured data from multiple sources.
+ Optimize data processing and transformation workflows for efficiency and scalability.
+ Troubleshoot and resolve performance issues in ETL processes.
+ Collaborate with data architects, analysts, and business teams to define data requirements.
+ Ensure data quality, integrity, and governance standards are met.
+ Develop and maintain metadata and documentation for ETL processes.
+ Implement and manage job scheduling and automation tools.
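The data quality, integrity, and governance duties above usually translate into automated validation gates in the pipeline. A minimal sketch of such a check (field names, rules, and sample rows are hypothetical):

```python
def validate(rows, required=("id", "email")):
    """Split rows into valid records and (index, reason) errors,
    checking for missing required fields and duplicate ids."""
    seen, good, errors = set(), [], []
    for i, row in enumerate(rows):
        missing = [f for f in required if not row.get(f)]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
        elif row["id"] in seen:
            errors.append((i, f"duplicate id: {row['id']}"))
        else:
            seen.add(row["id"])
            good.append(row)
    return good, errors

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": "b@x.com"},   # duplicate id
    {"id": 2, "email": ""},          # missing email
]
good, errors = validate(rows)
```

In Talend or Ab Initio the equivalent checks would be expressed as reject flows or validation components, with the error records routed to a quarantine table for review.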
**Preferred Qualifications:**
+ Certifications in Ab Initio, Talend, or cloud technologies are a plus.
+ Experience with CI/CD pipelines for ETL deployment.
**Qualifications:**
+ 4-6 years of relevant experience working with Talend, Ab Initio (GDE, Express>IT, Conduct>IT) or Talend (Data Fabric, Open Studio, etc.). Strong knowledge of SQL, PL/SQL, and database systems (Oracle, SQL Server, PostgreSQL, etc.).
+ Experience with ETL optimization, debugging, and performance tuning.
+ Experience in API integration, web services, and cloud platforms (AWS, Azure, GCP) is a plus. Strong understanding of data warehousing concepts and ETL best practices. Hands-on experience with version control tools (Git, SVN, etc.).
+ Strong analytical and problem-solving skills. Excellent communication and teamwork skills.
+ Consistently demonstrates clear and concise written and verbal communication
+ Demonstrated problem-solving and decision-making skills
+ Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements
**Education:**
+ Bachelor's degree/University degree or equivalent experience
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.

Subject Matter Expert – ETL Developer

Chennai, Tamil Nadu Smiligence

Posted 8 days ago


Job Description


Job Title: Subject Matter Expert – ETL Developer

Mode: Hybrid (3 Days Office + 2 Days Remote)

Job Type: Full-Time / Contract

Location:

  • Washermenpet, Chennai
  • Chinna Chokkikulam, Madurai

Shift Time: 1:00 PM to 10:00 PM IST

Experience Required: 5 to 9 Years

Joining: Immediate Joiners Preferred (Max 15 Days)

Payroll Company: Smiligence

Budget: Based on Experience

Holidays: As per US Calendar

Contact:


Job Overview

Smiligence is hiring a Senior ETL Developer with strong hands-on experience in Talend, PostgreSQL, AWS, and Linux. The ideal candidate will take complete ownership of data engineering projects, mentor team members, and drive best practices in ETL development and cloud data workflows.


Key Responsibilities

Core Functional Responsibilities

  • Lead the design and development of scalable ETL workflows.
  • Take ownership of project execution from requirement gathering to delivery.
  • Conduct technical interviews and mentor junior developers.
  • Create and test proof-of-concepts for data integration solutions.
  • Assist in proposal preparation and client requirement analysis.

Technical Responsibilities

  • Build ETL pipelines using Talend and PostgreSQL.
  • Integrate structured and unstructured data from multiple sources.
  • Develop scripts using Python or Shell in a Linux environment.
  • Work with AWS services: S3, Glue, RDS, Redshift.
  • Implement data versioning using tools like Quilt, Git.
  • Schedule jobs via Apache Airflow, Cron, Jenkins.
  • Troubleshoot and optimize data pipelines for performance and reliability.
  • Promote coding best practices and participate in peer reviews.
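The scheduling and troubleshooting duties above usually include making individual pipeline steps resilient to transient failures (a dropped database connection, a throttled API). A minimal retry-with-backoff sketch, in plain Python with invented task names; delays are kept tiny so the example runs instantly, where a real job would use seconds or minutes, or delegate retries to Airflow or Jenkins:

```python
import time

def run_with_retries(task, attempts=3, base_delay=0.01):
    """Run task(); on failure, retry with exponential backoff,
    re-raising the last error once attempts are exhausted."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical load step that fails twice before succeeding.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded"

result = run_with_retries(flaky_load)
```

Airflow exposes the same idea declaratively via task-level `retries` and `retry_delay` settings, which is generally preferable to hand-rolled loops in orchestrated pipelines.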


Technical Skill Requirements

ETL & Integration Tools

  • Must Have: Talend (Open Studio / DI / Big Data)
  • Good to Have: SSIS, SSRS, SAS
  • Bonus: Apache NiFi, Informatica

Databases

  • Required: PostgreSQL (3+ years)
  • Bonus: Oracle, SQL Server, MySQL

Cloud Platforms

  • Required: AWS (S3, Glue, RDS, Redshift)
  • Bonus: Azure Data Factory, GCP
  • Certifications: AWS / Azure (Good to Have)

Operating Systems & Scripting

  • Required: Linux, Shell scripting
  • Preferred: Python scripting

Data Versioning & Source Control

  • Required: Quilt, Git (GitHub/Bitbucket)
  • Bonus: DVC, LakeFS, Git LFS

Scheduling & Automation

  • Tools: Apache Airflow, Cron, Jenkins, Talend Job Server

Other Tools (Bonus)

  • REST APIs, JSON/XML, Spark, Hive, Hadoop

Visualization (Nice to Have)

  • Power BI / Tableau


Soft Skills

  • Strong verbal and written communication.
  • Proven leadership and mentoring experience.
  • Independent project execution skills.
  • Quick learning ability and willingness to teach.
  • Flexible to work in a hybrid setup from Chennai or Madurai.