ETL Data Engineer
Posted today
Job Description
Mandatory skills: 8+ years of experience in ETL development, with 4+ years in AWS and PySpark scripting.
1. Experience deploying and running AWS-based data solutions using services such as S3, Lambda, SNS, and Step Functions.
2. Strong proficiency in PySpark (see the illustrative sketch after this list).
3. Hands-on working knowledge of Python packages such as NumPy and Pandas.
4. Sound knowledge of AWS services is a must.
5. Ability to work as an individual contributor.
6. Familiarity with metadata management, data lineage, and data governance principles is a plus.
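For illustration, a minimal PySpark sketch of the kind of S3-based pipeline this role describes. This is a sketch only: the bucket, paths, and column names are hypothetical placeholders, and S3 access assumes the cluster already carries the usual hadoop-aws credentials.

```python
# Minimal PySpark sketch: read raw CSV from S3, clean it, write Parquet.
# Bucket, paths, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Read raw CSV landed in an S3 prefix.
raw = (spark.read
       .option("header", "true")
       .csv("s3://example-bucket/raw/orders/"))

# Basic cleaning: dedupe, type the timestamp, drop non-positive amounts.
cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount").cast("double") > 0))

# Write curated output partitioned by day.
(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-bucket/curated/orders/"))
```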
Good to have:
1. Experience processing large-scale data transformations on both semi-structured and structured data.
2. Experience building data lakes and configuring Delta tables.
3. Good experience with compute and cost optimization.
4. Ability to understand the environment and use case, and to build holistic data integration frameworks.
5. Good experience with MWAA (Amazon Managed Workflows for Apache Airflow); a minimal DAG sketch follows this list.
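Since MWAA appears above, here is a minimal Airflow DAG sketch of the kind MWAA orchestrates. The DAG id, schedule, and task bodies are hypothetical placeholders, not a prescribed pipeline.

```python
# Minimal Airflow DAG sketch (Airflow 2.x style) of the kind MWAA runs.
# DAG id, schedule, and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")  # placeholder for a real extract step

def transform():
    print("apply transformations")  # placeholder for a real transform step

with DAG(
    dag_id="example_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform  # extract runs before transform
```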
Soft skills:
1. Good communication skills for interacting with IT stakeholders and business teams.
2. Ability to understand pain points and see them through to delivery.
Walkin - Data Engineer - ETL
Posted today
Job Description
Location for all roles: Chennai
Walk-In Tech Hiring Drive Chennai
Join the Engineering Team at Northern Arc Capital
Event Details
Saturday, 20th September 2025
9:00 AM onwards, E Block, IIT-Madras Research Park, Kanagam Village, Taramani, Chennai. Face-to-face (F2F) interviews on 20th September from 9:00 AM onwards, for candidates with 6-15 years of experience.
Candidate qualification: BE/B.Tech.
Job Objective
We are seeking a dynamic professional for our Data Engineering role, who will be responsible for designing, developing, and maintaining the data architecture, infrastructure, and pipelines that enable Northern Arc to collect, store, process, and analyze large volumes of data. This role plays a crucial part in ensuring data availability, quality, and accessibility for data-driven decision-making.
Accountabilities:
Data Pipeline Development
- Design, develop, and maintain data pipelines to ingest, process, and transform data from various sources into usable formats.
- Implement data integration solutions that connect disparate data systems, including databases, APIs, and third-party data sources.
Data Storage and Warehousing
- Create and manage data storage solutions, such as data lakes, data warehouses, and NoSQL databases.
- Optimize data storage for performance, scalability, and cost-efficiency.
Data Quality And Governance
- Establish data quality standards and implement data validation and cleansing processes.
- Collaborate with data analysts and data scientists to ensure data consistency and accuracy.
ETL (Extract, Transform, Load)
- Develop ETL processes to transform raw data into a structured and usable format.
- Monitor and troubleshoot ETL jobs to ensure data flows smoothly (a skeleton with logging follows).
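To make the monitoring and troubleshooting point concrete, a bare-bones ETL skeleton with logging so failed runs surface clearly. The extract, transform, and load bodies are hypothetical stand-ins for real source and warehouse calls.

```python
# Bare-bones ETL skeleton with logging so failures surface clearly.
# Source and target details are hypothetical placeholders.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def extract():
    return [{"id": 1, "amount": "42.50"}]  # stand-in for a real source read

def transform(rows):
    # Type the amount field; real pipelines would do much more here.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(rows):
    log.info("would load %d rows to the warehouse", len(rows))

def run():
    try:
        rows = extract()
        log.info("extracted %d rows", len(rows))
        load(transform(rows))
    except Exception:
        log.exception("ETL run failed")  # an alerting hook would go here
        raise

if __name__ == "__main__":
    run()
```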
Data Security And Compliance
- Implement data security measures and access controls to protect sensitive data.
- Ensure compliance with data privacy regulations and industry standards (e.g., GDPR, HIPAA).
Performance Tuning
- Optimize data pipelines and queries for improved performance and efficiency.
- Identify and resolve bottlenecks in data processing.
Data Documentation
- Maintain comprehensive documentation for data pipelines, schemas, and data dictionaries.
- Create and update data lineage and metadata.
Scalability and Reliability
- Design data infrastructure to scale with growing data volumes and business requirements.
- Implement data recovery and backup strategies to ensure data availability and continuity.
Collaboration and Support
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver data solutions.
- Provide technical support and guidance to data users.
Continuous Learning
- Stay updated on emerging technologies, tools, and best practices in data engineering.
- Implement new technologies to enhance data processing capabilities.
Qualifications, Experience, & Competencies:
- 9+ years of experience in a similar role.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience in data engineering, ETL development, and data integration.
- Proficiency in data pipeline orchestration tools (e.g., Apache NiFi, Apache Airflow).
- Strong knowledge of databases (SQL and NoSQL), data warehousing, and data modeling concepts.
- Familiarity with data processing frameworks (e.g., Hadoop, Spark) and cloud-based data services (e.g., AWS, Azure, GCP).
- Experience with version control systems (e.g., Git) and data versioning.
- Excellent programming skills in languages such as Python, SQL, Java, Scala, R, and/or Go.
- Knowledge of data security and compliance standards.
- Strong problem-solving and troubleshooting skills.
- Effective communication and teamwork abilities.
ETL Tester
Posted today
Job Description
Position Summary:
The ETL Tester is responsible for validating and verifying data extraction, transformation, and loading processes to ensure data accuracy and integrity. This role involves designing and executing test cases, identifying data quality issues, and collaborating with data engineers to resolve discrepancies. The ETL Tester will also develop automated testing frameworks to enhance testing efficiency and coverage. A strong understanding of ETL processes and data warehousing concepts is essential for success in this role.
Minimum Qualification:
At least 6 years of IT experience, with a minimum of 4 years of experience in the skills below:
Work experience in ETL testing.
Work experience in SAS and data analysis.
Proficiency in SQL, with the ability to write complex queries in a big data ecosystem.
Should possess AWS Cloud experience.
Extensive data analysis skills and knowledge of scripting to optimize data testing.
Good to have: knowledge of Healthcare Insurance.
Experience working in an Agile team.
Should have excellent business communication skills to interact with business teams and cross-functional teams.
**Responsibilities**:
Review requirements, specifications, and technical design documents to understand and derive test scenarios.
Develop and implement test strategies for complex initiatives.
Design, develop, and execute automation scripts (an example automated check follows this list).
Identify, record, thoroughly document, and track bugs.
Apply strong problem-solving, troubleshooting, and root cause analysis skills.
Perform thorough regression testing when bugs are resolved.
Liaise with cross-functional teams to understand system requirements.
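As one concrete shape an automated ETL check can take, here is a sketch that reconciles row counts and a column total between a source and a target. sqlite3 stands in for the real database drivers, and all table and column names are hypothetical.

```python
# Sketch of an automated ETL reconciliation check: compare source and
# target row counts plus a column total. Connections, tables, and
# columns are hypothetical placeholders.
import sqlite3  # stand-in for the real source/target drivers

def row_count(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def column_sum(conn, table, col):
    return conn.execute(
        f"SELECT COALESCE(SUM({col}), 0) FROM {table}"
    ).fetchone()[0]

def test_load(source, target):
    # Row counts must match exactly after the load.
    assert row_count(source, "orders") == row_count(target, "orders_dw"), \
        "row counts diverge"
    # Column totals must match within floating-point tolerance.
    assert abs(column_sum(source, "orders", "amount")
               - column_sum(target, "orders_dw", "amount")) < 1e-6, \
        "amount totals diverge"
```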
**About Virtusa**
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 30,000 people globally that cares about your growth, one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Senior Data Engineer - ETL Lead
Posted today
Job Description
• Experience in Informatica (PowerCenter/PowerMart/PowerExchange).
• Experience in SQL Server with CUBE.
• Extensive experience with Informatica ETL transformations, including Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator, and in creating complex mappings.
• Experience in implementing the entire project life cycle.
• Experience in Unix shell scripting for automation of the ETL process.
• Should be able to provide the end-to-end architecture of the ETL process for loading the staging/landing zone, including the audit control process.
• Experience in creating detailed technical design documents, including source-to-target mapping documents.
• Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Oracle, Teradata, and PostgreSQL databases.
• Expertise in data modeling techniques such as dimensional data modeling, star schema modeling, snowflake modeling, and fact and dimension tables (a small star-schema sketch follows this list).
• Experienced in performance tuning and mapping optimization in Informatica.
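To ground the star-schema point above, a minimal fact-and-dimension layout. sqlite3 is used only so the sketch runs anywhere; all table and column names are hypothetical, and a real warehouse (Oracle, Teradata, PostgreSQL) would add surrogate-key generation and indexing.

```python
# Minimal star schema: one fact table keyed to two dimensions.
# sqlite3 is only so the sketch runs anywhere; names are hypothetical.
import sqlite3

ddl = """
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- e.g. 20250920
    full_date   TEXT,
    month       INTEGER,
    year        INTEGER
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id  TEXT,                 -- natural key from the source
    segment      TEXT
);
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    quantity     INTEGER,
    amount       REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
```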
ETL Developer
Posted 1 day ago
Job Description
About Pinnacle Group:
Pinnacle Group is a leading workforce solutions provider that empowers organizations to achieve their business objectives through innovative workforce strategies. We specialize in optimizing contingent workforce management and delivering tailored solutions that drive operational excellence. With a commitment to diversity and inclusion, Pinnacle Group fosters an environment of collaboration, growth, and success.
Summary:
We are seeking an experienced ETL Developer to join our dynamic team. The ideal candidate will have a strong background in SQL, ETL processes, and Data Warehouse design and modeling. This role will be responsible for developing, maintaining, and optimizing ETL solutions using SSIS and Azure Data Factory to support our data integration and analytics initiatives.
Key Responsibilities
- ETL Development: Design, develop, and maintain ETL processes using SQL Server Integration Services (SSIS) and Azure Data Factory.
- Data Warehouse Design: Architect and implement data warehouse solutions, ensuring optimal design for data integration, storage, and retrieval.
- Data Modeling: Create and maintain data models, ensuring they meet business requirements and support analytical needs.
- Data Integration: Integrate data from various sources, ensuring data quality and consistency.
- Performance Optimization: Monitor ETL processes and data warehouse performance, implementing optimizations as needed.
- Collaboration: Work closely with business analysts, data scientists, and other stakeholders to understand data requirements and deliver effective solutions.
- Documentation: Maintain comprehensive documentation of ETL processes, data models, and data flows.
- Support: Provide support for data-related issues, troubleshooting and resolving problems as they arise.
Required Skills and Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
Experience:
- 3+ years of experience in ETL development.
- Proven experience with SQL Server Integration Services (SSIS) and Azure Data Factory.
- Experience in data warehouse design and modeling.
Technical Skills:
- Proficient in SQL, including complex queries, stored procedures, and performance tuning.
- Strong understanding of ETL concepts, tools, and best practices.
- Experience with data modeling tools and methodologies.
- Familiarity with data integration from various sources (e.g., APIs, flat files, databases).
- Knowledge of cloud data warehousing solutions (e.g., Azure Synapse Analytics, Snowflake) is a plus.
- Analytical Skills: Excellent problem-solving and analytical skills, with a keen attention to detail.
- Communication: Strong verbal and written communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
- Teamwork: Ability to work effectively both independently and as part of a team.
Preferred Skills
- Experience with other ETL tools and platforms.
- Knowledge of data governance and data quality principles.
- Understanding of big data technologies (e.g., Hadoop, Spark) is a plus.
- Experience with BI tools (e.g., Power BI, Tableau) for data visualization and reporting.
Working Conditions
- Full-time position.
- Hybrid work environment.
This is an onsite position in Chennai, and the shift time is 6:30 PM - 3:30 AM.
ETL Developer
Posted 2 days ago
Job Description
Position Summary:
We are seeking a highly skilled ETL Developer with 5–8 years of experience in data integration, transformation, and pipeline optimization. This role is a key part of our Data Engineering function within the Business Intelligence team, responsible for enabling robust data flows that power enterprise dashboards, analytics, and machine learning models. The ideal candidate has strong SQL and scripting skills, hands-on experience with cloud ETL tools, and a passion for building scalable data infrastructure.
Education Qualification:
- B.Tech (CS, Elec.), MCA, or higher.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines that move and transform data across internal and external systems.
- Collaborate with data analysts, BI developers, and data scientists to support reporting, modeling, and insight generation.
- Build and optimize data models and data marts to support business KPIs and self-service BI.
- Ensure data quality, lineage, and consistency across multiple source systems.
- Monitor and tune performance of ETL workflows, troubleshoot bottlenecks and failures.
- Support the migration of on-premises ETL workloads to cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
- Implement and enforce data governance, documentation, and operational best practices.
- Work with DevOps/DataOps teams to implement CI/CD for data pipelines.
Required Qualifications:
- 5–8 years of hands-on experience in ETL development or data engineering roles.
- Advanced SQL skills and experience with data wrangling on large datasets.
- Proficient with at least one ETL tool (e.g., Informatica, Talend, AWS Glue, SSIS, Apache Airflow, or Domo Magic ETL).
- Familiarity with data modeling techniques (star/snowflake schemas, dimensional models).
- Experience working with cloud data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data warehouse concepts, performance optimization, and data partitioning.
- Experience with Python or scripting languages for data manipulation and automation (a small example follows this list).
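As one small example of such scripting, a chunked pandas pass over a large extract so memory stays bounded. The file path and column names are hypothetical.

```python
# Sketch: process a large CSV in chunks so memory stays bounded.
# File path and column names are hypothetical placeholders.
import pandas as pd

total = 0.0
for chunk in pd.read_csv("sales_extract.csv", chunksize=100_000):
    chunk = chunk.dropna(subset=["amount"])       # basic wrangling step
    total += chunk["amount"].astype(float).sum()  # running reconciliation total

print(f"validated total amount: {total:,.2f}")
```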
Preferred Qualifications:
- Exposure to BI platforms like Domo, Power BI, or Tableau.
- Knowledge of CI/CD practices in a data engineering context (e.g., Git, Jenkins, dbt).
- Experience working in Agile/Scrum environments.
- Familiarity with data security and compliance standards (GDPR, HIPAA, etc.).
- Experience with API integrations and external data ingestion.
ETL Developer
Posted today
Job Description
Company Profile:
AXISCADES is a leading end-to-end engineering solutions and product company. We bring expertise that caters to the digital, engineering, and smart manufacturing needs of large enterprises. With decades of experience in creating innovative, sustainable, and safer products worldwide, AXISCADES delivers business value across the entire engineering lifecycle.
Our deep domain expertise and engineering solution portfolio covers the complete product development lifecycle from concept evaluation to manufacturing support and certification for the Aerospace, Defence, Heavy Engineering, Automotive, Medical Devices & Industrial Product industries.
AXISCADES is headquartered in Bangalore and has offices across India, North America, Europe and the Asia Pacific region. URL:
Skill Set:
• SQL
• Snowflake
• SnapLogic ETL tool
Job Description:
• 6+ years of IT experience in analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures.
• Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and support Business Intelligence reporting (an example upsert follows this list).
• Strong problem-solving and technical skills, coupled with confident decision-making that enables effective solutions and high customer satisfaction.
• Deliver robust solutions through query optimization, ensuring data quality.
• Should have experience in writing functions and stored procedures.
• Strong understanding of data warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling.
• Analyse and translate functional specifications/user stories into technical specifications.
• Good to have: experience in design/development in an ETL tool such as DataStage or SnapLogic.
• Good interpersonal skills, with experience in handling communication and interactions between business and technical teams.
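To illustrate the complex-SQL-for-ETL requirement, a typical staging-to-curated upsert expressed as a Snowflake MERGE, run through the snowflake-connector-python package. The account, schemas, and column names are hypothetical placeholders.

```python
# Sketch: staging-to-curated upsert in Snowflake via MERGE.
# Connection details, schemas, and columns are hypothetical placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO curated.orders AS t
USING staging.orders AS s
  ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET
  amount = s.amount,
  updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
  VALUES (s.order_id, s.amount, s.updated_at)
"""

with snowflake.connector.connect(
    account="example_account",  # hypothetical credentials
    user="etl_user",
    password="***",
) as conn:
    conn.cursor().execute(MERGE_SQL)
```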
ETL Developer
Posted today
Job Description
Roles and Responsibilities:
- Design, develop, and maintain scalable ETL pipelines using PySpark and Python for large-scale data processing.
- Work closely with data architects and analysts to understand business requirements and translate them into technical solutions.
- Develop and optimize SQL queries for data extraction, transformation, and loading from various data sources.
- Integrate data from multiple sources (structured and unstructured) into data lakes or data warehouses on AWS.
- Manage data ingestion from real-time and batch sources, including Kafka, S3, and relational databases.
- Develop and deploy AWS Lambda functions for event-driven processing and automation tasks (see the sketch after this list).
- Ensure data quality, integrity, and consistency throughout the ETL lifecycle.
- Monitor ETL job performance, troubleshoot failures, and optimize pipeline execution time.
- Implement error handling, logging, and alerting mechanisms to ensure pipeline reliability.
- Collaborate with DevOps teams to automate deployments using CI/CD tools.
- Maintain documentation for ETL processes, data models, and system architecture.
- Ensure adherence to data security and compliance standards across all ETL operations.
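As a sketch of the event-driven Lambda work named in the list, a handler that reacts to an S3 object-created event. The event shape follows the standard S3 notification format; the downstream processing hook is a hypothetical placeholder.

```python
# Sketch of an event-driven AWS Lambda handler: react to an S3 object
# landing and hand it to downstream processing. The processing call is
# a hypothetical placeholder.
import json
import urllib.parse

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes keys in event payloads, so decode first.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        print(f"new object: s3://{bucket}/{key}")
        # process_object(bucket, key)  # placeholder for the real pipeline hook
    return {"statusCode": 200, "body": json.dumps("ok")}
```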
ETL Developer
Posted today
Job Description
Key Responsibilities:
- Design, develop, and optimize scalable and reliable ETL pipelines using Python and PySpark.
- Extract data from diverse data sources and transform it to meet analytical and business needs.
- Implement robust data validation, error handling, and quality checks within ETL pipelines (see the sketch after this list).
- Work with large-scale datasets and ensure efficient performance and scalability.
- Collaborate with data engineers, analysts, and stakeholders to gather requirements and deliver end-to-end data solutions.
- Deploy and monitor ETL processes on AWS cloud services such as S3, Glue, Lambda, EMR, Redshift, and Step Functions.
- Ensure compliance with data governance and security standards.
- Troubleshoot and resolve performance bottlenecks and data quality issues.
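As a sketch of in-pipeline validation and error handling, a PySpark job that applies simple quality gates before loading. The paths, columns, and thresholds are hypothetical placeholders.

```python
# Sketch: simple data-quality gates inside a PySpark pipeline.
# Paths, columns, and thresholds are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-quality-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/staged/events/")

total = df.count()
null_ids = df.filter(F.col("event_id").isNull()).count()

# Fail fast rather than load bad data downstream.
if total == 0:
    raise ValueError("no rows staged; aborting load")
if null_ids / total > 0.01:
    raise ValueError(f"{null_ids} rows missing event_id exceeds 1% threshold")

# Only validated rows reach the curated zone.
(df.filter(F.col("event_id").isNotNull())
   .write.mode("append")
   .parquet("s3://example-bucket/validated/events/"))
```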
Mandatory Qualifications:
- 4+ years of professional experience in ETL development.
- Strong programming skills in Python and experience with PySpark for distributed data processing.
- Proficient in SQL and working with relational and non-relational databases.
- Hands-on experience with AWS cloud services related to data engineering (e.g., S3, Glue, EMR, Lambda, Redshift).
- Experience in designing and implementing ETL workflows in a production environment.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
ETL Developer
Posted today
Job Description
Company Description
iorta TechNXT specializes in delivering AI-powered digital transformation solutions tailored to the insurance and banking sectors. Founded in 2021, the company serves dynamic markets across the Asia-Pacific (APAC), Middle East, and Africa, including countries like Malaysia, Philippines, Thailand, Vietnam, Indonesia, Dubai, and Ethiopia. Trusted by leading financial organizations, iorta TechNXT provides customized, results-oriented software solutions. Our diverse product suite includes Salesdrive, Dot, Claimsdrive, and more, enabling businesses to streamline operations, elevate customer experiences, and drive long-term growth.
Role Description
This is a full-time role for an ETL Developer. The ETL Developer will be responsible for the design, development, and implementation of ETL processes. The role includes data extraction, transformation, and loading processes, using various ETL tools and techniques. The developer will collaborate with data analysts to support data integration and data modeling efforts to meet business requirements.
Key Responsibilities
• Design and implement ETL workflows using tools such as SSIS or equivalent.
• Develop and optimize complex SQL queries for data extraction, transformation, and loading.
• Work with relational and NoSQL databases including MySQL, SQL Server, and MongoDB.
• Collaborate with data analysts, engineers, and stakeholders to understand data requirements.
• Monitor ETL jobs for performance, troubleshoot failures, and implement fixes proactively.
• Ensure data integrity, quality, and consistency across all systems.
• Automate repetitive tasks and improve performance of data pipelines (a small automation sketch follows this list).
• Deploy ETL packages across environments and support data migration efforts.
• Maintain documentation for ETL processes and technical designs.
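As a small example of automating a repetitive pipeline task, a housekeeping script that purges staged files past a retention window. The directory and retention period are hypothetical placeholders.

```python
# Sketch: automate a repetitive pipeline task - purge staged files older
# than a retention window. Directory and retention are hypothetical.
import time
from pathlib import Path

STAGING_DIR = Path("/data/staging")  # hypothetical landing area
RETENTION_DAYS = 7

cutoff = time.time() - RETENTION_DAYS * 86_400
if STAGING_DIR.exists():
    for f in STAGING_DIR.glob("*.csv"):
        if f.stat().st_mtime < cutoff:  # older than the retention window
            print(f"purging {f}")
            f.unlink()
```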
Qualifications
• Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
• 3 to 8 years of proven experience in ETL development and data integration.
• Strong understanding of data warehousing concepts, data lakes, and data modeling.
• Strong analytical and problem-solving skills with attention to detail.
• Ability to work independently and in a collaborative team environment.
• Excellent communication and documentation skills.
Technical Skills
• ETL Tools: SSIS (mandatory), Informatica or Talend (added advantage)
• Databases: SQL Server, MySQL, MongoDB
• Cloud Platforms: Experience with AWS services for data integration is a plus
• Programming Languages: Python (for data scripting and automation)
• Development Tools: Visual Studio Code
• Big Data Frameworks: Exposure to Hadoop, Spark is desirable
• Visualization: Power BI, Matplotlib (good to have)
• AI/ML Tools (Nice to Have): BERT, OpenCV
Nice to Have
• Experience working in insurance, banking, or financial services domain.
• Exposure to CI/CD pipelines and version control (e.g., Git).
• Familiarity with Agile methodologies.