201 ETL Engineer jobs in India

ETL Engineer

Bangalore, Karnataka Kyndryl

Posted 2 days ago

Job Description

**Who We Are**
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
**The Role**
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset-a true data alchemist.
Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
**Who You Are**
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
Required Skills and Experience:
+ **7+ years of experience** in **ETL development** using tools such as **Informatica, Talend, SSIS**, or equivalent platforms.
+ Strong proficiency in **SQL** and working with **relational databases** like **Oracle, SQL Server, and PostgreSQL**.
+ Expertise in **designing, developing, and maintaining ETL processes** to extract, transform, and load data from multiple sources.
+ Skilled in **Windows and Linux system administration** along with **scripting** (Shell, PowerShell, Python).
+ Hands-on experience in **optimizing ETL workflows** for performance, scalability, and reliability.
+ Proficient in **monitoring and troubleshooting ETL jobs** across both Windows and Linux environments.
+ Familiar with **cloud platforms** such as **AWS, Azure, and GCP** for data integration and deployment.
+ Experienced in **collaborating with cross-functional teams** to gather data requirements and deliver efficient solutions.
+ Strong focus on **data quality**, implementing validation rules and consistency checks across ETL processes.
+ Adept at **automating data integration tasks** using Python, Shell, or PowerShell scripts (see the sketch after this list).
+ Capable of creating and maintaining **comprehensive documentation** for ETL processes, data flows, and system configurations.
+ Proven ability in **building and optimizing data pipelines** and automated workflows for large-scale systems.
+ Experienced in **troubleshooting and resolving data pipeline issues** and addressing performance bottlenecks.
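To make the validation and automation bullets above concrete, here is a minimal, purely illustrative sketch of a data-quality gate that could run before a load step. The table names, rules, and local SQLite stand-in are hypothetical placeholders, not Kyndryl's actual stack:

```python
# Illustrative only: a simple data-quality gate ahead of an ETL load.
# Tables, rules, and the SQLite database are hypothetical placeholders.
import sqlite3  # stand-in for any relational source (Oracle, SQL Server, PostgreSQL)

RULES = {
    # rule name -> SQL returning a count of offending rows
    "null_customer_id": "SELECT COUNT(*) FROM staging_orders WHERE customer_id IS NULL",
    "negative_amount": "SELECT COUNT(*) FROM staging_orders WHERE amount < 0",
    "orphan_customer": (
        "SELECT COUNT(*) FROM staging_orders o "
        "LEFT JOIN dim_customer c ON c.customer_id = o.customer_id "
        "WHERE c.customer_id IS NULL"
    ),
}

def run_quality_checks(conn):
    """Return the count of offending rows for each validation rule."""
    return {name: conn.execute(sql).fetchone()[0] for name, sql in RULES.items()}

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")  # hypothetical local staging database
    failures = {k: v for k, v in run_quality_checks(conn).items() if v > 0}
    conn.close()
    if failures:
        # A real pipeline would quarantine bad rows or abort the load here.
        raise SystemExit(f"Data-quality checks failed: {failures}")
    print("All validation rules passed; safe to load into the target tables.")
```

In practice the same pattern would run against Oracle, SQL Server, or PostgreSQL, with failing rows quarantined or the load halted.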
Preferred Skills and Experience
-Experience working as a Data Engineer and/or in cloud modernization
-Experience in Data Modelling, to create a conceptual model of how data is connected and how it will be used in business processes
-Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
-Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
-Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
-Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
**Being You**
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
**What You Can Expect**
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
**Get Referred!**
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Senior ETL Engineer

Bangalore, Karnataka Kyndryl

Posted 2 days ago

Job Description

**Who We Are**
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
**The Role**
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset-a true data alchemist.
Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
**Who You Are**
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
Required Skills and Experience
+ **9+ years of experience** in **ETL development** using tools such as **Informatica, Talend, SSIS**, or equivalent platforms.
+ Strong proficiency in **SQL** and working with **relational databases** like **Oracle, SQL Server, and PostgreSQL**.
+ Expertise in **designing, developing, and maintaining ETL processes** to extract, transform, and load data from multiple sources.
+ Skilled in **Windows and Linux system administration** along with **scripting** (Shell, PowerShell, Python).
+ Hands-on experience in **optimizing ETL workflows** for performance, scalability, and reliability.
+ Proficient in **monitoring and troubleshooting ETL jobs** across both Windows and Linux environments.
+ Familiar with **cloud platforms** such as **AWS, Azure, and GCP** for data integration and deployment.
+ Experienced in **collaborating with cross-functional teams** to gather data requirements and deliver efficient solutions.
+ Strong focus on **data quality**, implementing validation rules and consistency checks across ETL processes.
+ Adept at **automating data integration tasks** using Python, Shell, or PowerShell scripts.
+ Capable of creating and maintaining **comprehensive documentation** for ETL processes, data flows, and system configurations.
+ Proven ability in **building and optimizing data pipelines** and automated workflows for large-scale systems.
+ Experienced in **troubleshooting and resolving data pipeline issues** and addressing performance bottlenecks.
Preferred Skills and Experience
-Experience working as a Data Engineer and/or in cloud modernization
-Experience in Data Modelling, to create a conceptual model of how data is connected and how it will be used in business processes
-Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
-Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
-Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
-Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
**Being You**
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
**What You Can Expect**
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
**Get Referred!**
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Azure-ETL Engineer

Pune, Maharashtra Coforge

Posted today

Job Description

We are seeking a seasoned Data Engineer with strong experience in Azure-based ETL pipelines and data architecture. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data solutions that support business intelligence and analytics needs.


Key Responsibilities

  • Design, implement, and maintain data pipelines for ingestion, processing, and transformation using Azure technologies.
  • Collaborate with analysts and stakeholders to understand data requirements and develop efficient data workflows.
  • Develop and manage data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
  • Build and maintain ETL processes using Azure Data Factory or equivalent tools (a sketch follows this list).
  • Ensure data quality through robust validation and cleansing procedures.
  • Optimize data pipelines for scalability, performance, and cost-efficiency.
  • Monitor and troubleshoot data pipeline issues to ensure data availability and consistency.
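As a rough illustration of the kind of pipeline described above, the sketch below shows an ingest-cleanse-load step in PySpark as it might run on Azure Databricks. The storage path, column names, and table name are hypothetical placeholders, not Coforge's actual configuration:

```python
# Illustrative ingestion-and-transform step for an Azure Databricks job.
# The ADLS path, columns, and target table are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_ingest").getOrCreate()

raw_path = "abfss://raw@examplestorageacct.dfs.core.windows.net/sales/2024/"  # hypothetical ADLS location

raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv(raw_path))

# Basic cleansing and validation before loading to the curated zone.
curated = (raw
           .dropDuplicates(["order_id"])
           .filter(F.col("amount").isNotNull())
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd")))

# Write to a Delta table that downstream BI and analytics can query.
(curated.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("curated.sales_orders"))
```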


Mandatory Skills

  • Azure Data Factory (ADF)
  • Azure Databricks (ADB)
  • Snowflake
  • Data Migration
  • Data Modelling


Good to Have

  • Additional expertise in Snowflake


Share your resume over


ETL Data Engineer

BPMLinks

Posted today

Job Description

Overview:


In the digital age, harnessing the power of data is crucial for business success. We offer a holistic approach that encompasses various aspects of data strategy, roadmap development, and data analytics implementation. Our seasoned professionals will collaborate with you to define an optimal data architecture while ensuring data governance and management practices that guarantee accuracy, reliability, and security.


Now, with the introduction of Generative AI, we empower you to take your data analytics capabilities to new heights. By leveraging advanced algorithms and machine learning, Generative AI enables the creation of valuable insights, predictions, and recommendations that were previously inaccessible. Seamlessly integrated with our Data Analytics Service, Generative AI opens up limitless possibilities for extracting actionable intelligence from your data.


Website:


Required Skills and Qualifications:

- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.

- Minimum 5 years of experience in ETL development.

- Proficiency in Snowflake, including design, development, and administration (see the sketch after this list).

- Solid understanding of AWS services such as S3, Redshift, EC2, and Lambda.

- Strong SQL skills, with experience in complex query optimization and performance tuning.

- Experience with data modeling concepts and techniques.

- Proficiency in DBT (Data Build Tool) for data transformation and modeling.

- Extensive experience with version control systems, particularly GitHub, and proficiency in Git workflows and branching strategies.

- Solid understanding of data modeling principles, ETL processes, and data integration methodologies.

- Excellent problem-solving skills and attention to detail.

- Strong communication and collaboration skills, with the ability to work effectively in a team environment.

- Proven track record of delivering high-quality solutions on time and within budget.
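For illustration only, the sketch below runs a simple Snowflake transformation from Python. The account, credentials, and table names are hypothetical placeholders, and in practice a model like this would more likely be expressed in DBT:

```python
# Illustrative only: rebuilding a small mart table in Snowflake from Python.
# Account, credentials, and objects are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE ANALYTICS.MARTS.DAILY_REVENUE AS
SELECT order_date, SUM(amount) AS revenue
FROM ANALYTICS.STAGING.ORDERS
GROUP BY order_date
"""

try:
    cur = conn.cursor()
    cur.execute(TRANSFORM_SQL)       # Snowflake returns a status row for DDL statements
    print("daily_revenue rebuilt:", cur.fetchone())
finally:
    conn.close()
```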


Regards,

Jahanavi Kaudiki


ETL Data Engineer

Delhi, Delhi The Techgalore

Posted 23 days ago

Job Description

Remote

Please rate the candidate (from 1 to 5, where 1 is lowest and 5 is highest) in these areas:

  1. Big Data
  2. PySpark
  3. AWS
  4. Redshift

Position Summary

Experienced ETL Developers and Data Engineers to ingest and analyze data from multiple enterprise sources into Adobe Experience Platform.

 Requirements 

  • About 4-6 years of professional technology experience, mostly focused on the following:
  • 4+ years of experience developing data ingestion pipelines using PySpark (batch and streaming).
  • 4+ years of experience with multiple data-engineering services on AWS, e.g., Glue, Athena, DynamoDB, Kinesis, Kafka, Lambda, Redshift.
  • 1+ years of experience working with Redshift, especially the following (a Redshift sketch follows this list):
    • Loading data from various sources, e.g., S3 buckets and on-prem data sources, into Redshift.
    • Optimizing data ingestion into Redshift.
    • Designing, developing, and optimizing queries on Redshift using SQL or PySpark SQL.
    • Designing tables in Redshift (distribution keys, compression, vacuuming, etc.).
  • Experience developing applications that consume services exposed as REST APIs.
  • Experience and ability to write and analyze complex, performant SQL.
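For illustration, a minimal sketch of the Redshift patterns listed above: a table designed with a distribution key and sort key, plus a COPY that bulk-loads it from S3. The cluster endpoint, bucket, IAM role, and table layout are hypothetical placeholders:

```python
# Illustrative only: Redshift table design plus a bulk load from S3.
# Endpoint, credentials, bucket, and IAM role are hypothetical placeholders.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS analytics.page_events (
    event_id    BIGINT,
    customer_id BIGINT,
    event_ts    TIMESTAMP,
    payload     VARCHAR(4096)
)
DISTKEY (customer_id)   -- co-locates a customer's rows on one slice for joins
SORTKEY (event_ts);     -- speeds range scans over recent data
"""

COPY_SQL = """
COPY analytics.page_events
FROM 's3://example-bucket/events/2024/05/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="etl_user", password="***",
)
conn.autocommit = True  # Redshift VACUUM cannot run inside a transaction
cur = conn.cursor()
cur.execute(DDL)
cur.execute(COPY_SQL)
cur.execute("VACUUM analytics.page_events;")  # re-sort and reclaim space after the load
cur.close()
conn.close()
```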

Special consideration given for:

  • 2 years of developing and supporting ETL pipelines using enterprise-grade ETL tools like Pentaho, Informatica, or Talend
  • Good knowledge of Data Modelling (design patterns and best practices)
  • Experience with Reporting Technologies (e.g., Tableau, Power BI)

What you'll do

  • Analyze and understand customers' use cases and data sources; extract, transform, and load data from a multitude of customer enterprise sources and ingest it into Adobe Experience Platform.

  • Design and build data ingestion pipelines into the platform using PySpark.

  • Ensure ingestion is designed and implemented in a performant manner to support the throughput and latency needed.

  • Develop and test complex SQL to extract, analyze, and report on the data ingested into Adobe Experience Platform.

  • Ensure the SQL is implemented in line with best practices so that it is performant.

  • Migrate platform configurations, including the data ingestion pipelines and SQL, across various sandboxes.

  • Debug and resolve issues reported against data ingestion, SQL, or any other functionality of the platform.

  • Support Data Architects in implementing the data model in the platform.

  • Contribute to the innovation charter and develop intellectual property for the organization.

  • Present on advanced features and complex use-case implementations at multiple forums.

  • Attend regular scrum events or equivalent and provide updates on the deliverables.

  • Work independently across multiple engagements with minimal supervision.




Data Engineering

Chennai, Tamil Nadu EXL

Posted today

Job Description

Responsibilities:

  • Work with stakeholders to understand the data requirements to design, develop, and maintain complex ETL processes.
  • Create the data integration and data diagram documentation.
  • Lead the data validation, UAT, and regression testing for new data asset creation.
  • Create and maintain data models, including schema design and optimization.
  • Create and manage data pipelines that automate the flow of data, ensuring data quality and consistency.

Qualifications and Skills:

  • Strong knowledge of Python and PySpark.
  • Ability to write PySpark scripts for developing data workflows (see the sketch after this list).
  • Strong knowledge of SQL, Hadoop, Hive, Azure, Databricks, and Greenplum.
  • Ability to write SQL to query metadata and tables from different data management systems such as Oracle, Hive, Databricks, and Greenplum.
  • Familiarity with big data technologies like Hadoop, Spark, and distributed computing frameworks.
  • Ability to use Hue to run Hive SQL queries and schedule Apache Oozie jobs to automate data workflows.
  • Good working experience communicating with stakeholders and collaborating effectively with the business team on data testing.
  • Strong problem-solving and troubleshooting skills.
  • Ability to establish comprehensive data quality test cases and procedures and implement automated data validation processes.
  • Degree in Data Science, Statistics, Computer Science, or other related fields, or an equivalent combination of education and experience.
  • 3-7 years of experience as a Data Engineer.
  • Proficiency in programming languages commonly used in data engineering, such as Python, PySpark, and SQL.
  • Experience with the Azure cloud computing platform, such as developing ETL processes using Azure Data Factory and big data processing and analytics with Azure Databricks.
  • Strong communication, problem solving and analytical skills with the ability to do time management and multi-tasking with attention to detail and accuracy.
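As an illustrative sketch of the PySpark-plus-Hive workflow described above (database and table names are hypothetical placeholders), a job such as the following could be scheduled through Apache Oozie:

```python
# Illustrative PySpark workflow reading from Hive and writing a curated table.
# Database and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("claims_summary")
         .enableHiveSupport()   # lets spark.sql() see Hive metastore tables
         .getOrCreate())

claims = spark.sql("SELECT claim_id, member_id, amount, status FROM raw_db.claims")

summary = (claims
           .filter(F.col("status") == "PAID")
           .groupBy("member_id")
           .agg(F.sum("amount").alias("total_paid"),
                F.count("claim_id").alias("paid_claims")))

# Persist to a curated Hive table that a scheduled job could refresh daily.
summary.write.mode("overwrite").saveAsTable("curated_db.member_claim_summary")
```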

PySpark & ETL Data Engineer

Hyderabad, Andhra Pradesh CirrusLabs

Posted 5 days ago

Job Description

We are CirrusLabs . Our vision is to become the world's most sought-after niche digital transformation company that helps customers realize value through innovation. Our mission is to co-create success with our customers, partners and community. Our goal is to enable employees to dream, grow and make things happen. We are committed to excellence. We are a dependable partner organization that delivers on commitments. We strive to maintain integrity with our employees and customers. Every action we take is driven by value. The core of who we are is through our well-knit teams and employees. You are the core of a values driven organization.


You have an entrepreneurial spirit. You enjoy working as a part of well-knit teams. You value the team over the individual. You welcome diversity at work and within the greater community. You aren't afraid to take risks. You appreciate a growth path with your leadership team that journeys how you can grow inside and outside of the organization. You thrive upon continuing education programs that your company sponsors to strengthen your skills and for you to become a thought leader ahead of the industry curve.


You are excited about creating change because your skills can help the greater good of every customer, industry and community. We are hiring a talented PySpark Data Engineer to join our team. If you're excited to be part of a winning team, CirrusLabs ( ) is a great place to grow your career.


Experience - 4-8 years

Location - Hyderabad/ Bengaluru


About the Role

CirrusLabs is seeking a skilled and experienced PySpark Data Engineer (ETL Lead) to join our growing data engineering team. As an ETL Lead, you will play a pivotal role in designing, developing, and maintaining robust data integration pipelines using PySpark and related technologies. You’ll work closely with data architects, analysts, and stakeholders to transform raw data into high-quality, actionable insights, enabling data-driven decision-making across the organization.

This is an exciting opportunity for someone who is not only technically strong in PySpark and Python but also capable of leading data integration efforts for complex projects.


Key Responsibilities

  • Lead Data Integration Projects:
  • Manage the data integration and ETL activities for enterprise-level data projects.
  • Gather requirements from stakeholders and translate them into technical solutions.
  • Develop PySpark Pipelines:
  • Design and develop scalable and efficient PySpark scripts, both generic frameworks and custom solutions tailored to specific project requirements.
  • Implement end-to-end ETL processes to ingest, clean, transform, and load data.
  • Schedule and Automate ETL Processes:
  • Create scheduling processes to manage and run PySpark jobs reliably and efficiently.
  • Integrate ETL workflows into automation tools and CI/CD pipelines.
  • Optimize Data Processing:
  • Optimize PySpark jobs for performance and resource efficiency.
  • Monitor, troubleshoot, and resolve issues related to data processing and pipeline execution.
  • Data Transformation and Curation:
  • Transform raw data into consumable, curated data models suitable for reporting and analytics.
  • Ensure data quality, consistency, and reliability throughout all stages of the ETL process.
  • Collaboration and Best Practices:
  • Collaborate with data architects, analysts, and business stakeholders to define requirements and deliver solutions.
  • Contribute to the evolution of data engineering practices, frameworks, and standards.
  • Provide guidance and mentorship to junior engineers on PySpark and ETL best practices.
  • Documentation:
  • Develop and maintain technical documentation related to ETL processes, data flows, and solutions.


Required Skills and Qualifications

  • Experience:
  • 5–8 years of professional experience in data engineering, ETL development, or related fields.
  • Proven experience leading data integration projects from design to deployment.
  • Technical Skills:
  • Strong hands-on experience with PySpark for building large-scale data pipelines.
  • Proficiency in Python, including writing efficient, reusable, and modular code.
  • Solid knowledge of SQL for data extraction, transformation, and analysis.
  • Strong understanding of Spark architecture, including execution plans, partitions, memory management, and optimization techniques.
  • Data Engineering Expertise:
  • Experience working on data integration projects, such as data warehousing, data lakes, or analytics solutions.
  • Familiarity with processing structured and semi-structured data formats (e.g., Parquet, Avro, JSON, CSV).
  • Ability to transform and harmonize data from raw to curated layers (see the sketch after this list).
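For illustration, a minimal raw-to-curated harmonization step of the kind referenced above; the S3 paths, columns, and partitioning choice are hypothetical assumptions rather than project specifics:

```python
# Illustrative raw-to-curated step: semi-structured JSON in, partitioned Parquet out.
# Paths, schema, and partition column are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw_to_curated").getOrCreate()

raw = spark.read.json("s3://example-raw-zone/clickstream/")   # semi-structured input

curated = (raw
           .select("user_id", "event_type", "event_time", "properties")
           .withColumn("event_date", F.to_date("event_time"))
           .dropDuplicates(["user_id", "event_time", "event_type"]))

# Partitioning by date keeps downstream scans narrow and makes reprocessing a single day cheap.
(curated
 .repartition("event_date")
 .write.mode("overwrite")
 .partitionBy("event_date")
 .parquet("s3://example-curated-zone/clickstream/"))
```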


Additional Skills:

  • Familiarity with data pipeline orchestration tools (e.g., Airflow, Azkaban) is a plus.
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) is desirable.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills.

QA ETL Test Engineer

Bhubaneswar, Orissa CSM Technologies

Posted 5 days ago

Job Description

Role Overview:

We are seeking a highly skilled and motivated Senior QA Engineer (ETL/Data) with strong expertise in ETL Testing, data governance, and cloud data platforms to drive QA delivery for large scale data initiatives. The ideal candidate will combine hands-on ETL/data testing with leadership, stakeholder management, and delivery oversight, ensuring compliance with governance standards while validating data accuracy across AWS and Snowflake platforms.

Key Responsibilities:

  • Work across ETL, data warehouse, Snowflake, and AWS-based data projects ensuring high data accuracy and reliability.
  • Define and implement test strategies, governance frameworks, and QA best practices for ETL and data testing.
  • Conduct data profiling, validation, reconciliation, and transformation testing using SQL and automation where applicable (a minimal reconciliation sketch follows this list).
  • Collaborate with business stakeholders, product owners (via Rally), and data governance teams to ensure requirements and acceptance criteria are fully met.
  • Drive compliance with data governance, data quality, and regulatory standards.
  • Manage stakeholder expectations by providing test plans, dashboards, and progress updates.
  • Partner with DevOps and Automation teams to enhance ETL and data testing automation in AWS/Snowflake environments.
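As a minimal, illustrative reconciliation test of the kind described above (the Snowflake connection details and table names are hypothetical placeholders), a QA check might compare summary metrics between a staging source and its loaded target:

```python
# Illustrative only: source-vs-target reconciliation on summary metrics.
# Connection details and table names are hypothetical placeholders.
import snowflake.connector

CHECKS = [
    # (metric, source query, target query)
    ("row_count",  "SELECT COUNT(*) FROM STAGING.ORDERS",    "SELECT COUNT(*) FROM DW.FACT_ORDERS"),
    ("amount_sum", "SELECT SUM(AMOUNT) FROM STAGING.ORDERS", "SELECT SUM(AMOUNT) FROM DW.FACT_ORDERS"),
]

def scalar(cur, sql):
    cur.execute(sql)
    return cur.fetchone()[0]

conn = snowflake.connector.connect(account="example_account", user="qa_user",
                                   password="***", warehouse="QA_WH", database="ANALYTICS")
cur = conn.cursor()
mismatches = []
for metric, src_sql, tgt_sql in CHECKS:
    src, tgt = scalar(cur, src_sql), scalar(cur, tgt_sql)
    if src != tgt:
        mismatches.append((metric, src, tgt))
conn.close()

assert not mismatches, f"Reconciliation failed: {mismatches}"
print("Source and target are reconciled on all checked metrics.")
```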


Required Skills & Experience

  • 2 - 5 years of QA experience
  • Strong expertise in ETL testing, data profiling, and data warehouse/migration testing.
  • Proficiency in SQL with ability to write advanced queries for large-scale data validation.
  • Hands-on experience with Snowflake and AWS cloud data services (S3, Redshift, Glue, Lambda, etc.).
  • Working knowledge of ETL tools (Informatica, Talend, DataStage, SSIS, or similar).
  • Experience with Rally/Agile Central for Agile project management and QA delivery tracking.
  • Solid understanding of data governance, metadata management, and data quality frameworks.
  • Familiarity with test management tools (JIRA, HP ALM, TestRail).


Good to Have:

  • Exposure to automation frameworks for ETL testing (Python-based or custom DB validation frameworks).
  • Knowledge of BI and reporting tools (Tableau, Power BI, Qlik).
  • Experience with data lineage, data cataloging tools, and master data management (MDM).
  • Certifications in AWS, Snowflake, or ISTQB/Data Testing are a plus.

Data Engineering Manager

Hyderabad, Andhra Pradesh Amgen

Posted 2 days ago

Job Description

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission-to serve patients living with serious illnesses-drives all that we do.
Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas - Oncology, Inflammation, General Medicine, and Rare Disease - we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives.
Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
**Data Engineering Manager**
**What you will do**
Let's do this. Let's change the world. In this vital role you will lead a team of data engineers to build, optimize, and maintain scalable data architectures, data pipelines, and operational frameworks that support real-time analytics, AI-driven insights, and enterprise-wide data solutions. As a strategic leader, the ideal candidate will drive best practices in data engineering, cloud technologies, and Agile development, ensuring robust governance, data quality, and efficiency. The role requires technical expertise, team leadership, and a deep understanding of cloud data solutions to optimize data-driven decision-making.
+ Lead and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous learning for solving complex problems of R&D division.
+ Oversee the development of data extraction, validation, and transformation techniques, ensuring ingested data is of high quality and compatible with downstream systems.
+ Guide the team in writing and validating high-quality code for data ingestion, processing, and transformation, ensuring resiliency and fault tolerance.
+ Drive the development of data tools and frameworks for running and accessing data efficiently across the organization.
+ Oversee the implementation of performance monitoring protocols across data pipelines, ensuring real-time visibility, alerts, and automated recovery mechanisms (see the sketch after this list).
+ Coach engineers in building dashboards and aggregations to monitor pipeline health and detect inefficiencies, ensuring optimal performance and cost-effectiveness.
+ Lead the implementation of self-healing solutions, reducing failure points and improving pipeline stability and efficiency across multiple product features.
+ Oversee data governance strategies, ensuring compliance with security policies, regulations, and data accessibility best practices.
+ Guide engineers in data modeling, metadata management, and access control, ensuring structured data handling across various business use cases.
+ Collaborate with business leaders, product owners, and cross-functional teams to ensure alignment of data architecture with product requirements and business objectives.
+ Prepare team members for key partner discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios.
+ Drive Agile and Scaled Agile (SAFe) methodologies, handling sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery.
+ Stay up-to-date with emerging data technologies, industry trends, and best practices, ensuring the organization uses the latest innovations in data engineering and architecture.
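To illustrate the monitoring and automated-recovery responsibilities above, here is a minimal Apache Airflow sketch with retries and a failure-alert callback; the DAG id, task callables, and notification hook are hypothetical placeholders, not Amgen's actual pipelines:

```python
# Illustrative Airflow DAG: automatic retries plus an alert callback on final failure.
# DAG id, tasks, and notification logic are hypothetical placeholders.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def notify_on_failure(context):
    # In practice this would page the on-call channel (Slack, PagerDuty, email).
    print(f"ALERT: task {context['task_instance'].task_id} failed after all retries")

def extract():
    print("pulling data from a hypothetical R&D source system")

def load():
    print("loading validated data into the analytics layer")

default_args = {
    "retries": 3,                              # automated recovery before alerting
    "retry_delay": timedelta(minutes=10),
    "on_failure_callback": notify_on_failure,  # visibility when recovery fails
}

with DAG(
    dag_id="rd_ingestion_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> \
        PythonOperator(task_id="load", python_callable=load)
```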
**What we expect of you**
We are all different, yet we all use our unique contributions to serve patients. We are seeking a seasoned Engineering Manager (Data Engineering) to drive the development and implementation of our data strategy with deep expertise in R&D of Biotech or Pharma domain.
**Basic Qualifications:**
+ Doctorate degree **OR**
+ Master's degree and 4 to 6 years of experience in Computer Science, IT or related field **OR**
+ Bachelor's degree and 6 to 8 years of experience in Computer Science, IT or related field **OR**
+ Diploma and 10 to 12 years of experience in Computer Science, IT or related field
+ Experience leading a team of data engineers in the R&D domain of biotech/pharma companies.
+ Experience architecting and building data and analytics solutions that extract, transform, and load data from multiple source systems.
+ Data Engineering experience in R&D for Biotechnology or pharma industry
+ Demonstrated hands-on experience with cloud platforms (AWS) and the ability to architect cost-effective and scalable data solutions.
+ Proficiency in Python, PySpark, SQL.
+ Experience with dimensional data modeling.
+ Experience working with Apache Spark, Apache Airflow.
+ Experienced with software engineering best-practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven etc.), automated unit testing, and Dev Ops.
+ Experienced with AWS or GCP or Azure cloud services.
+ Understanding of end to end project/product life cycle
+ Well versed with full stack development & DataOps automation, logging frameworks, and pipeline orchestration tools.
+ Strong analytical and problem-solving skills to address complex data challenges.
+ Effective communication and interpersonal skills to collaborate with cross-functional teams.
**Preferred Qualifications:**
+ AWS Certified Data Engineer preferred
+ Databricks Certificate preferred
+ Scaled Agile SAFe certification preferred
+ Project Management certifications preferred
+ Data Engineering Management experience in Biotech/Pharma is a plus
+ Experience using graph databases such as Stardog, MarkLogic, Neo4j, or AllegroGraph.
**Soft Skills:**
+ Excellent analytical and troubleshooting skills
+ Strong verbal and written communication skills
+ Ability to work effectively with global, virtual teams
+ High degree of initiative and self-motivation
+ Ability to handle multiple priorities successfully
+ Team-oriented, with a focus on achieving team goals
+ Strong presentation and public speaking skills
**What you can expect of us**
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
**Apply now and make a lasting impact with the Amgen team.**
**careers.amgen.com**
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Data Engineering manager

Pune, Maharashtra Panasonic Avionics Corporation

Posted 2 days ago

Job Description

**Overview**
Who We Are: 
Ever wonder who brings the entertainment to your flights? Panasonic Avionics Corporation is #1 in the industry for delivering inflight products such as movies, games, WiFi, and now Bluetooth headphone connectivity!  
How exciting would it be to be a part of the innovation that goes into creating technology that delights millions of people in an industry that's here to stay! With our company's history spanning over 40 years, you will have stability, career growth opportunities, and will work with the brightest minds in the industry. And we are committed to a diverse and inclusive culture that will help our organization thrive! We seek diversity in many areas such as background, culture, gender, ways of thinking, skills and more. 
If you want to learn more about us visit us at ( ). And for a full listing of open job opportunities go to ( ).
**The Position:**
We are seeking a proven Data Engineering Leader to drive the design, development, and deployment of scalable, secure, and high-performance data solutions. This role will lead high-performing teams, architect cloud-native data platforms, and collaborate closely with business, AI/ML, and BI teams to deliver end-to-end data products that power innovation and strategic decision-making.
The position offers the opportunity to shape data architecture strategy, establish best practices in Lakehouse and streaming solutions, and enable advanced analytics and AI/ML at scale.
**Responsibilities**
**What We're Looking For:**
+ Proven leadership in building and mentoring high-performing data engineering teams.
+ Expertise in architecting cloud-native data platforms on AWS, leveraging services such as EMR, EKS, Glue, Redshift, S3, Lambda, and SageMaker.
+ Strong background in Lakehouse architecture (Glue Catalog, Iceberg, Delta Lake) and distributed processing frameworks (Spark, Hive, Presto).
+ Experience with real-time streaming solutions (Kafka, Kinesis, Flink); a streaming sketch follows this list.
+ Proficiency in orchestrating complex data workflows with Apache Airflow.
+ Hands-on experience with GitLab CI/CD, Terraform, CloudFormation Templates, and Infrastructure-as-Code.
+ Strong understanding of MDM strategies and data governance best practices (GDPR, HIPAA, etc.).
+ Ability to design and develop middleware APIs (REST) to seamlessly integrate data pipelines with applications and analytics platforms.
+ Experience supporting AI/ML teams with feature engineering, training, and deployment pipelines using SageMaker.
+ Solid knowledge of SQL & NoSQL databases (Redshift, DynamoDB, PostgreSQL, Elasticsearch).
+ Familiarity with BI enablement and data modeling for visualization platforms like Amazon QuickSight.
+ In-depth knowledge of security best practices in AWS-based data architectures.
+ Demonstrated success in driving AI/ML initiatives from ideation to production.
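As an illustrative sketch of the streaming Lakehouse pattern named above, the snippet below lands Kafka events into a Delta table with Spark Structured Streaming; the broker address, topic, schema, and storage paths are hypothetical placeholders (and the Kafka source requires the spark-sql-kafka connector package at runtime):

```python
# Illustrative streaming ingestion: Kafka events appended to a Delta (Lakehouse) table.
# Brokers, topic, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("ife_events_stream").getOrCreate()

event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_type", StringType()),
    StructField("value", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "ife-telemetry")
       .load())

# Kafka delivers bytes; parse the JSON payload into typed columns.
events = (raw
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

(events.writeStream
 .format("delta")
 .option("checkpointLocation", "s3://example-lake/_checkpoints/ife_telemetry/")
 .outputMode("append")
 .start("s3://example-lake/bronze/ife_telemetry/"))
```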
**Our Principles:**
Contribution to Society | Fairness & Honesty | Cooperation & Team Spirit | Untiring Effort for Improvement | Courtesy & Humility | Adaptability | Gratitude 
**What We Offer:**
At Panasonic Avionics Corporation we realize the most important aspects in leading our industry are the bright minds behind everything we do. We are proud to offer our employees a highly competitive, comprehensive and flexible benefits program. 
**Qualifications**
**Educational Background:**
+ Bachelor's degree or higher in Computer Science, Data Engineering, Aerospace Engineering, or a related field.
+ Advanced degrees (Master's/PhD) in Data Science or AI/ML are a plus.
REQ-
 
