101 ETL Processes Jobs in Mumbai

Data Engineering Manager

Mumbai, Maharashtra UnitedHealth Group

Posted 15 days ago

Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
We are looking for a skilled Data Engineer to design, build, and maintain scalable, secure, and high-performance data solutions. This role spans the full data engineering lifecycle, from research and architecture to deployment and support, within cloud-native environments, with a strong focus on AWS and Kubernetes (EKS).
**Primary Responsibilities:**
+ **Data Engineering Lifecycle:** Lead research, proof of concept, architecture, development, testing, deployment, and ongoing maintenance of data solutions
+ **Data Solutions:** Design and implement modular, flexible, secure, and reliable data systems that scale with business needs
+ **Instrumentation and Monitoring:** Integrate pipeline observability to detect and resolve issues proactively (a minimal illustrative sketch follows this list)
+ **Troubleshooting and Optimization:** Develop tools and processes to debug, optimize, and maintain production systems
+ **Tech Debt Reduction:** Identify and address legacy inefficiencies to improve performance and maintainability
+ **Debugging and Troubleshooting:** Quickly diagnose and resolve unknown issues across complex systems
+ **Documentation and Governance:** Maintain clear documentation of data models, transformations, and pipelines to ensure security and governance compliance
+ **Cloud Expertise:** Leverage advanced skills in AWS and EKS to build, deploy, and scale cloud-native data platforms
+ **Cross-Functional Support:** Collaborate with analytics, application development, and business teams to enable data-driven solutions
+ **Team Leadership:** Lead and mentor engineering teams to ensure operational efficiency and innovation
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
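To make the instrumentation and monitoring bullet concrete, here is a minimal Python sketch of pipeline observability: a decorator that records each stage's duration, outcome, and row count in structured logs. It is a generic pattern under assumed names (`instrumented`, `extract_claims`), not UnitedHealth Group's actual tooling.

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def instrumented(stage_name):
    """Wrap a pipeline stage with timing, status, and row-count logging."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
                rows = len(result) if hasattr(result, "__len__") else "n/a"
                log.info("stage=%s status=ok duration_s=%.2f rows=%s",
                         stage_name, time.monotonic() - start, rows)
                return result
            except Exception:
                # Log with traceback, then re-raise so the orchestrator can retry/alert.
                log.exception("stage=%s status=failed duration_s=%.2f",
                              stage_name, time.monotonic() - start)
                raise
        return wrapper
    return decorator

@instrumented("extract_claims")
def extract_claims():
    # Placeholder extract; a real stage would read from S3, a database, etc.
    return [{"claim_id": 1}, {"claim_id": 2}]

if __name__ == "__main__":
    extract_claims()
```

In practice the same wrapper would also emit metrics to a monitoring backend so alerts fire before downstream consumers notice a failure.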
**Required Qualifications:**
+ Bachelor's degree in Computer Science or related field
+ 5+ years of experience in data engineering or related roles
+ Proven experience designing and deploying scalable, secure, high-quality data solutions
+ Solid expertise in full Data Engineering lifecycle (research to maintenance)
+ Advanced AWS and EKS knowledge
+ Proficient in CI/CD, IaC, and addressing tech debt
+ Proven skill in monitoring and instrumentation of data pipelines
+ Proven advanced troubleshooting and performance optimization abilities
+ Proven ownership mindset with ability to manage multiple components
+ Proven effective cross-functional collaborator (DS, SMEs, and external teams).
+ Proven exceptional debugging and problem-solving skills
+ Proven solid individual contributor with a team-first approach
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
_#njp_

Data Engineering Lead

Thane, Maharashtra DMart - Avenue Supermarts Ltd

Posted 15 days ago

Job Description

About the organization:

DMart is one of India’s leading retail chains, serving millions of customers across 425+ stores and e-commerce channels throughout India. Our core objective is to offer customers good products at great value. We focus on everyday low pricing, seamless shopping experiences, and data-driven decision making. As we scale our data analytics journey, we’re seeking a Data Engineering Lead to drive our next phase of growth.


Key Responsibilities:

  • Technical Leadership: Lead architecture design for multiple enterprise data initiatives across cloud, on-premises, and hybrid platforms.
  • Data Platform Architecture and Implementation: Define and document multi-layered architectures (Raw, Curated, Analytics) across data warehouses, lake houses, and operational data stores, with an emphasis on cloud-based solutions.
  • Production Management: Directly run daily end-to-end processes to ensure efficient, reliable delivery of all data lake synchronization programs, including oversight of change management for enrichments delivered via change requests (CRs).
  • Platform Expertise: Apply deep knowledge of architecture patterns and principles, performance tuning, and best practices where Snowflake is part of the solution stack.
  • Schema & Data Modelling: Develop conceptual, logical, and physical data models using consistent standards, including dimensional, normalized, and denormalized structures for analytical and operational use cases.
  • Best Practices & Optimization: Provide guidance on data partitioning, indexing, access controls, naming conventions, and performance optimization across multiple platforms.
  • Metadata & Data Lineage: Collaborate with governance and stewardship teams to integrate metadata management, lineage mapping, and catalog tools (Snowflake Catalog or equivalents).
  • Collaboration: Work closely with analysts, governance leads, engineering teams, and application architects to align models with domain requirements and enterprise architecture standards.
  • Documentation: Maintain architecture blueprints and technical diagrams using tools such as Draw.io or equivalent.
  • Mentorship: Guide and mentor engineers, modelers, and other technical resources on architectural best practices.


Requisites:

  • Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or related field.
  • Experience: 12+ years of IT experience with a strong background in data engineering, modeling, and architecture. 5+ years of experience designing and implementing enterprise data platforms (cloud/on-prem), including at least 3 years with Snowflake.
  • Proficiency in SQL, schema design, and data modeling tools. Snowflake certifications, including data modelling and data extraction/ingestion, will be given preference.
  • Strong knowledge of ELT/ETL patterns, data integration, and orchestration tools (e.g., Airflow).
  • Integration experience with ETL and ELT tools and applications, including SAP ABAP; experience with streaming data platforms and OData; experience with CDC (change data capture) patterns (a minimal sketch follows this list).
  • Experience designing data layers (Raw, Curated, Analytics) with governance and scalability in mind.
  • Familiarity with security and compliance frameworks, including RBAC and regulatory requirements.
  • Proven track record in project delivery, including performance optimization and handling large-scale datasets.
  • Experience with SAP R/3 or SAP BW, and with Google BigQuery.
  • Certifications in cloud platforms or enterprise architecture (e.g., TOGAF, DAMA, SnowPro, Azure, GCP).
  • Experience integrating data platforms with BI tools (e.g., Power BI, Tableau).
  • Knowledge of MDM and data mesh concepts.
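To illustrate the CDC requirement above, here is a hedged Python sketch that folds one batch of change records from a Raw-layer staging table into a Curated-layer table with a Snowflake MERGE. The table, column, and connection values are hypothetical; it assumes the snowflake-connector-python package and a CDC feed that tags each row with an `op` flag (I/U/D).

```python
# pip install snowflake-connector-python
import snowflake.connector

# Hypothetical CDC merge: Raw-layer change feed -> Curated-layer dimension.
MERGE_SQL = """
MERGE INTO curated.dim_article AS tgt
USING raw.article_cdc AS src
    ON tgt.article_id = src.article_id
WHEN MATCHED AND src.op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET
    tgt.article_name = src.article_name,
    tgt.mrp          = src.mrp,
    tgt.updated_at   = src.changed_at
WHEN NOT MATCHED AND src.op <> 'D' THEN INSERT
    (article_id, article_name, mrp, updated_at)
    VALUES (src.article_id, src.article_name, src.mrp, src.changed_at)
"""

def apply_cdc_batch(conn) -> int:
    """Apply one CDC batch and return the number of affected rows."""
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
        return cur.rowcount

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",  # placeholders
        warehouse="ETL_WH", database="DMART", schema="CURATED",
    )
    print(f"rows merged: {apply_cdc_batch(conn)}")
```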

Data Engineering Director

Mumbai, Maharashtra eClerx

Posted 17 days ago

Job Description

Job Title: Senior Data Engineering Lead (Databricks)


Company Overview:

At eClerx, we are a leading IT firm specializing in innovative technologies and solutions that drive business transformation. Leveraging expertise in business process management, advanced analytics, and smart automation, we empower our clients to achieve operational excellence and competitive advantage in fast-evolving markets.


Role Overview:

We are seeking a highly experienced Senior Data Engineering Lead with a strong focus on Databricks and cloud-based data engineering to lead our data engineering team. This leadership role requires a visionary who can design, develop, and manage scalable data infrastructure and pipelines, while mentoring and inspiring a team of data engineers. You will work closely with cross-functional teams including data scientists, analysts, and software engineers to enable robust data-driven decision-making and support business goals.


Key Responsibilities:

  • Lead and manage a team of data engineers, providing mentorship, technical guidance, and fostering a culture of collaboration and innovation.
  • Architect, design, and oversee implementation of large-scale data pipelines, data lakes, and cloud-based data warehouses using Databricks, Apache Spark, and Snowflake.
  • Develop and optimize ETL/ELT workflows ensuring performance, reliability, and scalability of data infrastructure (see the sketch after this list).
  • Collaborate with business stakeholders, data scientists, and software teams to understand requirements and translate them into scalable, efficient data solutions.
  • Implement best practices for data quality, governance, security, and compliance.
  • Drive continuous improvement of data engineering processes, standards, and tools across the organization.
  • Support presales activities by contributing to RFPs, technical proposals, and client engagements.
  • Stay abreast of emerging data technologies and trends, recommending innovative solutions to enhance analytics capabilities.
  • Manage resource planning, project prioritization, and delivery timelines ensuring alignment with business objectives.
  • Lead performance reviews, identify skill gaps, and champion professional development within the data engineering team.
  • Facilitate cross-team communication to streamline data workflows and improve overall delivery.
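As one way to picture the ETL/ELT bullet above, here is a short illustrative PySpark sketch of a bronze-to-gold refinement flow of the kind typically run on Databricks. The paths, columns, and aggregation are hypothetical, not eClerx's actual pipelines.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_elt").getOrCreate()

# Bronze: raw landed data (illustrative path and schema).
orders = spark.read.json("s3://bronze/orders/")

# Silver: deduplicate, conform types, drop bad records.
silver = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

# Gold: business-level daily aggregate for analytics consumers.
gold = (
    silver
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_revenue"),
         F.countDistinct("customer_id").alias("customers"))
)

gold.write.mode("overwrite").partitionBy("order_date").parquet("s3://gold/daily_revenue/")
```

On Databricks the writes would more likely target Delta tables; Parquet is used here only to keep the sketch runtime-neutral.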


Qualifications & Skills:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related discipline.
  • Minimum 15 years of professional experience in data engineering with at least 9 years in leadership or senior technical roles.
  • Deep hands-on expertise with Databricks and Apache Spark for large-scale data processing.
  • Strong programming skills in Python, Scala, or Java.
  • Extensive experience with cloud data platforms such as AWS, Azure, or GCP, including services like S3, Redshift, BigQuery, Snowflake.
  • Solid understanding of data modeling, data warehousing, ETL/ELT design, and data lakes.
  • Experience with big data technologies like Hadoop, Kafka, and Databricks ecosystem.
  • Knowledge of CI/CD pipelines, data orchestration tools (e.g., Apache Airflow), and data governance best practices.
  • Proven experience managing high-performing teams and delivering complex data engineering projects on time.
  • Familiarity with analytics solutions and the ability to translate business needs into technical requirements.
  • Strong communication skills, capable of engaging with both technical teams and senior leadership.
  • Experience supporting presales efforts and client technical discussions is a plus.
  • Bonus: Exposure to machine learning lifecycle and model deployment on Databricks.

Intern - Data Engineering

Mumbai, Maharashtra Ingenius Technologies and Consulting

Posted today

Job Description

About Us
We are an innovative AI SaaS venture that develops cutting-edge AI solutions and provides expert consulting services. Our mission is to empower businesses with state-of-the-art AI technologies and data-driven insights. We're seeking a talented Data Engineer to join our team and help drive our product development and consulting initiatives.

Job Overview
For our Q4 2025 and 2026+ ambitions, we are looking for a motivated Data Engineering Intern (Azure). You will assist in building and maintaining foundational data pipelines and architectures under the guidance of senior team members. This role focuses on learning Azure tools (ADF, Databricks, PySpark, Scala, Python), supporting data ingestion/transformation workflows, and contributing to scalable solutions for AI-driven projects.

Tasks

  • Develop basic data pipelines using Azure Data Factory, Azure Synapse Analytics, or Azure Databricks.
  • Assist in ingesting structured/semi-structured data from sources (e.g., APIs, databases, files) into Azure Data Lake Storage (ADLS).
  • Write simple SQL queries and scripts for data transformation and validation.
  • Write simple PySpark, Scala, and Python code as required (a minimal PySpark sketch follows this list).
  • Monitor pipeline performance and troubleshoot basic issues.
  • Collaborate with AI/ML teams to prepare datasets for model training.
  • Document workflows and adhere to data governance standards.
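For a flavor of the ingestion and validation tasks above, here is a minimal illustrative PySpark sketch: read a CSV from ADLS, run basic checks, and write the data onward. The storage account, container, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_validate").getOrCreate()

# Hypothetical ADLS landing path (abfss://<container>@<account>.dfs.core.windows.net/...).
df = spark.read.option("header", True).csv(
    "abfss://landing@mystore.dfs.core.windows.net/sales/"
)

# Validation 1: required columns are present.
required = {"sale_id", "sku", "qty"}
missing = required - set(df.columns)
if missing:
    raise ValueError(f"missing columns: {missing}")

# Validation 2: count rows with null keys or non-positive quantities.
bad_rows = df.filter(
    F.col("sale_id").isNull() | (F.col("qty").cast("int") <= 0)
).count()
print(f"rows failing validation: {bad_rows}")

# Write the batch onward to the curated zone.
df.write.mode("append").parquet(
    "abfss://curated@mystore.dfs.core.windows.net/sales/"
)
```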
Requirements

Preferred Qualifications

  • Basic knowledge of AI/ML concepts.
  • Bachelor's degree in any stream (Engineering, Science, or Commerce).
  • Basic understanding of Azure services (Data Factory, Synapse, ADLS, SQL Database, Databricks, Azure ML).
  • Familiarity with SQL, Python, PySpark, or Scala for scripting.
  • Exposure to data modeling and ETL/ELT processes.
  • Ability to work in Agile/Scrum teams
Benefits

What We Offer

  • Cutting-edge Technology: Work on state-of-the-art AI projects and shape the future of data visualization.
  • Rapid Growth: Be part of a high-growth startup with endless opportunities for career advancement.
  • Impactful Work: See your contributions make a real difference in how businesses operate.
  • Collaborative Culture: Join a diverse team of brilliant minds from around the world.
  • Flexible Work Environment: Enjoy remote work options and a healthy work-life balance.
  • Competitive compensation in line with the market.

We’re excited to welcome passionate, driven individuals who are eager to learn and grow with our team. If you’re ready to gain hands-on experience, contribute to meaningful projects, and take the next step in your professional journey, we encourage you to apply. We look forward to exploring the possibility of having you onboard.


Associate - Data Engineering

Mumbai, Maharashtra Apollo Global Management, Inc.

Posted today

Job Description

Position Overview

ABOUT APOLLO

Apollo is a high-growth, global alternative asset manager. In our asset management business, we seek to provide our clients excess return at every point along the risk-reward spectrum from investment grade to private equity with a focus on three investing strategies: yield, hybrid, and equity. For more than three decades, our investing expertise across our fully integrated platform has served the financial return needs of our clients and provided businesses with innovative capital solutions for growth. Through Athene, our retirement services business, we specialize in helping clients achieve financial security by providing a suite of retirement savings products and acting as a solutions provider to institutions. Our patient, creative, and knowledgeable approach to investing aligns our clients, businesses we invest in, our employees, and the communities we impact, to expand opportunity and achieve positive outcomes.

OUR PURPOSE AND CORE VALUES

Our clients rely on our investment acumen to help secure their future. We must never lose our focus and determination to be the best investors and most trusted partners on their behalf. We strive to be:

The leading provider of retirement income solutions to institutions, companies, and individuals.

The leading provider of capital solutions to companies. Our breadth and scale enable us to deliver capital for even the largest projects – and our small firm mindset ensures we will be a thoughtful and dedicated partner to these organizations. We are committed to helping them build stronger businesses.

A leading contributor to addressing some of the biggest issues facing the world today – such as energy transition, accelerating the adoption of new technologies, and social impact – where innovative approaches to investing can make a positive difference.

We are building a unique firm of extraordinary colleagues who:

Outperform expectations.

Challenge convention.

Champion opportunity.

Lead responsibly.

Drive collaboration.

As One Apollo team, we believe that doing great work and having fun go hand in hand, and we are proud of what we can achieve together.

OUR BENEFITS

Apollo relies on its people to keep it a leader in alternative investment management, and the firm’s benefit programs are crafted to offer meaningful coverage for both you and your family. Please reach out to your Human Capital Business Partner for more detailed information on specific benefits.

  • Analyze business requirements and API contracts to build APIs to meet business needs and regulatory and compliance requirements.

  • Understand and apply Twelve-Factor App methodology principles in developing REST APIs using Spring Boot and other Java frameworks, and the Azure API Management platform.

  • Create API documentation for onboarding to the developer portal

  • Use the API Management platform to design and implement API-layer requirements, e.g., policies covering security, caching, rate limits, logging, and request/response modifications

  • Maintain programming standards and ensure consistent use of framework patterns for API services

  • Conduct code reviews and build automated test coverage

  • Develop the CI/CD pipeline for API management tools and code deployment.

  • Apply problem-solving skills to help peers research and select tools, products, and frameworks that support business initiatives

  • Manage large-volume data API requests

  • Monitor the security of data and API consumption

  • Ensure API and APIM stability and performance, and maintain SLAs

  • Implement OAuth Okta integration for communication between API producers and consumers (a minimal sketch follows this list)
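To illustrate the OAuth integration bullet above, here is a hedged sketch of the OAuth 2.0 client-credentials flow against an Okta authorization server, followed by a call to a protected API. The issuer URL, credentials, endpoint, and scope are placeholders rather than Apollo's actual configuration, and Python is used for brevity even though the role centers on Java/Spring Boot.

```python
# pip install requests
import requests

def fetch_token(issuer: str, client_id: str, client_secret: str, scope: str) -> str:
    """Exchange app credentials for a bearer token (client-credentials grant)."""
    resp = requests.post(
        f"{issuer}/v1/token",
        data={"grant_type": "client_credentials", "scope": scope},
        auth=(client_id, client_secret),  # HTTP Basic client authentication
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def call_protected_api(token: str) -> dict:
    """Call a hypothetical protected endpoint with the bearer token."""
    resp = requests.get(
        "https://api.example.com/v1/positions",  # placeholder endpoint
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = fetch_token(
        "https://example.okta.com/oauth2/default",  # placeholder Okta issuer
        "my_client_id", "my_client_secret", "positions.read",
    )
    print(call_protected_api(token))
```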

Qualifications & Experience

  • 5+ years of proven industry experience; Master's or Bachelor's degree in IT or related fields

  • Strong hands-on development expertise in Java, GraphQL, JUnit, Spring Boot, OpenAPI, SQL, Python, Spark, Flink, and Kafka

  • Strong understanding of Twelve-Factor App Methodology

  • Design/Write object-oriented, modularized, clean and maintainable code

  • Good understanding of integration across backend, frontend, and third-party applications

  • Solid understanding of API and integration design principles and patterns, with experience in web technologies

  • Ability to design object-oriented, modularized, clean, and maintainable code and to author policies in Java, JavaScript, Node.js, Python, etc.

  • Experience implementing API-layer requirements such as security, throttling, OAuth 2.0, TLS, certificates, Azure Key Vault, caching, logging, and request/response modifications using an API management platform

  • Experience creating custom policies in Java, JavaScript, Node.js, Python, etc. in an API management platform

  • Experience with test-driven development

  • Demonstrated track record of full project lifecycle and development, as well as post-implementation support activities

  • Significant experience of designing, deploying, and supporting production cloud environments like Azure and Kubernetes

  • Experience with Azure DevOps CI/CD Tools to build and deploy Java/API packages

  • Hands-on experience designing and developing high-volume REST APIs using standard API protocols and data formats

  • Good understanding of databases, API frameworks, and governance frameworks, with expertise in hosting and managing platform environments such as Spark, Flink, Kafka, and Spring Boot; BI tools like Tableau and Alteryx; and governance tools like Collibra, Soda, and Amazon Deequ

  • Knowledge of Agile and DevOps methodologies.

Additional Qualifications

  • Experience with Azure API and DB Platforms

  • Familiar with NoSQL/NewSQL databases

  • Strong documentation capability and adherence to testing and release management standards

  • Hosting and managing frameworks: Spark, Flink, Kafka, Spring Boot; BI tools like Tableau and Alteryx; governance tools like Collibra, Soda, and Deequ

  • Design, development, modification and testing of databases designed to support Data Warehousing and BI business teams

  • Familiarity with SDLC methodologies, defect tracking (JIRA, Azure DevOps, ServiceNow etc.)

Soft Skills:

  • Candidate must have an analytical and logical thought process for developing project solutions

  • Strong interpersonal and communication skills; works well in a team environment

  • Ability to deliver under competing priorities and pressures.

  • Excellent organizational skills in the areas of code structuring & partitioning, commenting and documentation for team alignment and modifications

Apollo provides equal employment opportunities regardless of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, color, nationality, ethnic or national origin, religion or belief, veteran status, gender/sex or sexual orientation, or any other criterion or circumstance protected by applicable law, ordinance, or regulation. The above criteria are intended to be used as a guide only – candidates who do not meet all the above criteria may still be considered if they are deemed to have relevant experience/equivalent levels of skill or knowledge to fulfil the requirements of the role. Any job offer will be conditional upon and subject to satisfactory reference and background screening checks, all necessary corporate and regulatory approvals or certifications as required from time to time and entering into definitive contractual documentation satisfactory to Apollo.


    Data Engineering Specialist

    Mumbai, Maharashtra Investec

    Posted today

    Job Description

    Investec is a distinctive Specialist Bank serving clients principally in the UK and South Africa. Our culture gives us our edge: we work hard to find colleagues who'll think out of the ordinary and we put them in environments where they'll flourish. We combine a flat structure with a focus on internal mobility. If you can bring an entrepreneurial spirit and a desire to learn and collaborate to your work, this could be the boost your career deserves.

    Team Description: 

The Offshore Data Engineering Lead will be responsible for overseeing the data and application development efforts that support our Microsoft Data Mesh Platform. Working as part of the Investec Central Data Team, the candidate will lead development of solutions and applications that support our data domain teams in creating data products. This role involves driving technical initiatives, exploring new technologies, and enhancing engineering practices within the data teams, in line with the group engineering strategy. The Data Engineering Lead will be a key driver of Investec's move to Microsoft Fabric and other enabling technologies for data quality, data management, and data orchestration.

    Key Roles and Responsibilities: 

  • Lead the development and implementation of data and custom application solutions that support the creation of data products across various data domains. 
  • Design, build, and maintain data pipelines using Microsoft Azure Data Platform, Microsoft Fabric and Databricks technologies. 
  • Ensure data quality, integrity, and security within the data mesh architecture.
  • Share group engineering context with the CIO and engineers within the business unit continuously.
  • Drive engineering efficiency and enable teams to deliver high-quality software quickly within the business unit
  • Cultivate a culture focused on security, risk management, and best practices in engineering
  • Actively engage with the data domain teams, business units and wider engineering community to promote knowledge sharing
  • Spearhead technical projects and innovation within the business unit's engineering teams and contribute to group engineering initiatives
  • Advance the technical skills of the engineering community and mentor engineers within the business unit
  • Enhance the stability, performance, and security of the business unit's systems.
  • Develop and promote exceptional engineering documentation and practices
  • Build a culture of development and mentorship within the central data team
  • Provide guidance on technology and engineering practices
  • Actively encourage creating Investec open-source software where appropriate within the business unit
  • Actively encourage team members within the business unit to speak at technical conferences based on the work being done

Core Skills and Knowledge:

  • Proven experience in data engineering, with a strong focus on Microsoft Data Platform technologies, including Azure Data Factory, Azure SQL Database, and Databricks
  • Proficiency in programming languages such as C# and/or Python, with experience in application development being a plus
  • Experience with CI/CD pipelines, Azure, and Azure DevOps
  • Strong experience and knowledge of PySpark and SQL, with the ability to create solutions using Microsoft Fabric
  • Ability to create solutions that query and work with web APIs (a minimal sketch follows this list)
  • In-depth knowledge of Azure, containerisation, and Kubernetes
  • Strong understanding of data architecture concepts, particularly data mesh principles
  • Excellent problem-solving skills and the ability to work independently as a self-starter
  • Strong communication and collaboration skills, with the ability to work effectively in a remote team environment
  • Relevant degree in Computer Science, Data Engineering, or a related field is preferred
  • As part of our collaborative & agile culture, our working week is 4 days in the office and one day remote.
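As a small illustration of the web-API and PySpark skills above, this hedged Python sketch pulls JSON from a REST endpoint and lands it as a lakehouse table. The URL, fields, and table name are hypothetical, and it assumes a Databricks or Fabric runtime where Delta is the table format.

```python
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fx_rates_ingest").getOrCreate()

# Hypothetical REST endpoint returning a JSON array of FX rates.
resp = requests.get("https://api.example.com/fx-rates", timeout=10)
resp.raise_for_status()
records = resp.json()  # e.g. [{"pair": "GBPZAR", "rate": 23.41, "asof": "2024-05-01"}, ...]

# Land the batch in the Raw layer of the data mesh platform.
df = spark.createDataFrame(records)
df.write.mode("overwrite").format("delta").saveAsTable("raw.fx_rates")
```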

    Investec offers a range of wellbeing benefits to make our people feel healthier, balanced and more fulfilled in their lives inside and outside of work.

    Embedded in our culture is a sense of belonging and inclusion. This creates an environment in which everyone is free to be themselves which helps to drive innovation, creativity and ultimately business performance. At Investec we want everyone to find it easy to be themselves, and to feel they belong. It's a responsibility we all share and is integral to our purpose and values as an organisation.

