1,605 ETL Processes Jobs in India

Data Engineering

New Delhi, Delhi Generis Tek Inc.

Posted today


Job Description

Please contact: To discuss this amazing opportunity, reach out to our Talent Acquisition Specialist, Rushi Panchal, who can be reached at # .
 
We have a contract Data Engineer (Remote) role for our client in New Delhi. Please let me know if you or any of your friends would be interested in this position.
 
Position Details:
Position: Data Engineer (Remote), New Delhi
Location: Remote
Project Duration: 6-month contract
 
Job Description:
We are seeking a skilled Data Engineer who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (e.g., Python and SQL). The Yum! data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.
 
Key Responsibilities 
As a data engineer, you will:
•    Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to build data pipelines to enable best-in-class restaurant technology solutions.
•    Play a key role in our Data Operations team—developing data solutions responsible for driving Yum! growth.
•    Design and develop data pipelines, streaming and batch, to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub.
•    Contribute to standardizing and developing a framework to extend these pipelines across brands and markets.
•    Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-of-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
•    Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points.
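The pipeline responsibilities above follow the familiar extract-transform-load shape. As a rough illustration only (plain Python rather than the PySpark/Airflow stack the listing names, with invented record fields such as `store_id` and `amount`), a minimal batch pipeline might look like:

```python
# Minimal batch ETL sketch: raw point-of-sale events -> cleaned rows -> "data hub".
# All field names and values here are illustrative, not from the listing.

def extract():
    """Pretend source system: raw point-of-sale events, possibly malformed."""
    return [
        {"store_id": "KFC-001", "amount": "12.50", "ts": "2024-01-01"},
        {"store_id": "PH-002", "amount": "7.25", "ts": "2024-01-01"},
        {"store_id": "KFC-001", "amount": None, "ts": "2024-01-02"},  # bad row
    ]

def transform(rows):
    """Drop malformed rows and cast amounts to float."""
    return [
        {**r, "amount": float(r["amount"])}
        for r in rows
        if r.get("amount") is not None
    ]

def load(rows, hub):
    """Append cleaned rows to the (in-memory) data hub, keyed by store."""
    for r in rows:
        hub.setdefault(r["store_id"], []).append(r)
    return hub

hub = load(transform(extract()), {})
print(sorted(hub))  # ['KFC-001', 'PH-002']
```

In a real deployment each stage would be a PySpark job or a task in an orchestrator; the separation into three small functions is the part that carries over.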

Skills and Qualifications:
•    Broad background across data technologies and practices
•    AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.)
•    Experience with modern ETL tools such as Informatica, Matillion, or DBT; Informatica CDI is a plus
•    High level of proficiency with SQL (Snowflake a big plus)
•    Proficiency with Python for transforming data and automating tasks
•    Experience with Kafka, Pulsar, or other streaming technologies
•    Experience orchestrating complex task flows across a variety of technologies
•    Bachelor’s degree from an accredited institution or relevant experience
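The SQL proficiency listed above is the day-to-day kind used in warehouse reporting. A small self-contained sketch (using SQLite in place of Snowflake, with an invented `orders` table):

```python
import sqlite3

# Toy warehouse-style aggregation; the table and columns are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (brand TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("KFC", 10.0), ("KFC", 5.0), ("Taco Bell", 8.0)],
)

# Revenue per brand, highest first -- a typical reporting query shape.
rows = conn.execute(
    "SELECT brand, SUM(amount) AS revenue "
    "FROM orders GROUP BY brand ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('KFC', 15.0), ('Taco Bell', 8.0)]
```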
 
 
 
 
About Generis Tek: Generis Tek is a boutique IT/professional staffing firm based in Chicagoland. We offer both contingent labor and permanent placement services to several Fortune 500 clients nationwide. Our philosophy is based on delivering long-term value and building lasting relationships with our clients, consultants, and employees. Our fundamental success lies in understanding our clients' specific needs and working closely with our consultants to create the right fit for both sides. We aspire to be our clients' most trusted business partner.
 
This advertiser has chosen not to accept applicants from your region.

Data Engineering

Mumbai, Maharashtra NR Consulting - India

Posted today


Job Description

Title: Data Engineering
Location: Mumbai

Job Description: Data Engineering


Data Engineering

Bengaluru, Karnataka ScaleneWorks

Posted today


Job Description

Job Title: Middleware Engineer
Position: Data Engineer
Experience: 5-6 years
Category: IT Infrastructure
Main location: India, Karnataka, Bangalore
Employment Type: Full Time
Qualification: Bachelor's degree in Computer Science or related field or higher.
Roles and Responsibilities

Data Engineer - 5-6 years of experience.
Responsibilities
===
Design, develop, and maintain data architectures, pipelines, and workflows for the collection, processing, storage, and retrieval of large volumes of structured and unstructured data from multiple sources.
Collaborate with cross-functional teams to identify and prioritize data engineering requirements and to develop and deploy data-driven solutions to address business challenges.
Build and maintain scalable data storage and retrieval systems (e.g., data lakes, data warehouses, databases), fault-tolerant, and high-performance data platforms on cloud infrastructure such as AWS, Azure, or Google Cloud Platform.
Develop and maintain ETL workflows, data pipelines, and data transformation processes to prepare data for machine learning and AI applications.
Implement and optimize distributed computing frameworks such as Hadoop, Spark, or Flink to support high-performance and scalable processing of large data sets.
Build and maintain monitoring, alerting, and logging systems to ensure the availability, reliability, and performance of data pipelines and data platforms.
Collaborate with Data Scientists and Machine Learning Engineers to deploy models on production environments and ensure their scalability, reliability, and accuracy.
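The monitoring, alerting, and logging responsibility above usually starts with instrumenting each pipeline step. A toy sketch under stated assumptions (plain Python logging instead of a real metrics/alerting stack; step names are invented):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def monitored(step_name, fn, *args):
    """Run one pipeline step, logging its duration and surfacing failures.

    Stand-in for real observability (metrics, alerting, tracing).
    """
    start = time.perf_counter()
    try:
        result = fn(*args)
    except Exception:
        # In production this is where an alert would fire.
        log.exception("step %s failed", step_name)
        raise
    log.info("step %s ok in %.3fs", step_name, time.perf_counter() - start)
    return result

# Example: a "clean" step that drops empty records.
cleaned = monitored("clean", lambda rows: [r for r in rows if r], [1, None, 2])
print(cleaned)  # [1, 2]
```

The same wrapper pattern applies whether the step is a Spark job, an SQL statement, or an API call: the step stays a plain callable, and the observability lives around it.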
Requirements:
===
Bachelor's or Master's degree in computer science, engineering, or a related field.
At least 5-6 years of experience in data engineering, with a strong background in machine learning, cloud computing and big data technologies.
Experience with at least one major cloud platform (AWS, Azure, GCP).
Proficiency in programming languages like Python, Java, and SQL.
Experience with distributed computing technologies such as Hadoop, Spark, and Kafka.
Familiarity with database technologies such as SQL, NoSQL, NewSQL.
Experience with data warehousing and ETL tools such as Redshift, Snowflake, or Airflow.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Preferred qualification:
===
Experience with DevOps practices and tools such as Docker, Kubernetes, or Ansible, Terraform.
Experience with data visualization tools such as Tableau, Superset, Power BI, or Plotly, D3.js.
Experience with stream processing frameworks such as Kafka, Pulsar or Kinesis.
Experience with data governance, data security, and compliance.
Experience with software engineering best practices and methodologies such as Agile or Scrum.
Must Have Skills
===
Data engineer with expertise in machine learning, cloud computing, and big data technologies.
Data engineering experience on multiple clouds, one of them preferably GCP.
data lakes, data warehouses, databases
ETL workflows, data pipelines, data platforms
Hadoop, Spark, or Flink
Hadoop, Spark, and Kafka
SQL, NoSQL, NewSQL
Redshift, Snowflake, or Airflow


Data Engineering Manager

Mumbai, Maharashtra UnitedHealth Group

Posted today


Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
We are looking for a skilled Data Engineer to design, build, and maintain scalable, secure, and high-performance data solutions. This role spans the full data engineering lifecycle - from research and architecture to deployment and support- within cloud-native environments, with a strong focus on AWS and Kubernetes (EKS).
**Primary Responsibilities:**
+ **Data Engineering Lifecycle:** Lead research, proof of concept, architecture, development, testing, deployment, and ongoing maintenance of data solutions
+ **Data Solutions:** Design and implement modular, flexible, secure, and reliable data systems that scale with business needs
+ **Instrumentation and Monitoring:** Integrate pipeline observability to detect and resolve issues proactively
+ **Troubleshooting and Optimization:** Develop tools and processes to debug, optimize, and maintain production systems
+ **Tech Debt Reduction:** Identify and address legacy inefficiencies to improve performance and maintainability
+ **Debugging and Troubleshooting:** Quickly diagnose and resolve unknown issues across complex systems
+ **Documentation and Governance:** Maintain clear documentation of data models, transformations, and pipelines to ensure security and governance compliance
+ **Cloud Expertise:** Leverage advanced skills in AWS and EKS to build, deploy, and scale cloud-native data platforms
+ **Cross-Functional Support:** Collaborate with analytics, application development, and business teams to enable data-driven solutions
+ **Team Leadership:** Lead and mentor engineering teams to ensure operational efficiency and innovation
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ Bachelor's degree in Computer Science or related field
+ 5+ years of experience in data engineering or related roles
+ Proven experience designing and deploying scalable, secure, high-quality data solutions
+ Solid expertise in full Data Engineering lifecycle (research to maintenance)
+ Advanced AWS and EKS knowledge
+ Proficient in CI/CD, IaC, and addressing tech debt
+ Proven skill in monitoring and instrumentation of data pipelines
+ Proven advanced troubleshooting and performance optimization abilities
+ Proven ownership mindset with ability to manage multiple components
+ Proven effective cross-functional collaborator (DS, SMEs, and external teams).
+ Proven exceptional debugging and problem-solving skills
+ Proven solid individual contributor with a team-first approach
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._

Data Engineering Analyst

Chennai, Tamil Nadu UnitedHealth Group

Posted today


Job Description

The Data Engineering Analyst uses technical and analytical skills to support Optum members on, but not limited to, ongoing data refreshes and implementations that are delivered on time and with the utmost quality, and on complete analysis of an issue through to its final solution, including creative problem solving and technical decision making.
**Primary Responsibility:**
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ Bachelor's degree in Computer Science or any engineering
+ 2+ years of experience in Data analysis and functional QC
+ Basic knowledge of Cloud (AWS)
+ Basic knowledge of Spark SQL
+ Basic knowledge of Python
+ Basic US Healthcare knowledge
+ Technical aptitude for learning new technologies
+ Solid SQL skills
+ Proven solid analytical and problem-solving skills
+ Proven passion for working with large volumes of data in a challenging environment

Data Engineering Consultant

Noida, Uttar Pradesh UnitedHealth Group

Posted today


Job Description

**Primary Responsibilities:**
+ Ingest data from multiple on-prem and cloud data sources using various tools & capabilities in Azure
+ Design and develop Azure Databricks processes using PySpark/Spark-SQL
+ Design and develop orchestration jobs using ADF, Databricks Workflow
+ Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest improvements
+ Build a test framework for Databricks notebook jobs to enable automated testing before code deployment
+ Design and build POCs to validate new ideas, tools, and architectures in Azure
+ Continuously explore new Azure services and capabilities; assess their applicability to business needs
+ Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
+ Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
+ Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
+ Ensure solutions adhere to security, compliance, and governance standards
+ Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
+ Identify solutions to non-standard requests and problems
+ Mentor and support existing on-prem developers in the cloud environment
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
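The automated-testing responsibility above is commonly met by keeping notebook logic in plain functions that can be exercised against fixture data before deployment. A toy sketch (pure Python; the `dedupe_latest` function and its fixtures are invented for illustration, not Databricks API calls):

```python
def dedupe_latest(rows):
    """Keep the latest version of each record per key -- a typical notebook
    transformation, extracted into a plain function so it is testable."""
    latest = {}
    for r in rows:
        k = r["id"]
        if k not in latest or r["version"] > latest[k]["version"]:
            latest[k] = r
    return sorted(latest.values(), key=lambda r: r["id"])

def run_tests():
    """Pre-deployment checks run against small fixture data."""
    fixture = [
        {"id": 1, "version": 1},
        {"id": 1, "version": 3},
        {"id": 2, "version": 1},
    ]
    out = dedupe_latest(fixture)
    assert [r["id"] for r in out] == [1, 2]   # one row per key
    assert out[0]["version"] == 3             # latest version wins
    assert dedupe_latest([]) == []            # empty input is safe
    return "all tests passed"

print(run_tests())
```

In a CI pipeline these checks would run on every commit, gating promotion of the notebook job to production.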
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 7+ years of overall experience in Data & Analytics engineering
+ 5+ years of experience working with Azure, Databricks, and ADF, Data Lake
+ 5+ years of experience working with data platform or product using PySpark and Spark-SQL
+ Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
+ In-depth understanding of Azure architecture & ability to come up with efficient design & solutions
+ Highly proficient in Python and SQL
+ Proven excellent communication skills
+ **Key Skill:** Azure Data Engineer - Azure Databricks, Azure Data Factory, Python/PySpark, Terraform

Data Engineering Analyst

Gurgaon, Haryana UnitedHealth Group

Posted today


Job Description

**Primary Responsibilities:**
+ Data Ingestion: Develop, maintain, and optimize data ingestion pipelines from various sources including Adobe 1.4 and 2.0 APIs, NAS file drives, SQL servers, and Adobe datasets
+ Workflow Orchestration: Design and manage data workflows using Airflow to ensure reliable data movement, transformation, and scheduling
+ Database Development: Build and enhance data models in Snowflake using DBT (Data Build Tool) as well as Cosmos database development
+ Business Intelligence: Create, maintain, and improve Power BI dashboards to deliver actionable business insights
+ Pipeline Management: Monitor, troubleshoot, and optimize end-to-end data pipelines to ensure high availability and accuracy of data delivery
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
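The Airflow orchestration described above amounts to running tasks in dependency order. A minimal stand-in using only the standard library (task names like `extract_adobe` are invented; a real Airflow DAG adds scheduling, retries, and operators on top of exactly this ordering idea):

```python
from graphlib import TopologicalSorter

# Toy stand-in for an Airflow DAG. Each "task" just records that it ran.
ran = []
tasks = {
    "extract_adobe": lambda: ran.append("extract_adobe"),
    "load_snowflake": lambda: ran.append("load_snowflake"),
    "dbt_models": lambda: ran.append("dbt_models"),
    "powerbi_refresh": lambda: ran.append("powerbi_refresh"),
}
deps = {  # task -> set of upstream tasks it depends on
    "load_snowflake": {"extract_adobe"},
    "dbt_models": {"load_snowflake"},
    "powerbi_refresh": {"dbt_models"},
}

# Execute in topological order: upstream tasks always run before dependents.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(ran)
```

`TopologicalSorter` also rejects cyclic dependencies, which mirrors Airflow's requirement that workflows form a DAG.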
**Required Qualifications:**
+ Bachelor's degree in Computer Science, Information Systems, Engineering, or a related technical field (or equivalent experience)
+ 2+ years of hands-on experience in data engineering or analytics roles
+ Hands-on experience with Snowflake and DBT for data transformation and modeling (exposure to Cosmos DB a plus)
+ Solid experience with Apache Airflow for workflow orchestration and scheduling
+ Expertise in building and maintaining ETL/ELT pipelines from diverse data sources
+ Advanced SQL skills for querying, data manipulation, and database management
+ Skilled in developing, managing, and optimizing dashboards using Power BI
+ Skills in Python, Airflow, DBT, GitHub, ETL, Power BI, SQL, Adobe AEP/CJO
+ Proficient in Python for data engineering, automation, and scripting tasks
+ Proficient in version control and code management using GitHub
**Preferred Qualification:**
+ Familiarity or experience with Adobe AEP/CJO (Adobe Experience Platform / Customer Journey Analytics)

Data Engineering Manager

Hyderabad, Telangana UnitedHealth Group

Posted today


Job Description

**Primary Responsibilities:**
+ Lead a team to design, develop, test, deploy, maintain and continuously improve software
+ Support driving modern solutions to complex problems
+ Proactively share information across the team, to the right audience with the appropriate level of detail and timeliness
+ Implement small and large scale implementations of projects with E2E delivery
+ Facilitate technical discussions and design solutions
+ Technically coach the team members and provide real time solutions to complex problems
+ Adhere to code quality standards and development best practices, and deliver defect-free code
+ Collaborate and communicate effectively with other engineers and the onshore team
+ Partner with business and enterprise architecture team to understand requirements and design/develop end to end solution
+ Practice agile solution delivery & influence teams towards engineering best practices such as DevOps / CICD / Everything as a code
+ Follow engineering best practices and coding standards for successful delivery of program/project
+ Partner closely with global teams to implement the design
+ Manages and is accountable for professional employees and/or supervisors
+ Understand product architecture, features being built and come up with product improvement ideas and POCs
+ Individual contributor and Technical lead for Data Engineering - Data pipelines, Data modelling and Data warehouse
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 5+ years of Spark/Scala experience
+ 2+ years of hands-on experience in production public cloud (Azure, AWS or GCP)
+ Hands-on experience in working with standard DevOps tools:
+ Build automation tools such as Hudson, Jenkins, Travis CI or equivalent
+ Source control tools such as Git, GitHub, SVN or equivalent
+ Configuration tools such as Chef, Puppet, Terraform, Ansible
+ Hands-on in designing software solutions and architecture
+ Hands-on in using design patterns
+ Hands-on knowledge in troubleshooting, performance tuning, and optimization
+ Solid, hands-on knowledge of Angular/ReactJS
+ Experience working in cross-functional teams - Solution Design, Development, Quality Engineering and DevOps
+ Proven solid analytical, problem solving and decision-making skills
+ Proven excellent verbal, written and interpersonal communication skills
+ Proven ability to work collaboratively in a global team with a positive team spirit
**Preferred Qualifications:**
+ Work experience in Agile/Scrum Methodology
+ Work experience in product engineering
+ Knowledge of US Healthcare domain, in general and data analytics applications/products in particular

Data Engineering Manager

Pune, Maharashtra Panasonic Avionics Corporation

Posted today


Job Description

**Overview**
Proven leadership in building and mentoring high-performing data engineering teams to deliver scalable, secure, and high-performance data solutions. Experienced in architecting cloud-native data platforms on AWS, leveraging services like EMR, Glue, Redshift, and SageMaker. Skilled in Lakehouse architecture, real-time streaming (Kafka, Kinesis, Flink), orchestration with Apache Airflow, and advanced analytics enablement.
Strong focus on data governance, MDM, and infrastructure automation using GitLab CI/CD, Terraform, and IaC. Adept at designing robust middleware APIs to integrate data pipelines with applications and analytics platforms. Collaborative partner to business, AI/ML, and BI teams-delivering end-to-end data solutions that power insights, innovation, and strategic decision-making.
**Responsibilities**
**Data Engineering Leadership**
+ Lead and mentor a team of data engineers in developing and managing **scalable, secure, and high-performance data pipelines** .
+ Define best practices for **data ingestion, transformation, and processing** in a **Lakehouse architecture** .
+ Drive **automation, performance tuning, and cost optimization** in cloud data solutions.
**Cloud Data Infrastructure & Processing**
+ Architect and manage **AWS-based big data solutions** (EMR, EKS, Glue, Redshift).
+ Design and maintain **Apache Airflow** workflows for data orchestration.
+ Optimize **Spark and distributed data processing frameworks** for large-scale workloads.
+ Implement **streaming solutions** (Kafka, Kinesis, Flink) for real-time data processing.
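The real-time streaming bullet above centers on windowed aggregation, the core operation behind Kafka/Kinesis/Flink pipelines. A toy tumbling-window count in plain Python (event shape and timestamps are invented; a real engine adds partitioning, watermarks, and fault tolerance):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s=60):
    """Count events per fixed-size (tumbling) window.

    Each event is (epoch_seconds, payload); the window key is the start
    of the window the event's timestamp falls into.
    """
    counts = defaultdict(int)
    for ts, _payload in events:
        counts[ts - ts % window_s] += 1
    return dict(counts)

events = [(5, "a"), (59, "b"), (61, "c"), (130, "d")]
print(tumbling_window_counts(events))  # {0: 2, 60: 1, 120: 1}
```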
**AI/ML & Advanced Analytics**
+ Collaborate with **Data Scientists and AI/ML teams** to build and deploy **machine learning models** using **AWS SageMaker** .
+ Support **feature engineering, model training, and inference pipelines** at scale.
+ Enable AI-driven analytics by integrating structured and unstructured data sources.
**Business Intelligence & Visualization**
+ Support **BI and reporting teams** with optimized **data models** for **Amazon QuickSight and other visualization tools** .
+ Ensure efficient **data aggregation and pre-processing** for interactive dashboards and self-service analytics.
+ Design, develop, and maintain **middleware components** that facilitate seamless communication between **data platforms, applications, and analytics layers** .
**Master Data Management (MDM) & Governance**
+ Implement **MDM strategies** to ensure clean, consistent, and deduplicated data.
+ Establish **data governance policies** for security, privacy, and compliance (GDPR, HIPAA, etc.).
+ Ensure adherence to **data quality frameworks** across structured and unstructured datasets.
**Collaboration & Strategy**
+ Partner with **business teams, AI/ML teams, and analysts** to deliver **high-value data products** .
+ Define and maintain **data architecture strategies** aligned with business goals.
+ Enable **real-time and batch processing** for analytics, reporting, and AI-driven insights. **Technical Expertise:**
+ Extensive AWS experience with services such as EMR, EKS, Glue, Redshift, S3, Lambda, and SageMaker.
+ Proficient in big data processing frameworks (e.g., Spark, Hive, Presto) and Lakehouse architectures.
+ Skilled in designing and managing Apache Airflow workflows and other orchestration tools.
+ Solid understanding of Master Data Management (MDM) and data governance best practices.
+ Proficient with SQL & NoSQL databases (e.g., Redshift, DynamoDB, PostgreSQL, Elasticsearch).
+ **Middleware Development** - Proven expertise in **building middleware components** like REST API that integrate data pipelines with applications, analytics platforms, and real-time systems.
+ Hands-on experience with Gitlab CI/CD, Terraform, CFT, and Infrastructure-as-Code (IaC) methodologies.
+ Familiarity with AI/ML pipelines, model deployment, and monitoring using SageMaker.
+ Experience with data visualization tools, particularly AWS QuickSight, for business intelligence.
**Qualifications**
Experience with Lakehouse frameworks (Glue Catalog, Iceberg, Delta Lake).
Expertise in streaming data solutions (Kafka, Kinesis, Flink).
In-depth understanding of security best practices in AWS data architectures.
Demonstrated success in driving AI/ML initiatives from ideation to production.
**Educational Qualification:**
+ **Bachelor's degree or higher (UG+) in Computer Science, Data Engineering, Aerospace Engineering, or a related field.**
+ Advanced degrees (Master's, PhD) in **Data Science or AI/ML** are a plus.
REQ-145778

Director, Data Engineering

Pune, Maharashtra Mastercard


Job Description

**Our Purpose**
_Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential._
**Title and Summary**
Director, Data Engineering
Position Overview:
The Director, Data Engineering for MyMPA will be part of GBSC's Enterprise Performance Management Systems (EPMS) team, leading the strategic vision, architecture, and execution of our enterprise data infrastructure. You will oversee a team of high-performing engineers focused on implementing enhancements and periodic refreshes of a platform reporting on Mastercard card data performance.
This role will also work closely with the VP of Sales Excellence, the VP of Analytics & Metrics, and the Director of FP&A to gather requirements for changes and enhancements to the application, and will contribute to the technology platform's evolution as it grows to support the rapidly expanding Mastercard business.
The ideal candidate will have hands-on development skills combined with an ability to analyze and understand end user requirements as both are critical success factors within this role. This role requires the skills and desire to work as an individual contributor as well as collaborate cross functionally with various business constituents.
1. Have you ever worked on an enterprise-wide reporting solution that relied heavily on your own knowledge and resources to build and maintain the solution?
2. Are you constantly hungry to learn? Do you have a "growth mindset" as opposed to the "fixed mindset"?
3. Do you love working with people, helping them, and turning their requirements into something that can make a difference?
Role:
- Responsible for designing and implementing solutions, managing the team, and client services.
- Develop and deploy Essbase solutions with best-in-class coding practices
- Define and drive the long-term data engineering roadmap aligned with business goals.
- Ensure data quality, governance, and security best practices are embedded in all engineering processes.
- Oversee the integration of structured and unstructured data from diverse sources to enable advanced analytics and AI/ML capabilities.
- Lead and mentor a team of senior data engineers and managers, fostering a culture of innovation, accountability, and continuous improvement.
- Liaise with internal groups in Mastercard Technology to ensure our solutions remain in compliance with Mastercard technical standards. Navigate Mastercard Technology requirements around change management and new development.
All About You:
- Proven track record of building and scaling enterprise-grade data platforms in cloud environments.
- Excellent communication skills, capable of translating complex technical concepts into business value.
- Experience deploying applications across both Windows and Linux environments.
- Solid understanding of Essbase technology - understand how this technology works, for both BSO and ASO cubes.
- Good understanding of SQL Server or Oracle DB.
- Ability to develop BSO and ASO cubes with a strong eye for performance.
- Strong commitment to quality and error testing of code you develop. Strong ability to step in and analyze the code of others on the team.
- Be able to work within an Agile environment that is highly responsive to the business. Our team is part of the Finance organization - you must be comfortable working as part of the business with a strong "roll up your sleeves" mentality.
- A transformation mindset, looking to evolve system solutions to capitalize on new technologies to improve and augment existing solutions.
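The SQL Server/Oracle work in a role like this is largely reporting-style aggregation. A self-contained illustration using SQLite as a stand-in (the `card_activity` schema and data are hypothetical):

```python
import sqlite3

# SQLite stands in for SQL Server/Oracle purely so the demo is
# self-contained; the GROUP BY pattern is the same across engines.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE card_activity (region TEXT, spend REAL)")
conn.executemany(
    "INSERT INTO card_activity VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 50.0)],
)

rows = conn.execute(
    "SELECT region, SUM(spend) FROM card_activity "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('APAC', 200.0), ('EMEA', 50.0)]
conn.close()
```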
**Corporate Security Responsibility**
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
+ Abide by Mastercard's security policies and practices;
+ Ensure the confidentiality and integrity of the information being accessed;
+ Report any suspected information security violation or breach, and
+ Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
 
