57,119 Data Team jobs in India

Director Data Science & Data Engineering

Bengaluru, Karnataka eBay

Posted today


Job Description

At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts.

Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet.

Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Director – Data Science & Data Engineering
Shape the Future of AI-Driven eCommerce Discovery

About the Role
We're reimagining how people discover products in eCommerce—and we're looking for a visionary leader who blends technical depth with product intuition. If you're passionate about structured data, large language models, and building high-impact data products, this role is tailor-made for you.

As Director of Data Science & Data Engineering, you’ll lead a talented team of data scientists, analysts, and engineers working at the cutting edge of AI/ML, product analytics, and taxonomy design. Your mission? Drive innovation in product discovery through smarter data, scalable infrastructure, and breakthrough AI-powered solutions.

You’ll join the Product Knowledge org and play a key role in designing the backbone of next-gen search, recommendations, and generative AI experiences.

This is a high-impact, high-agency role—perfect for a hands-on leader who thrives in fast-paced, collaborative environments.

What You’ll Work On

Lead and inspire a cross-functional team to:

  • Transform Product Data into Insights
    Conduct deep-dive SQL and Python analyses to uncover opportunities in taxonomy, ontology, and catalog structure that enhance discovery and user experience.

  • Harness the Power of Generative AI
    Use prompt engineering and LLMs to create innovative tools for classification, taxonomy validation, and data enrichment.

  • Build & Evaluate AI/ML Models
    Design frameworks to evaluate product knowledge models, semantic embeddings, and ML-based categorization systems.

  • Drive Data-Informed Strategy
    Translate complex findings into clear, actionable insights for Product and Engineering teams. Influence roadmap decisions on entity resolution, catalog optimization, and knowledge graph development.

  • Partner Across Functions
    Collaborate closely with Applied Research, Engineering, and Product teams to build and deploy high-impact data and AI solutions at scale.

  • Experiment & Innovate Fast
    Prototype quickly, validate hypotheses, and iterate on structured data and AI-driven solutions that push boundaries.
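
For illustration only (not part of the role description): a minimal Python sketch of the kind of evaluation framework the "Build & Evaluate AI/ML Models" bullet above describes, using scikit-learn to score a hypothetical product-categorization model against hand-labeled catalog data. All category labels here are invented for the example.

```python
# Minimal sketch: scoring an ML-based product-categorization system
# against a small, hand-labeled sample. Hypothetical labels only.
from sklearn.metrics import classification_report, f1_score

# Ground-truth leaf categories from human review (hypothetical)
y_true = ["shoes>running", "shoes>running", "watches>smart", "bags>tote", "watches>smart"]
# Categories predicted by the model under evaluation (hypothetical)
y_pred = ["shoes>running", "shoes>trail",   "watches>smart", "bags>tote", "watches>analog"]

# Standard per-category precision/recall/F1
print(classification_report(y_true, y_pred, zero_division=0))

# A simple custom summary metric: macro-F1, often preferred when categories are imbalanced
print("macro-F1:", f1_score(y_true, y_pred, average="macro", zero_division=0))
```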

What You Bring

  • 12+ years of experience in data science or analytics roles, including 5+ years leading teams

  • Proven track record building data products, knowledge graphs, and scalable data pipelines

  • Deep understanding of eCommerce search, recommendation systems, and product analytics

  • Hands-on experience with LLMs, prompt engineering, and RAG techniques (preferred)

  • Strong communication skills and ability to influence cross-functional stakeholders

  • Experience evaluating ML models with custom metrics and robust frameworks

  • Startup mindset—comfortable with ambiguity, bias for action, and fast iteration

Why Join Us

  • Be at the forefront of AI-powered product discovery in eCommerce

  • Own high-impact initiatives in a startup-style culture with real autonomy

  • Work alongside world-class talent across AI, Product, and Engineering

  • Build solutions that scale—serving millions of users and shaping the future of shopping

Ready to lead the next wave of AI + Data innovation in commerce? Let's build the future together.


    eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, disability, or other legally protected status. If you have a need that requires accommodation, please contact us. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility information to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.



    Data Engineering

    Bengaluru, Karnataka ScaleneWorks

    Posted today


    Job Description

    Job Title: Middleware Engineer
    Position: Data Engineer
    Experience: 5-6 years
    Category: IT Infrastructure
    Main location: India, Karnataka, Bangalore
    Employment Type: Full Time
    Qualification: Bachelor's degree in Computer Science or related field or higher.
    Roles and Responsibilities

    Data Engineer (5-6 years of experience)

    Responsibilities:
    Design, develop, and maintain data architectures, pipelines, and workflows for the collection, processing, storage, and retrieval of large volumes of structured and unstructured data from multiple sources.
    Collaborate with cross-functional teams to identify and prioritize data engineering requirements and to develop and deploy data-driven solutions to address business challenges.
    Build and maintain scalable, fault-tolerant, and high-performance data storage and retrieval systems (e.g., data lakes, data warehouses, databases) on cloud infrastructure such as AWS, Azure, or Google Cloud Platform.
    Develop and maintain ETL workflows, data pipelines, and data transformation processes to prepare data for machine learning and AI applications.
    Implement and optimize distributed computing frameworks such as Hadoop, Spark, or Flink to support high-performance and scalable processing of large data sets.
    Build and maintain monitoring, alerting, and logging systems to ensure the availability, reliability, and performance of data pipelines and data platforms.
    Collaborate with Data Scientists and Machine Learning Engineers to deploy models on production environments and ensure their scalability, reliability, and accuracy.
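
Purely as an illustrative aside (not part of the job description): a minimal PySpark sketch of the kind of ETL transformation described in the responsibilities above, assuming hypothetical input/output paths and a JSON source.

```python
# Minimal PySpark ETL sketch: read raw JSON events, clean and aggregate,
# and write Parquet ready for downstream ML/analytics. Paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")           # hypothetical source
clean = (
    raw.dropDuplicates(["event_id"])                                # de-duplicate
       .filter(F.col("event_ts").isNotNull())                       # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))             # derive partition column
)
daily = clean.groupBy("event_date", "user_id").agg(F.count("*").alias("events"))

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_user_events/"                # hypothetical sink
)
```
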
    Requirements:
    Bachelor's or master's degree in computer science, engineering, or a related field.
    At least 5-6 years of experience in data engineering, with a strong background in machine learning, cloud computing and big data technologies.
    Experience with at least one major cloud platform (AWS, Azure, GCP).
    Proficiency in programming languages like Python, Java, and SQL.
    Experience with distributed computing technologies such as Hadoop, Spark, and Kafka.
    Familiarity with database technologies such as SQL, NoSQL, NewSQL.
    Experience with data warehousing and ETL tools such as Redshift, Snowflake, or Airflow.
    Strong problem-solving and analytical skills.
    Excellent communication and teamwork skills.
    Preferred Qualifications:
    Experience with DevOps practices and tools such as Docker, Kubernetes, Ansible, or Terraform.
    Experience with data visualization tools such as Tableau, Superset, Power BI, Plotly, or D3.js.
    Experience with stream processing frameworks such as Kafka, Pulsar or Kinesis.
    Experience with data governance, data security, and compliance.
    Experience with software engineering best practices and methodologies such as Agile or Scrum.
    Must-Have Skills:
    Data engineering expertise in machine learning, cloud computing, and big data technologies
    Data engineering experience on multiple clouds, one of them preferably GCP
    Data lakes, data warehouses, and databases
    ETL workflows, data pipelines, and data platforms
    Distributed processing with Hadoop, Spark, Flink, and Kafka
    SQL, NoSQL, and NewSQL databases
    Redshift, Snowflake, or Airflow


    Data Engineering

    Gurgaon, Haryana Confidential

    Posted today


    Job Description

    Role Responsibilities:

    • Design and build scalable data pipelines and architectures
    • Integrate and transform large-scale datasets from varied sources
    • Collaborate with solution designers to implement data ingestion and enrichment
    • Promote coding standards and ensure data quality in agile delivery

    Job Requirements:

    • Strong experience with Hadoop ecosystem (Hive, Spark), Java/Scala
    • Proficient in SQL, Shell Scripting, and version control tools
    • Exposure to AWS cloud infrastructure and data warehouses
    • Familiarity with Agile methodology and automation tools like GitHub and TeamCity

    Skills Required
    Data Engineering, Hadoop, Spark, Scala, AWS, SQL

    Data Engineering

    Nagpur, Maharashtra Confidential

    Posted today


    Job Description

    Develop, optimize, and maintain robust ETL pipelines for data ingestion, transformation, and processing.

    Design and implement scalable data solutions on Azure Cloud Services, leveraging tools like Azure Data Factory, Databricks, and Key Vault.

    Work with real-time and batch data processing systems, ensuring data accuracy and availability.

    Collaborate with cross-functional teams to define data engineering requirements and deliver efficient solutions in an Agile environment.

    Perform advanced SQL queries and Python scripting for data extraction, transformation, and analysis.

    Ensure data security, integrity, and compliance with organizational and industry standards.

    Monitor, troubleshoot, and enhance data pipelines using modern tools and techniques.

    Use scheduling tools like Control-M or equivalent to automate and orchestrate workflows.

    Document technical workflows, processes, and troubleshooting guides for stakeholders.
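
As an illustrative aside only: a minimal Python sketch of pulling a pipeline credential from Azure Key Vault, one small piece of the Azure Data Factory/Databricks workflows described above. The vault URL and secret name are hypothetical.

```python
# Minimal sketch: fetch a connection string from Azure Key Vault for use
# in a data pipeline. Vault URL and secret name are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()                    # works with managed identity or az login
client = SecretClient(
    vault_url="https://example-vault.vault.azure.net",   # hypothetical vault
    credential=credential,
)

conn_str = client.get_secret("sql-connection-string").value   # hypothetical secret name
print("retrieved secret of length:", len(conn_str))           # never log the value itself
```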

    Requirements

    Proficiency in SQL and Python for advanced data manipulation and automation.

    Strong understanding of data engineering principles, including data modeling, data warehousing, and pipeline optimization.

    Hands-on experience with Databricks and its ecosystem.

    Familiarity with Azure cloud services, particularly Azure Data Factory, Key Vault, and other relevant components.

    Experience working with real-time data systems (e.g., Kafka, Event Hub, or similar).

    Good understanding of Agile methodologies and ability to work effectively in collaborative, fast-paced environments.

    Exposure to Snowflake is an added advantage.

    Knowledge of scheduling tools like Control-M or equivalent.

    Strong analytical and problem-solving skills with a proactive mindset.

    Excellent communication and teamwork abilities.


    What we offer

    Culture of caring.  At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

    Learning and development.  We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

    Interesting & meaningful work.  GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

    Balance and flexibility.  We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

    High-trust organization.  We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

    About GlobalLogic

    GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.


    Skills Required
    Data Engineering, Python, SQL

    Senior Data Engineering Analyst - Data Engineering

    Gurgaon, Haryana UnitedHealth Group

    Posted 2 days ago


    Job Description

    Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
    **Primary Responsibilities:**
    + Ensure that all standard requirements have been met and participate in the technical analysis
    + Assist the project manager by compiling information from current systems, analyzing program requirements, and ensuring that work meets the specified time requirements
    + Resolve moderately complex problems associated with the designed programs and provide technical guidance on complex programming
    + Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
    **Required Qualifications:**
    + Graduate degree or equivalent experience
    + 4+ years of experience in database architecture, engineering, design, optimization, security, and administration; as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, data provisioning
    + Good hands-on experience in Azure, ADF and Databricks
    + Experience with RDBMS such as Oracle, SQL Server, DB2, etc.
    + Knowledge of Azure, ADF, Databricks, Scala, Airflow, DWH concepts
    + Good knowledge in Unix and shell scripting
    + Good understanding of Extract, Transform, and Load (ETL) Architecture, Cloud, Spark
    + Good understanding of Data Architecture
    + Ability to understand business needs, design programs and systems that match complex business requirements, and record all specifications involved in the development and coding process
    + Understanding of QA and testing automation process
    + Proven ability to participate in agile development projects for batch and real time data ingestion
    + Proven ability to work with business and peers to define, estimate, and deliver functionality
    + Proven ability to be involved in creating proper technical documentation in the work assignments
    **Preferred Qualifications:**
    + Knowledge of Agile, automation, Big Data, DevOps, Python, Scala/PySpark programming, HDFS, Hive, and AI/ML
    + Understanding and knowledge of Agile
    _At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
    #Nic

    Data Engineering Lead

    Noida, Uttar Pradesh Microsoft Corporation

    Posted 1 day ago


    Job Description

    On Team Xbox, we aspire to empower the world's 3 billion gamers to play the games they want, with the people they want, anywhere they want. Gaming, the largest and fastest growing category in media & entertainment, represents an important growth opportunity for Microsoft. We are leading with innovation, as highlighted by bringing Xbox to new devices with Cloud Gaming, bringing the Game Pass subscription to PC, and our recent acquisition of Activision Blizzard King creating exciting new possibilities for players.
    The Xbox Experiences and Platforms team is home to the engineering work that makes this vision possible, building the developer tools and services that enable game creators to craft incredible experiences, the commerce systems that connect publishers with their audience and help gamers engage with their next favorite games, the platforms on which those games play at their best, and the experiences that turn every screen into an Xbox.
    **Responsibilities**
    Do you want to influence product engineering teams to shape the next generation of data and analytics capabilities for Xbox? The Xbox Platform Data Intelligence Team is looking for a highly motivated Data Engineer with data platform experience. You will work closely with engineering and product management in designing, implementing, and evolving innovative capabilities tailored to drive analytics and insights on engineering features. You will leverage core data pipelines to identify insights and experiment ideas that influence product decisions. Our capabilities influence data-driven decision making across Xbox Leadership, Finance, Business Planning, and Engineering teams.
    Collaboration, diversity, & self-direction are valued here. Expect to be given room and support to grow personally and professionally.
    Technically challenging projects, a healthy and high-caliber team, and game-changing products for excited fans. Don't miss this rewarding opportunity!
    **Responsibilities**
    + Work within and across teams to solve complex technical challenges
    + Develop engineering best practices: continuously evaluate our processes and reporting to identify opportunities to improve, enhance, and automate existing and new capabilities with a fundamental understanding of the end-to-end scenario
    + Measure the success and usage patterns of the product / feature at various levels as well as key engineering metrics
    + Provide thought leadership, creation, and execution on data platform capabilities
    + Grow & foster an inclusive, creative, high-performance team culture
    + Coach & mentor other team members
    + Contribute to a data-driven culture as well as a culture of experimentation across the organization.
    **Qualifications**
    **Required:**
    + Bachelor's or Master's Degree in Computer Science, Mathematics, Software Engineering, Computer Engineering, or a related field, OR equivalent experience, with 8+ years of experience in business analytics, data science, software development, data modeling, or data engineering.
    + Experience working with cloud-based technologies, including relational databases, data warehouse, big data (e.g., Hadoop, Spark), orchestration/data pipeline tools, data lakes.
    + Self-motivated and organized to deliver results
    **Preferred:**
    + 1+ year(s) people management experience
    + Experience with Azure Analytics stack, e.g., Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Data Explorer (Kusto), Azure Cosmos DB, Azure logic apps, Fabric/Power BI
    + Experience in modern DevOps practices (including Git, CI/CD)
    + Good interpersonal and communications (verbal and written) skills, including the ability to effectively communicate with both business and technical teams.
    + Ability to use judgement and rating schemes to turn qualitative information into quantitative estimates
    + Proficiency in scenario analytics, mining for insights
    Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

    Data Engineering Consultant

    Bangalore, Karnataka UnitedHealth Group

    Posted 2 days ago


    Job Description

    Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
    **Primary Responsibilities:**
    + Build high performing and scalable data systems, applications, and data pipelines to process very large amounts of data from multiple sources
    + Develop services, controls, and reusable patterns that enable the team to deliver value safely, quickly, and sustainably in the public cloud and on prem
    + Collaborate on Big Data systems, and features within an Agile environment 
    + Collaborate with cross-function teams of developers, senior architects, product managers, DevOps, and project managers 
    + Deliver solutions that are devoid of significant security vulnerabilities 
    + Foster high-performance, collaborative technical work resulting in high-quality output
    + Proactively automate infrastructure, application and services to enable an automated delivery through the CI/CD pipelines to the cloud and on prem
    + Help convert current on-prem ETL pipelines from .NET to the cloud using Azure Data Factory/Databricks
    + Help maintain the current .NET framework until conversions are complete
    + Display a solid desire to achieve and attain high levels of both internal and external customer satisfaction
    + Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
    **Required Qualifications:**
    + Graduate degree or equivalent work experience
    + 5+ years of experience in development, design, testing, and implementation of complex Oracle database programs in a distributed, service-based enterprise environment
    + 4+ years of hands-on Oracle PL/SQL development
    + 3+ years in software engineering, designing and building high-quality commercial applications
    + 2+ years in Azure/Cloud experience
    + 2+ years with CI/CD
    + 1+ years of practical experience with cloud providers (OpenShift, Kubernetes, AWS, Google Cloud, Azure)
    + Solid experience with Oracle functions, procedures, triggers, packages, and performance tuning
    + Knowledge of Technical assistance, problem resolution, and troubleshooting support
    + Familiarity with Agile/SAFe, Continuous Integration, Continuous Delivery, DevOps
    + Supporter of Open-Source software community
    + Background in Big Data solutions and NoSQL databases (preferred)
    + Expertise in data warehousing and business intelligence concepts
    + Proven analytical problem-solving approach
    + Proven excellent time management, communication, decision-making, and presentation skills
    + Demonstrated solid desire to achieve high levels of customer satisfaction
    + Demonstrated ability to operate in a rapidly changing environment and drive technological innovation
    + Demonstrated excellent time management, communication, decision making, and presentation skills
    + Proven track record of building relationships across cross-functional teams
    **Preferred Qualification:**
    + Experience with Java or open-source technologies, developing RESTful APIs
    _At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
    #Nic

    Data Engineering Consultant

    Hyderabad, Telangana UnitedHealth Group

    Posted 2 days ago


    Job Description

    Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
    **Primary Responsibilities:**
    + Participate in scrum process and deliver stories/features according to the schedule
    + Collaborate with the team, architects, and product stakeholders to understand the scope and design of a deliverable
    + Participate in product support activities as needed by Team
    + Understand product architecture, features being built and come up with product improvement ideas and POCs
    + Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
    **Required Qualifications:**
    + Deep experience in Data analysis, including source data analysis, data profiling and mapping
    + Good experience in building data pipelines using ADF/Azure Databricks
    + Hands-on experience with a large-scale data warehouse
    + Hands-on data migration experience from legacy systems to new solutions, such as on-premise clusters to cloud
    + Experience:
    + Experience building ML models
    + DevOps; implementation of Big Data, Apache Spark, and Azure Cloud
    + Large-scale data processing using PySpark on the Azure ecosystem
    + Implementation of a self-service analytics platform ETL framework using PySpark on Azure
    + Good knowledge of GenAI, RAG, and LLMs
    + Tools/Technologies:
    + Programming languages: Python, PySpark
    + Cloud technologies: Azure (ADF, Databricks, Web App, Key Vault, SQL Server, Function App, Logic App, Synapse, Azure Machine Learning, DevOps)
    + ML Models, GPT, NLP Algorithms
    + Expert skills in Azure data processing tools like Azure Data Factory, Azure Databricks
    + Solid proficiency in SQL and complex queries
    + Proven ability to learn and adapt to new data technologies
    + Proven problem-solving skills
    + Proven communication skills
    **Preferred Qualifications:**
    + Knowledge of the US healthcare industry and pharmacy data
    + Knowledge or experience using Azure Synapse and Power BI
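
Illustrative aside (not part of the posting): a minimal sketch of the embedding-based retrieval step behind the RAG work mentioned in the qualifications above, using sentence-transformers and cosine similarity. The model name, documents, and query are just examples.

```python
# Minimal retrieval sketch for a RAG-style pipeline: embed documents and a
# query, then rank documents by cosine similarity. Example data only.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")   # small general-purpose embedding model

docs = [
    "Prior authorization is required for specialty pharmacy claims.",
    "Members can refill prescriptions through the mail-order pharmacy.",
    "Data pipelines load claims data into the analytics warehouse nightly.",
]
query = "How do members refill a prescription?"

doc_vecs = model.encode(docs, normalize_embeddings=True)        # shape (3, dim), unit vectors
query_vec = model.encode([query], normalize_embeddings=True)[0]

scores = doc_vecs @ query_vec                                   # cosine similarity
best = int(np.argmax(scores))
print("best match:", docs[best])
```
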
    _At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._

    Data Engineering Consultant

    Bangalore, Karnataka UnitedHealth Group

    Posted 2 days ago


    Job Description

    Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
    **Primary Responsibilities:**
    + Provide database administration for mission-critical custom and packaged software applications
    + Design storage strategies around backup and recovery for complex database environments, physical structures, and specialized database applications - Enterprise Rapid Recovery
    + Uphold enterprise policy guidelines and recommend new/improved guidelines
    + Partner with project teams and interact with customers to find solutions for projects and operational issues for existing and proposed databases
    + Act as business liaison serving as primary point of contact between application business segments and database administrators
    + Demonstrate the knowledge and ability to perform in all of the basic database management skills of database administration, Web connectivity, physical structure, overall architecture, and database analysis
    + Provide standardization and consistency across environments
    + Ensure a stable and secure database environment
    + Apply database management consulting skills and gathers user requirements
    + Implement and monitor database functionality to ensure stable environments
    + Identify and initiate resolutions to user problems and concerns associated with database server equipment (hardware and software)
    + Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
    **Required Qualifications:**
    + Bachelor's degree in Computer Science or related field or equivalent work experience
    + 7+ years of Postgres DBA experience
    + 5+ years of MySQL database experience
    + 4+ years of experience in Azure Cloud
    + Hands-on experience in administering Percona XtraDB Cluster
    + Hands-on experience with MySQL and Postgres replication tools
    + Experience in configuring and setting up repmgr or similar technology for high-availability solutions
    + Solid knowledge of MySQL and Postgres architecture and administration on both on-prem and cloud platforms
    + Good understanding of PostgreSQL database architecture and repmgr cluster architecture in the Azure cloud
    + Expertise in MySQL and Postgres database refresh and restore activities
    + Expertise in Azure cloud architecture
    + Proven ability to install, monitor, and maintain PostgreSQL and MySQL software, and implement monitoring and alerting
    + Proven ability to provide system and SQL performance tuning and assist in business process integration with various data sources
    + Proven ability to fulfill user requests ranging from access control, backup, restore, refresh to non-production to performance tuning
    + Proven ability to provide high availability and Disaster Recovery solutions
    **Preferred Qualifications:**
    + Experience in setting up PostgreSQL clusters and handling switchover/failover activities; knowledge of repmgr
    + Work experience in backup and recovery processes and database refresh between environments; knowledge of Percona XtraBackup
    + Knowledge of Oracle database administration
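
Illustrative aside only: a small Python sketch of the kind of replication monitoring implied by the qualifications above, querying pg_stat_replication on a PostgreSQL primary with psycopg2. Connection details are hypothetical.

```python
# Minimal sketch: check streaming-replication standbys and their replay lag
# from the primary. Connection parameters are hypothetical.
import psycopg2

conn = psycopg2.connect(host="primary.example.internal", dbname="postgres",
                        user="monitor", password="***")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT client_addr,
               state,
               pg_wal_lsn_diff(sent_lsn, replay_lsn) AS replay_lag_bytes
        FROM pg_stat_replication;
    """)
    for client_addr, state, lag_bytes in cur.fetchall():
        print(f"{client_addr}  state={state}  replay lag={lag_bytes} bytes")
conn.close()
```
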
    _At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._

    Data Engineering Analyst

    Noida, Uttar Pradesh UnitedHealth Group

    Posted 2 days ago


    Job Description

    Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
    We are seeking a talented and motivated Data Engineer to join our growing data team. You will play a key role in building scalable data pipelines, optimizing data infrastructure, and enabling data-driven solutions.
    **Primary Responsibilities:**
    + Design, develop, and maintain scalable ETL/ELT pipelines for batch and real-time data processing
    + Build and optimize data models and data warehouses to support analytics and reporting
    + Collaborate with analysts and software engineers to deliver high-quality data solutions
    + Ensure data quality, integrity, and security across all systems
    + Monitor and troubleshoot data pipelines and infrastructure for performance and reliability
    + Contribute to internal tools and frameworks to improve data engineering workflows
    + Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
    **Required Qualifications:**
    + 5+ years of experience working on commercially available software and / or healthcare platforms as a Data Engineer
    + 3+ years of solid experience designing and building Enterprise Data solutions on cloud
    + 1+ years of experience developing solutions hosted within public cloud providers such as Azure or AWS or private cloud/container-based systems using Kubernetes/OpenShift
    + Experience with modern relational databases
    + Experience with Data warehousing services preferably Snowflake
    + Experience in using modern software engineering and product development tools, including Agile/SAFe, Continuous Integration, Continuous Delivery, DevOps, etc.
    + Solid experience of operating in a quickly changing environment and driving technological innovation to meet business requirements
    + Skilled at optimizing SQL statements
    + Subject matter expert on Cloud technologies preferably Azure and Big Data ecosystem
    **Preferred Qualifications:**
    + Experience with real-time data streaming and event-driven architectures
    + Experience building Big Data solutions on public cloud (Azure)
    + Experience building data pipelines on Azure with Databricks Spark, Scala, Azure Data Factory, Kafka and Kafka Streams, App Services, and Azure Functions
    + Experience developing RESTful Services in .NET, Java or any other language
    + Experience with DevOps in Data engineering
    + Experience with Microservices architecture
    + Exposure to DevOps practices and infrastructure-as-code (e.g., Terraform, Docker)
    + Knowledge of data governance and data lineage tools
    + Ability to establish repeatable processes, best practices and implement version control software in a Cloud team environment
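
Illustrative aside only: a minimal Python sketch of the real-time, event-driven ingestion pattern referenced in the preferred qualifications above, using kafka-python; the topic, brokers, and message shape are hypothetical.

```python
# Minimal sketch: consume JSON events from a Kafka topic and buffer them for
# a batch load into a warehouse staging table. Names are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "claims-events",                                     # hypothetical topic
    bootstrap_servers=["broker1.example.internal:9092"],
    group_id="staging-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                                # flush in small batches
        print(f"would load {len(batch)} rows into the staging table here")
        batch.clear()
```
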
    _At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
     
