Data Science & Engineering Intern

Delhi, Delhi | January Capital

Posted 2 days ago


Job Description

About the firm
At January Capital, we look to invest in technology businesses that will power growth in the Asia-Pacific region over the coming decades. We are committed to investing in founders who challenge the status quo, and we are inspired by individuals willing to dedicate themselves to solving some of the region’s biggest challenges. We manage more than US$300 million of assets on behalf of our investors. Our portfolio includes aCommerce, Akulaku, Shopback, Marqo, Glomopay, GO1, Tazapay, and others.

One of the differentiators of our firm is our ambition to be a truly product-led investment firm: in line with that, we build our own data products, apply machine learning techniques, and deploy workflow automations across all facets of our business. As such, we are looking for bright and motivated interns from Tier-1 institutes (IITs, NITs, IIITs, BITS, ISI, etc.) to join our Data Science & Engineering team.

About the role
Data is at the core of everything we do at January Capital – since our inception, we have leveraged proprietary datasets and tools with the goal of meeting the most promising founding teams at the earliest stage. As our ambition as a firm continues to grow, we are looking to further expand our capabilities in deploying AI-driven data pipelines and workflows that build on our existing data product strategy. To continue on this trajectory, we’re searching for interns to work closely with our existing data science & engineering and investment teams. This role will be based in India.

The focus is on building and scaling the data products, ML pipelines, and internal tooling that power an increasingly product-led investing platform. The role blends pragmatic engineering with applied data science. Interns work directly with the Data Science & Engineering team and partner closely with investment professionals to translate requirements into robust, production-ready solutions. High-performing interns may have the opportunity to convert to a full-time Analyst or Engineer role post-internship.

At a high level you will be:
• Working with real-world data at scale
• Building cutting-edge data pipelines
• Applying ML/AI to solve impactful problems
• Learning hands-on in a fast-paced environment

Ideally, we would like candidates to commit for 3–6 months on a full-time basis.

What you’ll bring:
• Strong analytical foundation from an engineering, CS, economics, statistics, or related discipline, with comfort moving between exploratory analysis and production engineering.
• Proficiency in Python and SQL; familiarity with ETL workflows, orchestration tools (Airflow/Prefect), dbt, and Docker, along with experience or interest in cloud platforms (primarily AWS) and emerging areas such as Generative AI and agentic AI, will be an added advantage (a minimal orchestration sketch follows this list).
• Curiosity for private markets and APAC technology ecosystems; ability to convert open-ended questions into measurable experiments.
• Bias to ship: iterative mindset, clear communication, and ownership over outcomes.
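
For context on the orchestration tooling mentioned above, here is a minimal, illustrative sketch of a Prefect ETL flow. It is not January Capital's actual pipeline; the endpoint URL, field names, and output path are hypothetical placeholders.

```python
# Illustrative only: a minimal Prefect flow for a small ETL job.
# The endpoint URL, column name, and output path are hypothetical placeholders.
import pandas as pd
import requests
from prefect import flow, task


@task(retries=2, retry_delay_seconds=30)
def extract(url: str) -> list:
    """Pull raw JSON records from a (hypothetical) source API."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()


@task
def transform(records: list) -> pd.DataFrame:
    """Normalize records and keep only rows with a usable funding amount."""
    df = pd.DataFrame(records)
    df = df.dropna(subset=["funding_usd"])
    df["funding_usd"] = df["funding_usd"].astype(float)
    return df


@task
def load(df: pd.DataFrame, path: str) -> None:
    """Persist the cleaned table; a production pipeline might write to a warehouse instead."""
    df.to_csv(path, index=False)


@flow(name="example-funding-etl")
def funding_etl(url: str = "https://example.com/api/funding_rounds",
                path: str = "funding_rounds.csv") -> None:
    load(transform(extract(url)), path)


if __name__ == "__main__":
    funding_etl()
```

The same flow could equally be scheduled via a Prefect deployment or rewritten as an Airflow DAG; the point is simply that the extract, transform, and load steps are factored into small, retryable tasks.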

A minimum of a bachelor’s degree in an analytical discipline (e.g., Computer Science, Engineering, Economics, Finance) is required; candidates from Tier-1 institutes (IITs, NITs, IIITs, BITS, ISI, etc.) are strongly encouraged to apply.

Duration: 3–6 months
Location: Remote
Stipend: Competitive

Data Engineering Lead

Noida, Uttar Pradesh | Microsoft Corporation

Posted 2 days ago


Job Description

On Team Xbox, we aspire to empower the world's 3 billion gamers to play the games they want, with the people they want, anywhere they want. Gaming, the largest and fastest-growing category in media & entertainment, represents an important growth opportunity for Microsoft. We are leading with innovation, as highlighted by bringing Xbox to new devices with Cloud Gaming, bringing the Game Pass subscription to PC, and our recent acquisition of Activision Blizzard King, which creates exciting new possibilities for players.
The Xbox Experiences and Platforms team is home to the engineering work that makes this vision possible, building the developer tools and services that enable game creators to craft incredible experiences, the commerce systems that connect publishers with their audience and help gamers engage with their next favorite games, the platforms on which those games play at their best, and the experiences that turn every screen into an Xbox.
Do you want to influence product engineering teams to shape the next generation of data and analytics capabilities for Xbox? The Xbox Platform Data Intelligence Team is looking for a highly motivated Data Engineer with data platform experience. You will work closely with engineering and product management in designing, implementing, and evolving innovative capabilities tailored to drive analytics and insights on engineering features. You will leverage core data pipelines to identify insights and experiment ideas that influence product decisions. Our capabilities influence data-driven decision making across Xbox Leadership, Finance, Business Planning, and Engineering teams.
Collaboration, diversity, & self-direction are valued here. Expect to be given room and support to grow personally and professionally.
Technically challenging projects, a healthy and high-caliber team, and game-changing products for excited fans: don't miss this rewarding opportunity!
**Responsibilities**
+ Work within and across teams to solve complex technical challenges
+ Develop engineering best practices: continuously evaluate our processes and reporting to identify opportunities to improve, enhance, and automate existing and new capabilities, with a fundamental understanding of the end-to-end scenario
+ Measure the success and usage patterns of the product / feature at various levels as well as key engineering metrics
+ Provide thought leadership, creation, and execution on data platform capabilities
+ Grow & foster an inclusive, creative, high-performance team culture
+ Coach & mentor other team members
+ Contribute to a data-driven culture as well as a culture of experimentation across the organization.
**Qualifications**
**Required:**
+ Bachelor's or Master's Degree in Computer Science, Mathematics, Software Engineering, Computer Engineering, or a related field, OR equivalent experience, with 8+ years of experience in business analytics, data science, software development, data modeling, or data engineering.
+ Experience working with cloud-based technologies, including relational databases, data warehouses, big data frameworks (e.g., Hadoop, Spark), orchestration/data pipeline tools, and data lakes (a brief PySpark sketch follows the preferred qualifications below)
+ Self-motivated and organized to deliver results
**Preferred:**
+ 1+ year(s) people management experience
+ Experience with the Azure Analytics stack, e.g., Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Data Explorer (Kusto), Azure Cosmos DB, Azure Logic Apps, Fabric/Power BI
+ Experience in modern DevOps practices (including Git, CI/CD)
+ Good interpersonal and communication skills (verbal and written), including the ability to effectively communicate with both business and technical teams.
+ Ability to use judgement and rating schemes to turn qualitative information into quantitative estimates
+ Proficiency in scenario analytics, mining for insights
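
As a brief, hedged illustration (not part of the original posting) of the usage-metrics work described in the responsibilities above, here is a minimal PySpark aggregation job. The storage account, container, and column names are hypothetical placeholders.

```python
# Illustrative only: aggregate product-usage telemetry into daily engagement metrics with PySpark.
# The ADLS paths and column names (user_id, platform, session_minutes, event_date) are hypothetical,
# and the job assumes ADLS credentials are already configured for the Spark session.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("usage-metrics-sketch").getOrCreate()

events = spark.read.parquet("abfss://telemetry@examplelake.dfs.core.windows.net/events/")

daily_metrics = (
    events
    .groupBy("event_date", "platform")
    .agg(
        F.countDistinct("user_id").alias("daily_active_users"),
        F.sum("session_minutes").alias("total_session_minutes"),
        F.avg("session_minutes").alias("avg_session_minutes"),
    )
)

# Write curated output as partitioned Parquet; a real pipeline might instead land it in a
# lakehouse table or feed Azure Data Explorer (Kusto) / Power BI reporting.
daily_metrics.write.mode("overwrite").partitionBy("event_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/daily_usage_metrics/"
)
```
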
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Data Engineering Analyst

Noida, Uttar Pradesh | UnitedHealth Group

Posted 3 days ago


Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
We are seeking a talented and motivated Data Engineer to join our growing data team. You will play a key role in building scalable data pipelines, optimizing data infrastructure, and enabling data-driven solutions.
**Primary Responsibilities:**
+ Design, develop, and maintain scalable ETL/ELT pipelines for batch and real-time data processing
+ Build and optimize data models and data warehouses to support analytics and reporting
+ Collaborate with analysts and software engineers to deliver high-quality data solutions
+ Ensure data quality, integrity, and security across all systems
+ Monitor and troubleshoot data pipelines and infrastructure for performance and reliability
+ Contribute to internal tools and frameworks to improve data engineering workflows
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ 5+ years of experience working on commercially available software and / or healthcare platforms as a Data Engineer
+ 3+ years of solid experience designing and building Enterprise Data solutions on cloud
+ 1+ years of experience developing solutions hosted within public cloud providers such as Azure or AWS or private cloud/container-based systems using Kubernetes/OpenShift
+ Experience with modern relational databases
+ Experience with data warehousing services, preferably Snowflake
+ Experience using modern software engineering and product development tools, including Agile/SAFe, Continuous Integration, Continuous Delivery, DevOps, etc.
+ Solid experience operating in a quickly changing environment and driving technological innovation to meet business requirements
+ Skilled at optimizing SQL statements
+ Subject matter expertise in cloud technologies (preferably Azure) and the big data ecosystem
**Preferred Qualifications:**
+ Experience with real-time data streaming and event-driven architectures (a minimal PySpark Structured Streaming sketch follows this list)
+ Experience building Big Data solutions on public cloud (Azure)
+ Experience building data pipelines on Azure using Databricks (Spark, Scala), Azure Data Factory, Kafka and Kafka Streams, App Services, and Azure Functions
+ Experience developing RESTful Services in .NET, Java or any other language
+ Experience with DevOps in Data engineering
+ Experience with Microservices architecture
+ Exposure to DevOps practices and infrastructure-as-code (e.g., Terraform, Docker)
+ Knowledge of data governance and data lineage tools
+ Ability to establish repeatable processes and best practices, and to implement version control software in a cloud team environment
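
As a hedged illustration of the real-time, event-driven work referenced above (not taken from the posting), here is a minimal PySpark Structured Streaming job reading from Kafka. The broker address, topic, schema, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Illustrative sketch of a real-time ingestion job with PySpark Structured Streaming and Kafka.
# Broker address, topic, schema, and checkpoint/output paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("claims-stream-sketch").getOrCreate()

schema = StructType([
    StructField("claim_id", StringType()),
    StructField("member_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "claims-events")
    .load()
)

# Kafka delivers bytes; cast the value to string and parse it against the declared schema.
parsed = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e")).select("e.*")

# Basic quality gate: drop records missing a key or with non-positive amounts.
clean = parsed.filter(F.col("claim_id").isNotNull() & (F.col("amount") > 0))

query = (
    clean.writeStream.format("parquet")
    .option("path", "/data/curated/claims/")
    .option("checkpointLocation", "/data/checkpoints/claims/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```
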
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission._

Data Engineering Consultant

Noida, Uttar Pradesh | UnitedHealth Group

Posted 3 days ago


Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start **Caring. Connecting. Growing together.**
**Primary Responsibilities:**
+ Ingest data from multiple on-prem and cloud data sources using various tools & capabilities in Azure
+ Design and develop Azure Databricks processes using PySpark/Spark-SQL (a minimal, testable PySpark sketch follows this list)
+ Design and develop orchestration jobs using ADF, Databricks Workflow
+ Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest solutions for improvement
+ Build a test framework for Databricks notebook jobs to enable automated testing before code deployment
+ Design and build POCs to validate new ideas, tools, and architectures in Azure
+ Continuously explore new Azure services and capabilities; assess their applicability to business needs
+ Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
+ Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
+ Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
+ Ensure solutions adhere to security, compliance, and governance standards
+ Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
+ Identify solutions to non-standard requests and problems
+ Mentor and support existing on-prem developers in the cloud environment
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
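
To make the testing responsibility above concrete, here is a small, hypothetical sketch of a Databricks-style PySpark transformation factored into a plain function so it can be unit-tested locally (e.g., with pytest) before the notebook job is deployed. Table, column, and function names are assumptions, not Optum code.

```python
# Illustrative only: a PySpark transformation written as a plain function so it can be tested
# outside a Databricks notebook. Column names and the example data are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window


def dedupe_latest(df: DataFrame, key: str, ts_col: str) -> DataFrame:
    """Keep the most recent record per key, a common step before merging into a curated table."""
    w = Window.partitionBy(key).orderBy(F.col(ts_col).desc())
    return df.withColumn("_rn", F.row_number().over(w)).filter(F.col("_rn") == 1).drop("_rn")


def test_dedupe_latest():
    """Runs on a local Spark session (e.g., via pytest) before the notebook job is deployed."""
    spark = SparkSession.builder.master("local[1]").appName("dedupe-test").getOrCreate()
    df = spark.createDataFrame(
        [("a", "2024-01-01"), ("a", "2024-02-01"), ("b", "2024-01-15")],
        ["id", "updated_at"],
    )
    out = dedupe_latest(df, "id", "updated_at")
    assert out.count() == 2
    assert out.filter("id = 'a'").first()["updated_at"] == "2024-02-01"
```
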
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 7+ years of overall experience in Data & Analytics engineering
+ 5+ years of experience working with Azure, Databricks, ADF, and Data Lake
+ 5+ years of experience working with data platform or product using PySpark and Spark-SQL
+ Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
+ In-depth understanding of Azure architecture and the ability to design efficient solutions
+ Highly proficient in Python and SQL
+ Proven excellent communication skills
+ **Key Skills:** Azure Data Engineering - Azure Databricks, Azure Data Factory, Python/PySpark, Terraform
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission._