206 Data Engineering jobs in India
Big Data Engineering Vice President

Posted 1 day ago
Job Description
Responsibilities:
- Manage one or more Applications Development teams in an effort to accomplish established goals
- Utilize in-depth knowledge and skills across multiple Applications Development areas to provide technical oversight across systems and applications
- Review/analyze/develop proposed technical solutions for projects
- Contribute to formulation of strategies for applications development and other functional areas
- Develop comprehensive knowledge of how areas of business integrate to accomplish business goals
- Provide evaluative judgment based on analysis of factual data in complicated and unique situations
- Impact the Applications Development area by monitoring delivery of end results, participating in budget management, and handling day-to-day staff management issues, including resource management and allocation of work within the team/project
- Ensure essential procedures are followed and contribute to defining standards, negotiating with external parties when necessary
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervise the activity of others and create accountability with those who fail to maintain these standards.
Core Skills:
The Data Engineering lead will work closely with, and manage the work of, a team of data engineers on our Big Data Platform. The lead will need the following core skills:
- Strong understanding of Big Data architecture and the ability to troubleshoot performance and/or development issues on Hadoop (preferably Cloudera)
- Hands-on experience working with Hive, Impala, Kafka, HBase, Spark for data curation/conformance related work.
- Strong proficiency in Spark for development work related to curation/conformance. Strong Scala development (with previous Java background) preferred.
- Experience with Spark/Kafka or equivalent streaming/batch processing and event-based messaging.
- Strong data analysis skills and the ability to slice and dice the data as needed for business reporting.
- Relational SQL (Oracle, SQL Server), NoSQL (MongoDB) and Cache (Couchbase) database integration and data distribution principles experience
- Leadership & Mentorship: Ability to guide and mentor junior developers, fostering a collaborative team environment and promoting professional growth.
- Communication Skills: Strong communication skills, both written and verbal, with the ability to explain complex technical concepts to both technical and non-technical audiences.
- DevOps Practices: Experience working in a Continuous Integration and Continuous Delivery environment and familiarity with tools like Jenkins, TeamCity, SonarQube, OpenShift, ECS, or Kubernetes.
- Software Engineering Principles: Proficient in industry-standard best practices such as Design Patterns, Coding Standards, Coding modularity, and Prototyping.
- Data Visualization: Experience with data visualization tools and techniques for presenting data insights effectively.
- Agile Methodologies: Familiarity with agile development methodologies and experience working in agile and scaled agile teams.
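The curation/conformance work named above (Hive, Impala, Spark) typically boils down to SQL that deduplicates raw records and standardizes values. As an illustrative sketch only, here is that kind of conformance pass using Python's built-in `sqlite3` as a lightweight stand-in for Hive/Impala SQL; the table and column names are hypothetical.

```python
import sqlite3

# In-memory database standing in for a Hadoop/Hive raw zone.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE raw_trades (trade_id TEXT, country TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO raw_trades VALUES (?, ?, ?)",
    [("T1", "usa", 100.0), ("T1", "usa", 100.0),   # duplicate record
     ("T2", "IN ", 250.0), ("T3", "gb", 75.5)],
)

# Conformance: deduplicate and standardize country codes (trim + upper-case).
cur.execute("""
    CREATE TABLE curated_trades AS
    SELECT DISTINCT trade_id, UPPER(TRIM(country)) AS country, amount
    FROM raw_trades
""")
rows = cur.execute("SELECT * FROM curated_trades ORDER BY trade_id").fetchall()
print(rows)  # [('T1', 'USA', 100.0), ('T2', 'IN', 250.0), ('T3', 'GB', 75.5)]
```

The same `SELECT DISTINCT ... UPPER(TRIM(...))` shape ports directly to Hive or Spark SQL at scale.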
Additional Requirements (Nice to have):
- Cloudera/Hortonworks/AWS EMR, S3 experience a plus
- Experience with Cloud Integration on AWS, Snowflake or GCP tech stack components.
- Experience with API development and use of JSON/XML/Hypermedia data formats.
- Analysis and development experience across lines of business, products, and functions (including Payments, Digital Channels, Liquidity, Trade, Sales, Pricing, and Client Experience), with cross-functional and/or technical knowledge
Qualifications:
- 12+ years of relevant experience in Big Data application development
- Experience as an Applications Development Manager
- Senior-level experience in an Applications Development role
- Stakeholder and people management experience
- Demonstrated leadership skills
- Proven project management skills
- Basic knowledge of industry practices and standards
Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred
Other Relevant Skills
Apache Hadoop, Apache Hive, Apache Impala, Apache Kafka, Apache Spark, Big Data, Java, MongoDB, NoSQL, Oracle Database, Python (Programming Language), Relational Database Management System (RDBMS).
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.
Data Engineering
Posted 2 days ago
Job Description
Responsibilities:
- Work with stakeholders to understand the data requirements to design, develop, and maintain complex ETL processes.
- Create the data integration and data diagram documentation.
- Lead the data validation, UAT and regression test for new data asset creation.
- Create and maintain data models, including schema design and optimization.
- Create and manage data pipelines that automate the flow of data, ensuring data quality and consistency.
Qualifications and Skills:
- Strong knowledge of Python and PySpark, with the ability to write PySpark scripts for developing data workflows.
- Strong knowledge of SQL, Hadoop, Hive, Azure, Databricks, and Greenplum, including writing SQL to query metadata and tables from different data management systems such as Oracle, Hive, Databricks, and Greenplum.
- Familiarity with big data technologies like Hadoop, Spark, and distributed computing frameworks.
- Ability to use Hue to run Hive SQL queries and to schedule Apache Oozie jobs that automate data workflows.
- Experience communicating with stakeholders and collaborating effectively with the business team on data testing.
- Strong problem-solving and troubleshooting skills.
- Ability to establish comprehensive data quality test cases and procedures and to implement automated data validation processes.
- Degree in Data Science, Statistics, Computer Science, or another related field, or an equivalent combination of education and experience.
- 3-7 years of experience as a Data Engineer.
- Proficiency in programming languages commonly used in data engineering, such as Python, PySpark, and SQL.
- Experience with the Azure cloud computing platform, such as developing ETL processes using Azure Data Factory and big data processing and analytics with Azure Databricks.
- Strong communication, problem-solving, and analytical skills, with the ability to manage time and multi-task with attention to detail and accuracy.
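Querying catalog metadata, as described above, looks much the same across engines; each exposes its own catalog (Oracle's `ALL_TABLES`, the Hive metastore, Databricks' `information_schema`). As a minimal, hedged sketch, `sqlite3`'s `sqlite_master` plays that role here.

```python
import sqlite3

# Hypothetical schema standing in for tables registered in a metastore.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")

# Metadata query: list the tables known to the catalog.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)  # ['customers', 'orders']
```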
Data Engineering Manager

Posted 1 day ago
Job Description
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission-to serve patients living with serious illnesses-drives all that we do.
Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives.
Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
**Data Engineering Manager**
**What you will do**
Let's do this. Let's change the world. In this vital role you will lead a team of data engineers to build, optimize, and maintain scalable data architectures, data pipelines, and operational frameworks that support real-time analytics, AI-driven insights, and enterprise-wide data solutions. As a strategic leader, you will drive best practices in data engineering, cloud technologies, and Agile development, ensuring robust governance, data quality, and efficiency. The role requires technical expertise, team leadership, and a deep understanding of cloud data solutions to optimize data-driven decision-making.
+ Lead and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous learning for solving complex problems of R&D division.
+ Oversee the development of data extraction, validation, and transformation techniques, ensuring ingested data is of high quality and compatible with downstream systems.
+ Guide the team in writing and validating high-quality code for data ingestion, processing, and transformation, ensuring resiliency and fault tolerance.
+ Drive the development of data tools and frameworks for running and accessing data efficiently across the organization.
+ Oversee the implementation of performance monitoring protocols across data pipelines, ensuring real-time visibility, alerts, and automated recovery mechanisms.
+ Coach engineers in building dashboards and aggregations to monitor pipeline health and detect inefficiencies, ensuring optimal performance and cost-effectiveness.
+ Lead the implementation of self-healing solutions, reducing failure points and improving pipeline stability and efficiency across multiple product features.
+ Oversee data governance strategies, ensuring compliance with security policies, regulations, and data accessibility best practices.
+ Guide engineers in data modeling, metadata management, and access control, ensuring structured data handling across various business use cases.
+ Collaborate with business leaders, product owners, and cross-functional teams to ensure alignment of data architecture with product requirements and business objectives.
+ Prepare team members for key partner discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios.
+ Drive Agile and Scaled Agile (SAFe) methodologies, handling sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery.
+ Stay up-to-date with emerging data technologies, industry trends, and best practices, ensuring the organization uses the latest innovations in data engineering and architecture.
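The "self-healing" pipelines and automated recovery mechanisms mentioned above usually start with retry-with-backoff around flaky tasks. Orchestrators such as Airflow expose this through task-level `retries`/`retry_delay` settings; the standalone sketch below (with a hypothetical flaky ingest task) just illustrates the mechanism.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run task(), retrying on failure with exponential backoff before surfacing the error."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: let the failure surface for alerting
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky task: fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source outage")
    return "ingested"

print(run_with_retries(flaky_ingest))  # ingested
```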
**What we expect of you**
We are all different, yet we all use our unique contributions to serve patients. We are seeking a seasoned Engineering Manager (Data Engineering) to drive the development and implementation of our data strategy, with deep expertise in R&D within the biotech or pharma domain.
**Basic Qualifications:**
+ Doctorate degree **OR**
+ Master's degree and 4 to 6 years of experience in Computer Science, IT or related field **OR**
+ Bachelor's degree and 6 to 8 years of experience in Computer Science, IT or related field **OR**
+ Diploma and 10 to 12 years of experience in Computer Science, IT or related field
+ Experience leading a team of data engineers in the R&D domain of biotech/pharma companies.
+ Experience architecting and building data and analytics solutions that extract, transform, and load data from multiple source systems.
+ Data Engineering experience in R&D for Biotechnology or pharma industry
+ Demonstrated hands-on experience with cloud platforms (AWS) and the ability to architect cost-effective and scalable data solutions.
+ Proficiency in Python, PySpark, SQL.
+ Experience with dimensional data modeling.
+ Experience working with Apache Spark, Apache Airflow.
+ Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
+ Experienced with AWS or GCP or Azure cloud services.
+ Understanding of the end-to-end project/product life cycle
+ Well versed in full-stack development, DataOps automation, logging frameworks, and pipeline orchestration tools.
+ Strong analytical and problem-solving skills to address complex data challenges.
+ Effective communication and interpersonal skills to collaborate with cross-functional teams.
**Preferred Qualifications:**
+ AWS Certified Data Engineer preferred
+ Databricks Certificate preferred
+ Scaled Agile SAFe certification preferred
+ Project Management certifications preferred
+ Data Engineering Management experience in Biotech/Pharma is a plus
+ Experience using graph databases such as Stardog, MarkLogic, Neo4j, or AllegroGraph.
**Soft Skills:**
+ Excellent analytical and troubleshooting skills
+ Strong verbal and written communication skills
+ Ability to work effectively with global, virtual teams
+ High degree of initiative and self-motivation
+ Ability to handle multiple priorities successfully
+ Team-oriented, with a focus on achieving team goals
+ Strong presentation and public speaking skills
**What you can expect of us**
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
**Apply now and make a lasting impact with the Amgen team.**
**careers.amgen.com**
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Data Engineering manager

Posted 1 day ago
Job Description
Who We Are:
Ever wonder who brings the entertainment to your flights? Panasonic Avionics Corporation is #1 in the industry for delivering inflight products such as movies, games, WiFi, and now Bluetooth headphone connectivity!
How exciting would it be to be a part of the innovation that goes into creating technology that delights millions of people in an industry that's here to stay! With our company's history spanning over 40 years, you will have stability, career growth opportunities, and will work with the brightest minds in the industry. And we are committed to a diverse and inclusive culture that will help our organization thrive! We seek diversity in many areas such as background, culture, gender, ways of thinking, skills and more.
If you want to learn more about us, visit our website, where you'll also find a full listing of open job opportunities.
**The Position:**
We are seeking a proven Data Engineering Leader to drive the design, development, and deployment of scalable, secure, and high-performance data solutions. This role will lead high-performing teams, architect cloud-native data platforms, and collaborate closely with business, AI/ML, and BI teams to deliver end-to-end data products that power innovation and strategic decision-making.
The position offers the opportunity to shape data architecture strategy, establish best practices in Lakehouse and streaming solutions, and enable advanced analytics and AI/ML at scale.
**Responsibilities**
**What We're Looking For:**
+ Proven leadership in building and mentoring high-performing data engineering teams.
+ Expertise in architecting cloud-native data platforms on AWS, leveraging services such as EMR, EKS, Glue, Redshift, S3, Lambda, and SageMaker.
+ Strong background in Lakehouse architecture (Glue Catalog, Iceberg, Delta Lake) and distributed processing frameworks (Spark, Hive, Presto).
+ Experience with real-time streaming solutions (Kafka, Kinesis, Flink).
+ Proficiency in orchestrating complex data workflows with Apache Airflow.
+ Hands-on experience with GitLab CI/CD, Terraform, CloudFormation Templates, and Infrastructure-as-Code.
+ Strong understanding of MDM strategies and data governance best practices (GDPR, HIPAA, etc.).
+ Ability to design and develop middleware APIs (REST) to seamlessly integrate data pipelines with applications and analytics platforms.
+ Experience supporting AI/ML teams with feature engineering, training, and deployment pipelines using SageMaker.
+ Solid knowledge of SQL & NoSQL databases (Redshift, DynamoDB, PostgreSQL, Elasticsearch).
+ Familiarity with BI enablement and data modeling for visualization platforms like Amazon QuickSight.
+ In-depth knowledge of security best practices in AWS-based data architectures.
+ Demonstrated success in driving AI/ML initiatives from ideation to production.
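At the heart of the real-time streaming solutions listed above (Kafka, Kinesis, Flink) sits windowed aggregation over an event stream. The plain-Python sketch below shows a tumbling-window count; the event shape and window size are hypothetical, and a production job would express the same logic in Flink or Spark Structured Streaming.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count events per fixed (tumbling) time window, keyed by window start time."""
    counts = defaultdict(int)
    for ts, _payload in events:
        # Align each event timestamp to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts[window_start] += 1
    return dict(counts)

# Hypothetical (timestamp_seconds, payload) events.
events = [(0, "a"), (10, "b"), (59, "c"), (60, "d"), (125, "e")]
print(tumbling_window_counts(events))  # {0: 3, 60: 1, 120: 1}
```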
**Our Principles:**
Contribution to Society | Fairness & Honesty | Cooperation & Team Spirit | Untiring Effort for Improvement | Courtesy & Humility | Adaptability | Gratitude
**What We Offer:**
At Panasonic Avionics Corporation we realize the most important aspects in leading our industry are the bright minds behind everything we do. We are proud to offer our employees a highly competitive, comprehensive and flexible benefits program.
**Qualifications**
**Educational Background:**
+ Bachelor's degree or higher in Computer Science, Data Engineering, Aerospace Engineering, or a related field.
+ Advanced degrees (Master's/PhD) in Data Science or AI/ML are a plus.
REQ-
Technologist, Data Engineering
Posted today
Job Description
Sandisk understands how people and businesses consume data and we relentlessly innovate to deliver solutions that enable today's needs and tomorrow's next big ideas. With a rich history of groundbreaking innovations in Flash and advanced memory technologies, our solutions have become the beating heart of the digital world we're living in and that we have the power to shape.
Sandisk meets people and businesses at the intersection of their aspirations and the moment, enabling them to keep moving and pushing possibility forward. We do this through the balance of our powerhouse manufacturing capabilities and our industry-leading portfolio of products that are recognized globally for innovation, performance and quality.
Sandisk has two facilities recognized by the World Economic Forum as part of the Global Lighthouse Network for advanced 4IR innovations. These facilities were also recognized as Sustainability Lighthouses for breakthroughs in efficient operations. With our global reach, we ensure the global supply chain has access to the Flash memory it needs to keep our world moving forward.
**Job Description**
We are seeking a highly experienced and visionary **Senior Data Scientist** at the **Technologist level** to lead strategic AI/ML and GenAI initiatives. This role demands deep technical expertise, leadership in complex projects, and a passion for innovation in data science and advanced analytics.
**Key Responsibilities**
+ Lead the **end-to-end data science lifecycle**: problem definition, data acquisition, model development, deployment, and monitoring.
+ Architect and implement **scalable AI/ML solutions** using modern frameworks, cloud platforms, and MLOps best practices.
+ Drive **GenAI initiatives** including fine-tuning, prompt engineering, and integration into enterprise applications.
+ Provide **strategic direction and thought leadership** on advanced analytics adoption across the business.
+ Mentor, coach, and upskill a team of data scientists and engineers; foster a culture of **innovation and collaboration**.
+ Partner with cross-functional teams (engineering, product, factory operations, IT) to translate business needs into data-driven solutions.
+ Ensure model **robustness, fairness, interpretability**, and compliance with ethical AI standards.
+ Design and oversee **experimentation frameworks** (A/B testing, causal inference, statistical modeling) for data-driven decision making.
+ Stay ahead of **emerging trends** in AI, ML, and big data technologies; evaluate their potential for business impact.
+ Present insights, models, and strategies to **senior leadership and non-technical stakeholders** in clear, actionable terms.
**Qualifications**
+ MS/ME/MTech/PhD in Data Science, Statistics, Computer Science, or related fields.
+ ~15 years of experience in data science, AI/ML, or advanced analytics, including leadership in complex projects.
+ Proven expertise in:
+ Machine Learning, Deep Learning, and Statistical Modeling
+ Optimization techniques for solving complex, high-dimensional problems.
+ GenAI applications including architectures like RAG, fine-tuning, and LLMOps.
+ Synthetic data generation and handling highly imbalanced and high-volume datasets.
+ Experience with **anomaly detection, pretrained transformers**, and **custom embedding models**.
+ Strong proficiency in **Python and SQL** for data wrangling, analysis, and modeling.
+ Hands-on experience with **TensorFlow, PyTorch, PySpark**, and related AI/ML frameworks.
+ Deep understanding of **Big Data platforms** (e.g., Spark, Hadoop, distributed databases, cloud data warehouses).
+ Experience in **MLOps**: model deployment, monitoring, versioning, and lifecycle management.
+ Strong knowledge of **data architecture, pipelines**, and feature engineering at scale.
+ Familiarity with **data visualization tools**: Tableau, Power BI, Matplotlib, Plotly.
+ Excellent communication and stakeholder management skills, with the ability to influence at senior levels.
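A common statistical baseline for the anomaly-detection experience listed above is a z-score check: flag points far from the mean in standard-deviation units. This is a hedged, stdlib-only sketch (production systems would use learned models; the readings and threshold are hypothetical).

```python
import statistics

def zscore_anomalies(values, threshold=2.0):
    """Return indices of points more than `threshold` population std-devs from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant series: nothing can be anomalous by this measure
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical sensor readings with one obvious spike at index 5.
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 25.0, 10.1, 9.8]
print(zscore_anomalies(readings))  # [5]
```

Note that with small samples the spike inflates the standard deviation itself, which is why the threshold here is 2.0 rather than the textbook 3.0.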
**Additional Information**
All your information will be kept confidential according to EEO guidelines.
Data Engineering Consultant

Posted 1 day ago
Job Description
**Primary Responsibilities:**
+ Responsible for managing the monthly data refreshes and custom processes for clients, including extraction, loading, and data validation
+ Work closely with engineering, Implementation and downstream teams as the client data is refreshed, to answer questions and resolve data issues that arise
+ Investigate data anomalies to determine root cause, specify appropriate changes and work with engineering and downstream teams as the change is implemented and tested
+ Research client questions on data results by identifying underlying data elements leveraged and providing descriptions of data transformations involved
+ Participate in the ongoing invention, testing and use of tools used by the team to improve processes
+ Be innovative in finding opportunities to improve the process either through process improvement or automation
+ Partner with infrastructure team on migration activities and infrastructure changes related to the product or process
+ Leverage the latest technologies and analyze large volumes of data to solve complex problems facing the health care industry.
+ Build and improve standard operation procedures and troubleshooting documents
+ Report on metrics to surface meaningful results and identify areas for efficiency gain
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 6+ years of experience working with data, analyzing data and understanding data
+ 6+ years of experience working with Relational database (SQL, Oracle)
+ 4+ years of experience working with Provider and Payer data
+ 2+ years of experience with AWS
+ Understanding of relational databases and their principles of operation
+ Intermediate skills using Microsoft Excel and Microsoft Word
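A first-line check in the refresh validation described above is reconciling per-table row counts between the source extract and the loaded target. This is a minimal sketch under stated assumptions: the table names and counts are hypothetical, and real validation would add checksums and column-level profiling.

```python
def reconcile_counts(source_counts, target_counts):
    """Return tables whose loaded row count differs from the source extract."""
    mismatches = {}
    for table, expected in source_counts.items():
        actual = target_counts.get(table, 0)  # a missing table counts as 0 rows loaded
        if actual != expected:
            mismatches[table] = {"source": expected, "target": actual}
    return mismatches

# Hypothetical monthly refresh: the payer table dropped rows during load.
source = {"provider": 1200, "payer": 310, "claims": 88000}
target = {"provider": 1200, "payer": 305, "claims": 88000}
print(reconcile_counts(source, target))  # {'payer': {'source': 310, 'target': 305}}
```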
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
#NIC #NJP
Consultant, Data Engineering

Posted 1 day ago
Job Description
Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and shape brand new methodologies, tools, statistical methods and models. What's more, we are in collaboration with leading academics, industry experts and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data.
Join us to do the best work of your career and make a profound social impact as a **Consultant, Data Engineering** on our **Data Engineering** Team in **Bangalore.**
**What you'll achieve**
As a **Consultant, Data Engineering** , you will be responsible for developing technical tools and programs to automate the data management process, integrating medium to large structured data sets. You will have the opportunity to partner with Data Scientists, Architects or Businesses to design strategic projects and improve complex processes.
**You will:**
+ Design and build analytics solutions that deliver transformative insights from extremely large data sets.
+ Design, develop, and implement web applications for self-service delivery of analytics.
+ Design and develop APIs, database objects, machine learning algorithms, and necessary server-side code to support applications.
+ Work closely with team members to quickly integrate new components and features into the current application ecosystem.
+ Continuously evaluate industry trends for opportunities to utilize new technologies and methodologies, and implement these into the solution stack as appropriate.
**Take the first step towards your dream career**
Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:
**Essential Requirements**
+ 13 to 18 years of experience using data engineering technologies, including big data tools such as SQL Server, Teradata, Hadoop, Spark, and R.
+ Experience presenting big data platforms and solution designs to both technical and non-technical stakeholders.
+ Experience in machine learning and the ability to write, tune, and debug performant SQL.
+ Experience in full-stack development using object-oriented languages such as C# and .NET, along with HTML5 and JavaScript.
+ Experience with cloud services or object/function scripting languages
**Desirable Requirements**
+ Demonstrated ability creating rich web interfaces using a modern client-side framework.
**Who we are**
We believe that each of us has the power to make an impact. That's why we put our team members at the center of everything we do. If you're looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we're looking for you.
Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live and play. Join us to build a future that works for everyone because Progress Takes All of Us.
**Application closing date: 15 Oct 2025**
Dell Technologies is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. Read the full Equal Employment Opportunity Policy here ( .
**Job ID:** R
Data Engineering, Associate

Posted 1 day ago
Job Description
At BlackRock, technology has always been at the core of what we do - and today, our technologists continue to shape the future of the industry with their innovative work. We are not only curious but also collaborative and eager to embrace experimentation as a means to solve complex challenges. Here you'll find an environment that promotes working across teams, businesses, regions and specialties - and a firm committed to supporting your growth as a technologist through curated learning opportunities, tech-specific career paths, and access to experts and leaders around the world.
We are seeking a highly skilled and motivated senior-level Data Engineer to join the Private Market Data Engineering team within Aladdin Data at BlackRock, driving our Private Market Data Engineering vision of making private markets more accessible and transparent for clients. In this role, you will work cross-functionally with Product, Data Research, Engineering, and Program Management.
Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates. The candidate will have extensive experience in developing data pipelines using Python, Java, Apache Airflow orchestration platform, DBT (Data Build Tool), Great Expectations for data validation, Apache Spark, MongoDB, Elasticsearch, Snowflake and PostgreSQL. In this role, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines. You will collaborate with various stakeholders to ensure the data pipelines are efficient, reliable, and meet the needs of the business.
**Key Responsibilities**
+ Design, develop, and maintain data pipelines using Aladdin Data Enterprise Data Platform framework
+ Develop ETL/ELT data pipelines using Python and SQL, and deploy them as containerized applications on a Kubernetes cluster
+ Develop APIs for data distribution on top of the standard data model of the Enterprise Data Platform
+ Design and develop optimized back-end services in Java/Python so that APIs deliver fast data retrieval and efficient processing
+ Develop reusable back-end services for data pipeline processing in Python/Java
+ Develop data transformations using DBT (Data Build Tool) with SQL or Python
+ Ensure data quality and integrity through automated testing and validation using tools like Great Expectations
+ Implement all observability requirements in the data pipeline
+ Optimize data workflows for performance and scalability
+ Monitor and troubleshoot data pipeline issues, ensuring timely resolution
+ Document data engineering processes and best practices whenever required
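The responsibilities above describe pipelines that extract raw data, transform it, and gate loads behind automated validation. A minimal plain-Python sketch of that pattern (illustrative only: these are stand-in functions, not actual Airflow operators or Great Expectations APIs, and every field name is hypothetical):

```python
# Minimal extract -> transform -> validate pipeline, in the spirit of an
# Airflow-orchestrated ELT job with a Great Expectations-style quality gate.

def extract():
    # A real task would pull from a source system; here, two static rows.
    return [
        {"id": 1, "price": 10.5},
        {"id": 2, "price": 7.25},
    ]

def transform(rows):
    # Derive a refined column from the raw feed.
    return [{**r, "price_cents": int(r["price"] * 100)} for r in rows]

def validate(rows):
    # Expectation-style checks: fail fast rather than load bad data.
    assert all(r["price_cents"] >= 0 for r in rows), "negative price"
    assert len({r["id"] for r in rows}) == len(rows), "duplicate ids"
    return rows

def run_pipeline():
    # In Airflow these would be chained tasks; here, plain function calls.
    return validate(transform(extract()))

if __name__ == "__main__":
    print(run_pipeline())
```

In a real deployment each stage would be a separate orchestrated task, with the validation step wired to halt the run on failure rather than let bad data reach downstream consumers.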
**Required Skills and Qualifications**
+ Must have 5 to 8 years of experience in data engineering, with a focus on building data pipelines and Data Services APIs
+ Strong server-side programming skills in Python and/or Java.
+ Experience working with backend microservices and APIs using Java and/or Python
+ Experience with Apache Airflow or any other orchestration framework for data orchestration
+ Proficiency in DBT for data transformation and modeling
+ Experience with data quality validation tools like Great Expectations or any other similar tools
+ Strong SQL skills and experience with relational databases such as SQL Server and PostgreSQL
+ Experience with cloud-based data warehouse platforms such as Snowflake
+ Experience working with NoSQL databases such as Elasticsearch and MongoDB
+ Experience with container orchestration platforms such as Kubernetes in AWS and/or Azure cloud environments
+ Experience with cloud platforms such as AWS and/or Azure
+ Ability to work collaboratively in a team environment
+ Detail-oriented, with a passion for learning new technologies and strong analytical and problem-solving skills
+ Experience with Financial Services application is a plus
+ Effective communication skills, both written and verbal
+ Bachelor's or Master's degree in Computer Science, Engineering, or a related field
**Our benefits**
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
**Our hybrid work model**
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person - aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
**About BlackRock**
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.
This mission would not be possible without our smartest investment - the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.
For additional information on BlackRock, please visit our website | Twitter: @blackrock | LinkedIn: BlackRock. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other attributes protected by law.
Data Engineering Internship

Posted 1 day ago
Job Description
MUMBAI GENERAL OFFICE
About the AMA Data Solutions & Engineering Team:
We take pride in managing the company's most valuable asset in the digital world: data. Our vision is to deliver data as a competitive advantage for the AMA business by building unified data platforms, delivering customized BI tools for managers, and empowering insightful business decisions through AI.
In this role, you'll be constantly learning, staying up to date with industry trends and emerging technologies in data solutions. You'll have the chance to work with a variety of tools and technologies, including big data platforms, machine learning frameworks, and data visualization tools, to build innovative and effective solutions.
So, if you're excited about the possibilities of data, and eager to make a real impact in the world of business, a career in data solutions might be just what you're looking for. Join us and become a part of the future of digital transformation.
About P&G IT:
Digital is at the core of P&G's accelerated growth strategy. With this vision, IT at P&G is deeply embedded into every critical process across business organizations comprising 11+ category units globally, creating impactful value through Transformation, Simplification & Innovation. IT at P&G is sub-divided into teams that engage strongly to revolutionize business processes and deliver exceptional value & growth - Digital GTM, Digital Manufacturing, Marketing Technologist, Ecommerce, Data Sciences & Analytics, Data Solutions & Engineering, Product Supply.
Responsibilities of the role
+ Understand the business requirements and convert into technical design of data pipelines and data models
+ Write code to ingest, transform and harmonize raw data into usable refined models
+ Analyze multiple data sets associated with the in-scope use cases to design and develop optimal data models and transformations
+ Craft integrated systems, implementing ELT/ETL jobs to fulfil business deliverables
+ Perform sophisticated data operations such as data orchestration, transformation, and visualization with large datasets
+ Coordinate with data asset managers, architects, and the development team to ensure the solution is fit for use and meets key architectural requirements
+ Demonstrate standard coding practices to ensure delivery excellence and reusability
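The ingest-and-harmonize duties above boil down to mapping raw feeds with divergent schemas into one refined model. A rough sketch of that step (the source feeds, field names, and mapping are all hypothetical):

```python
# Illustrative harmonization: two raw feeds with different field names are
# mapped into a single unified schema. All feed and field names are made up.

RAW_SAP = [{"mat_no": "A1", "qty": 5}]
RAW_LEGACY = [{"material": "B2", "quantity": 3}]

def harmonize(sap_rows, legacy_rows):
    """Map each source's fields onto one refined model, tagging provenance."""
    unified = [
        {"material_id": r["mat_no"], "quantity": r["qty"], "source": "sap"}
        for r in sap_rows
    ]
    unified += [
        {"material_id": r["material"], "quantity": r["quantity"], "source": "legacy"}
        for r in legacy_rows
    ]
    return unified

if __name__ == "__main__":
    print(harmonize(RAW_SAP, RAW_LEGACY))
```

Keeping a `source` column on each unified row is a common design choice: it preserves lineage so downstream consumers can trace a record back to its originating system.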
About us
P&G was founded over 185 years ago as a simple soap and candle company. Today, we're the world's largest consumer goods company and home to iconic, trusted brands that make life a little bit easier in small but meaningful ways. We've spanned three centuries thanks to three simple ideas: leadership, innovation and citizenship. The insight, innovation and passion of hardworking teams have helped us grow into a global company that is governed responsibly and ethically, that is open and transparent, and that supports good causes and protects the environment. This is a place where you can be proud to work and do something that matters.
Dedication from us:
You'll be at the core of breakthrough innovations, be given exciting assignments, lead initiatives, and take ownership and responsibility, in creative work spaces where new ideas flourish. All the while, you'll receive outstanding training to help you become a leader in your field. It is not just about what you'll do, but how you'll feel: encouraged, valued, purposeful, challenged, heard, and inspired.
What we offer:
Continuous mentorship - you will collaborate with passionate peers and receive both formal training and day-to-day mentoring from your manager. Dynamic and supportive work environment - employees are at the centre; we value every individual and support initiatives, promoting agility and work/life balance.
Just so you know:
We are an equal opportunity employer and value diversity at our company. Our diversity and inclusion mission is: "Everyone valued. Everyone included. Everyone performing at their peak."
Job Qualifications
+ At least 3 years of experience in Data Engineering
+ Hands-on experience in building data models, data pipelines, data ingestion, harmonization along with data governance.
+ Hands-on experience with scripting languages such as Python, R, or Scala
+ Back-end development expertise with SQL databases, SQL data warehouses, or other cloud data-warehousing solutions
+ Hands-on experience with reporting tools such as Power BI or Tableau
+ Knowledge of DevOps and CI/CD tools (e.g. Azure DevOps and GitHub)
+ Knowledge of cloud technologies (Azure Cloud) - at least 2 years inclusive of software engineering experience
+ Knowledge of Agile or Scrum methodologies with a proven track record of successful projects
+ Graduate of an Engineering or IT-related course
Job Schedule
Full time
Job Number
R
Job Segmentation
Internships
Director Data Engineering

Posted today
Job Description
We are seeking a visionary and technically adept Senior Data Engineering Leader to architect, scale, and optimize our data infrastructure. This role will drive the design and implementation of robust, cost-efficient, and observable data pipelines that power analytics, AI/ML, and operational systems across the enterprise. The ideal candidate will be a strategic thinker who can influence senior leadership and lead high-performing engineering teams.
**Primary Responsibilities:**
+ Data Pipeline Architecture: Design and implement scalable, high-performance data pipelines that support batch and real-time processing across diverse data domains
+ Total Cost of Ownership (TCO): Architect solutions with a focus on long-term sustainability, balancing performance, scalability, and cost efficiency
+ Operational Observability: Establish proactive monitoring, alerting, and logging frameworks to ensure system health, data quality, and SLA adherence
+ CI/CD & Automation: Champion automated testing, deployment, and release processes using modern DevOps practices. Ensure robust version control and rollback strategies
+ Blue-Green Deployments: Implement blue-green or canary deployment strategies to minimize downtime and risk during releases
+ Strategic Communication: Translate complex architectural decisions into business value. Confidently present and defend architectural choices to senior technology and business leaders
+ Leadership & Mentorship: Lead and mentor a team of data engineers, fostering a culture of innovation, accountability, and continuous improvement
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
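The blue-green strategy named in the responsibilities keeps two environments live, points traffic at one "color", and makes rollback a single pointer flip. A toy model of the idea (class and version names are illustrative; in practice the cutover happens at the load balancer or orchestrator, not in application code):

```python
# Toy model of a blue-green cutover: two environments stay warm, traffic
# routes to whichever color is live, and rollback is one pointer flip.

class BlueGreenRouter:
    def __init__(self, blue_version, green_version):
        self.versions = {"blue": blue_version, "green": green_version}
        self.live = "blue"  # blue serves traffic; green stays idle but warm

    def serve(self):
        # All traffic goes to the currently live color.
        return self.versions[self.live]

    def cut_over(self):
        # Promote the idle color; the previous one stays warm for rollback.
        self.live = "green" if self.live == "blue" else "blue"
```

Because the old environment is never torn down during a release, rolling back is the same cheap operation as cutting over, which is what minimizes downtime and risk.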
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ Proven experience leading enterprise-scale data engineering initiatives
+ Hands-on experience with CI/CD pipelines, infrastructure-as-code (e.g., Terraform), and containerization (e.g., Docker, Kubernetes)
+ Experience with data governance, privacy, and compliance frameworks
+ Experience with AI/ML data pipelines and MLOps practices
+ Experience in healthcare, finance, or other regulated industries
+ Deep expertise in data architecture, distributed systems, and cloud-native technologies (e.g., AWS, GCP, Azure)
+ Solid command of data modeling, ETL/ELT, orchestration tools (e.g., Airflow, dbt), and streaming platforms (e.g., Kafka)
+ Demonstrated success in implementing observability frameworks (e.g., Prometheus, Grafana, Datadog)
+ Excellent communication and stakeholder management skills
#Exetech
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._