18,778 Data Engineer jobs in India
Data Engineer/AWS Data Engineer
Posted today
Job Description
- Good working knowledge of AWS Cloud, including services like IAM, S3, EC2, Athena, CloudWatch Logs, CloudTrail, VPC, Subnets and Security Groups, and especially Sagemaker.
- Good working knowledge of Data Science, covering the use-case journey from data preparation through model building and training to model deployment, monitoring, and support.
- Good working knowledge of CI/CD and Infrastructure as Code techniques, including experience on tools like GitHub and especially leveraging Terraform for Infrastructure provisioning in AWS Cloud.
Responsibilities:
- Deploy and manage cloud resources using Terraform following Infrastructure as Code (IaC) best practices.
- Automate provisioning of infrastructure across environments (dev, staging, prod).
- Build and integrate AWS Lambda functions to support serverless workflows.
- Build APIs leveraging API Gateway and considering elements like Route53, Hosted Zones and Load Balancers.
- Develop and manage workflows using AWS Step Functions to coordinate microservices and async tasks. Monitor and troubleshoot execution flows and step-level failures.
- Design and manage Amazon S3 buckets for object storage, including bucket policies, lifecycle rules, and event-based triggers.
- Ensure secure access to S3 using IAM, bucket policies, and VPC endpoints.
- Create and configure VPCs, subnets (public/private), route tables, internet gateways, NAT gateways, and network ACLs.
- Set up and manage VPC Interface Endpoints (AWS PrivateLink) to securely connect to AWS services from within VPC.
- Implement network isolation and security best practices using Security Groups and Network ACLs.
- Implement IAM roles and policies for accessing AWS resources.
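The secure-S3-access responsibilities above typically come down to a bucket policy that denies requests arriving from outside a specific VPC endpoint. A minimal sketch in Python, building the policy JSON; the bucket name and endpoint ID are hypothetical, and in practice this document would be attached via Terraform rather than hand-built:

```python
import json

def vpce_only_bucket_policy(bucket: str, vpce_id: str) -> dict:
    """Build an S3 bucket policy that denies all access from outside
    the given VPC endpoint. Names here are illustrative only."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAccessOutsideVpce",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
            # aws:SourceVpce matches the endpoint the request came through
            "Condition": {"StringNotEquals": {"aws:SourceVpce": vpce_id}},
        }],
    }

policy = vpce_only_bucket_policy("example-data-bucket", "vpce-0123456789abcdef0")
print(json.dumps(policy, indent=2))
```

The explicit Deny means even principals with broad IAM permissions cannot reach the bucket from the public internet, which is the usual belt-and-braces pattern when pairing IAM with bucket policies and VPC endpoints.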
OR
AWS Data Engineers who have experience building transformations on EMR, EC2, Lambda, or Glue, whichever is necessary.
Experience and knowledge in Spark, Scala, Control-M, GitHub, SQL, and AWS services such as S3, Athena, EMR, EC2, Lambda, and Glue.
This requirement is for the Adobe AJO project, where we need to build a data feed to share with Adobe for email marketing.
OR
Resources who have good experience developing AWS components, mainly Glue (Scala & Python), Lambda (Python), Step Functions, S3, and CloudWatch, as these are the components currently in use.
We also use Athena for data queries, so someone with a little Hadoop knowledge and RDBMS basics can suffice as well.
Terraform is what we use for deployment, so familiarity with it would be good.
Git is compulsory, as we use it extensively.
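The Step Functions work mentioned in this posting (coordinating Lambda steps and handling step-level failures) reduces to an Amazon States Language document. A minimal sketch built as a Python dict; the state names and Lambda ARNs are hypothetical:

```python
import json

# Minimal Amazon States Language (ASL) definition: two Lambda tasks with a
# retry policy and a catch-all failure state. ARNs/names are illustrative.
state_machine = {
    "Comment": "Sketch of a two-step data workflow",
    "StartAt": "ExtractData",
    "States": {
        "ExtractData": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:extract",
            # Retry transient task failures before giving up
            "Retry": [{"ErrorEquals": ["States.TaskFailed"],
                       "IntervalSeconds": 5, "MaxAttempts": 2}],
            # Route any remaining error to a dedicated failure state
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "HandleFailure"}],
            "Next": "TransformData",
        },
        "TransformData": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:transform",
            "End": True,
        },
        "HandleFailure": {"Type": "Fail", "Cause": "Step-level failure"},
    },
}

print(json.dumps(state_machine, indent=2))
```

Per-state `Retry` and `Catch` blocks like these are what make step-level failures visible and recoverable, rather than failing the whole execution silently.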
Senior Data Engineer / Data Engineer
Posted today
Job Description
Desired Experience: 3-8 years
Salary: Best-in-industry
Location: Gurgaon (5 days onsite)
Overview:
You will act as a key member of the Data consulting team, working directly with partners and senior client stakeholders to design and implement big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solving attitude.
What is in it for you:
Opportunity to work with a world class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques
Fast track career growth in a highly entrepreneurial work environment
Best-in-industry remuneration package
Essential Technical Skills:
Technical expertise with emerging Big Data technologies such as Python, Spark, Hadoop, Clojure, Git, SQL and Databricks, and visualization tools such as Tableau and Power BI
Experience with cloud, container and micro service infrastructures
Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams
Hands-on experience with data modelling, query techniques and complexity analysis
Desirable Skills:
Experience/Knowledge of working in an agile environment and experience with agile methodologies such as Scrum
Experience of working with development teams and product owners to understand their requirements
Certifications on any of the above areas will be preferred.
Your duties will include:
Develop data solutions within Big Data Azure and/or other cloud environments
Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams
Build and design data architectures using Azure Data Factory, Databricks, Data Lake, and Synapse
Liaising with the CTO, Product Owners and other Operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions
Perform data mapping activities to describe source data, target data, and the high-level or detailed transformations that need to occur
Assist the Data Analyst team in developing KPIs and reporting in tools such as Power BI and Tableau
Data Integration, Transformation, Modelling
Maintaining all relevant documentation and knowledge bases
Research and suggest new database products, services and protocols
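The data mapping duty listed above is often captured as a source-to-target spec plus per-field transforms. A minimal sketch; the field names and transforms are hypothetical:

```python
# Source-to-target mapping: each target field names its source field and a
# transform to apply. All field names here are illustrative only.
MAPPING = {
    "customer_id": ("CustID", int),        # cast string IDs to integers
    "full_name":   ("Name", str.strip),    # trim stray whitespace
    "country":     ("CountryCode", str.upper),  # normalize country codes
}

def apply_mapping(source_row: dict) -> dict:
    """Produce a target row by applying each field's transform."""
    return {target: transform(source_row[src])
            for target, (src, transform) in MAPPING.items()}

row = apply_mapping({"CustID": "42", "Name": "  Asha Rao ", "CountryCode": "in"})
print(row)  # {'customer_id': 42, 'full_name': 'Asha Rao', 'country': 'IN'}
```

Keeping the mapping as data rather than code makes it reviewable by analysts and easy to extend when new target fields are added.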
Essential Personal Traits:
You should be able to work independently and communicate effectively with remote teams.
Timely communication/escalation of issues/dependencies to higher management.
Curiosity to learn and apply emerging technologies to solve business problems
** Interested candidates, please send their resume to - and **
Data Engineer
Posted 1 day ago
Job Description
**About the Job**
At Sanofi, we're committed to providing the next-gen healthcare that patients and customers need. It's about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our Data Products and Platforms Team as a **Data Engineer** and you can help make it happen.
**What you will be doing:**
Sanofi has recently embarked into a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.
The Data Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities.
As a **Data Engineer**, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI Products. As part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data. You will have the ability to work on multiple data products serving multiple areas of the business.
**Our vision for digital, data analytics and AI**
Join us on our journey in enabling Sanofi's Digital Transformation through becoming an AI first organization. This means:
+ **AI Factory - Versatile Teams Operating in Cross Functional Pods:** Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
+ **Leading Edge Tech Stack:** Experience building products that will be deployed globally on a leading-edge tech stack.
+ **World Class Mentorship and Training:** Working with renowned leaders and academics in machine learning to further develop your skillsets.
We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people's lives. We're also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?
**Main Responsibilities:**
_Data Product Engineering:_
+ Provide input into the engineering feasibility of developing specific R&D Data/AI Products
+ Provide input to Data/AI Product Owner and Scrum Master to support with planning, capacity, and resource estimates
+ Design, build, and maintain scalable and reusable ETL / ELT pipelines to ingest, transform, clean, and load data from sources into central platforms / repositories
+ Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing and normalizing data
+ Collaborate with Data/AI Product Owner and Scrum Master to share progress on engineering activities and inform of any delays, issues, bugs, or risks with proposed remediation plans
+ Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories
+ Optimize data workflows to drive high performance and reliability of implemented data products
+ Oversee and support junior engineers with Data/AI Product testing requirements and execution
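The ETL/ELT responsibilities above (ingest, transform, clean, load) can be reduced to composable steps. A minimal, purely illustrative sketch; the record format and field names are hypothetical, not any real Sanofi source:

```python
from typing import Iterable

def extract(raw_lines: Iterable[str]) -> Iterable[dict]:
    """Parse semicolon-delimited records (a stand-in for a real source)."""
    for line in raw_lines:
        sample_id, assay, value = line.split(";")
        yield {"sample_id": sample_id, "assay": assay, "value": value}

def transform(records: Iterable[dict]) -> Iterable[dict]:
    """Clean and normalize: numeric values, upper-cased assay codes."""
    for r in records:
        yield {**r, "assay": r["assay"].upper(), "value": float(r["value"])}

def load(records: Iterable[dict], sink: list) -> None:
    """Append to an in-memory sink (a stand-in for a warehouse table)."""
    sink.extend(records)

table: list = []
load(transform(extract(["s1;elisa;0.82", "s2;qpcr;1.4"])), table)
print(table)
```

Because each stage is a generator, records stream through one at a time; the same shape scales up naturally to chunked or distributed processing in a real pipeline.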
_Innovation & Team Collaboration:_
+ Stay current on industry trends, emerging technologies, and best practices in data product engineering
+ Contribute to a team culture of innovation, collaboration, and continuous learning within the product team
**About You:**
**Key Functional Requirements & Qualifications:**
+ Bachelor's degree in software engineering or related field, or equivalent work experience
+ 3+ years of experience in data product engineering, software engineering, or other related field
+ Understanding of R&D business and data environment preferred
+ Excellent communication and collaboration skills
+ Working knowledge and comfort working with Agile methodologies
**Key Technical Requirements & Qualifications:**
+ Proficiency with data analytics and statistical software (incl. SQL, Python, AWS, Snowflake, DBT, Airflow, Informatica)
+ Deep understanding and proven track record of developing data pipelines and workflows
**Pursue _progress_, discover _extraordinary_**
Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people.
At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, ability or gender identity.
Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
Sanofi is dedicated to supporting people through their health challenges. We are a global biopharmaceutical company focused on human health. We prevent illness with vaccines and provide innovative treatments to fight pain and ease suffering. We stand by the few who suffer from rare diseases and the millions with long-term chronic conditions.
With more than 100,000 people in 100 countries, Sanofi is transforming scientific innovation into healthcare solutions around the globe. Discover more about us at sanofi.com or via our movie We are Sanofi. As an organization, we change the practice of medicine, reinvent the way we work, and enable people to be their best versions in career and life. We are constantly moving and growing, making sure our people grow with us. Our working environment helps us build a dynamic and inclusive workplace operating on trust and respect and allows employees to live the life they want to live.
Data Engineer
Posted 2 days ago
Job Description
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
**Key Responsibilities:**
- Develop data pipeline to integrate data movement tasks from multiple API data sources.
- Ensure data integrity, consistency, and normalization.
- Gather requirements from stakeholders to align with business needs.
- Collaborate with business analysts, data architects, and engineers to design solutions.
- Support ETL (Extract, Transform, Load) processes for data migration and integration.
- Ensure adherence to industry standards, security policies, and data governance frameworks.
- Keep up with industry trends in data modeling, big data, and AI/ML.
- Recommend improvements to data architecture for scalability and efficiency.
- Work with compliance teams to align data models with regulations (GDPR, HIPAA, etc.).
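Integrating data from multiple API sources, as the responsibilities above describe, usually means normalizing differently shaped payloads into one schema. A small sketch; the source names and payload shapes are hypothetical (no real endpoints are called):

```python
def normalize(source: str, payload: dict) -> dict:
    """Map two hypothetical API payload shapes onto one common schema."""
    if source == "crm":            # e.g. {"id": 1, "fullName": "..."}
        return {"id": payload["id"], "name": payload["fullName"]}
    if source == "billing":        # e.g. {"customer_id": 1, "name": "..."}
        return {"id": payload["customer_id"], "name": payload["name"]}
    raise ValueError(f"unknown source: {source}")

rows = [
    normalize("crm", {"id": 1, "fullName": "Meena Iyer"}),
    normalize("billing", {"customer_id": 2, "name": "Rohit Shah"}),
]
print(rows)
```

Funneling every source through one normalization function is also where integrity and consistency checks (required fields, types, deduplication keys) naturally live.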
**Basic Qualifications:**
- 8+ years of experience in professional services or a related field
- 3+ years of experience working with databases such as Oracle, SQL Server, and the Azure cloud data platform
- 3+ years of experience working with SQL tools
- 2+ years of experience working with Azure Data Factory and Python
- 2+ years of experience working with API data integration tasks
**Preferred Qualifications:**
- Proven work experience in Spark/PySpark development
- Knowledge of database structure systems
- Excellent analytical and problem-solving skills
- Understanding of agile methodologies
- Undergraduate or Graduate degree preferred
- Ability to travel at least 25%.
**About NTT DATA**
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form.
**_NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status._**
Data Engineer
Posted 3 days ago
Job Description
HYDERABAD OFFICE INDIA
Job Description
Key Responsibilities:
+ Productionize pipelines for large, complex data sets which meet technical and business requirements.
+ Partner with data asset managers, architects, and development leads to ensure a sound technical solution.
+ Follow, and contribute to, coding standards and best practices to ensure pipelines and components are efficient, robust, cost effective and reusable.
+ Identify, design, and implement internal process improvements.
+ Optimize Spark jobs for performance and cost. Tune configurations, minimize shuffles, and leverage advanced techniques like broadcast joins, caching, and partitioning.
+ Ensure data quality, reliability, and performance by implementing best practices for data validation, monitoring, and optimization.
+ Monitor and troubleshoot data pipelines and workflows to ensure seamless operation.
+ Stay updated on the latest Databricks features, tools, and industry trends to continuously improve data engineering practices.
+ Strong understanding of distributed computing concepts and big data processing.
+ Excellent problem-solving skills and the ability to work collaboratively in a team environment.
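The broadcast-join tuning mentioned in the responsibilities above avoids a shuffle by shipping a small dimension table to every executor and joining locally. The mechanics can be sketched in plain Python (a real Spark job would use `broadcast()` from `pyspark.sql.functions`); the tables and columns here are illustrative only:

```python
# A broadcast hash join in miniature: the small "dimension" table becomes a
# hash map available on every worker, so the large "fact" side is joined
# locally with no shuffle. Data and column names are illustrative.
dim_products = [(1, "soap"), (2, "shampoo")]     # small side: (id, name)
fact_sales = [(1, 100), (2, 250), (1, 75)]       # large side: (id, qty)

# "Broadcast": materialize the small side as a lookup map once.
product_by_id = dict(dim_products)

# Local hash join, as would happen on each partition of the fact table.
joined = [(pid, qty, product_by_id[pid]) for pid, qty in fact_sales]
print(joined)  # [(1, 100, 'soap'), (2, 250, 'shampoo'), (1, 75, 'soap')]
```

The cost trade-off is the whole point: the small table is copied to every executor (memory cost) in exchange for eliminating the network shuffle of the large table, which is usually the dominant cost in a distributed join.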
Job Qualifications
+ Strong skills with Python, SQL, Delta Lake, Databricks, Spark/Pyspark, Github and Azure.
+ You will be expected to attain and/or maintain technical certifications related to the role (Databricks, Azure)
+ Ability to use and implement CI/CD and associated tools such as Github Actions, SonarQube, Snyk
+ Familiarity or experience in one or more modern application development framework methods and tools (e.g. Disciplined Agile, Scrum).
+ Familiarity or experience with a range of data engineering best practices for development including query optimization, version control, code reviews, and documentation
+ The ability to build relationships and work in diverse, multidisciplinary teams
+ Excellent communication skills with business intuition and ability to understand business systems, versatility, and willingness to learn new technologies on the job
About us
We produce globally recognized brands and we grow the best business leaders in the industry. With a portfolio of trusted brands as diverse as ours, it is paramount our leaders are able to lead with courage the vast array of brands, categories and functions. We serve consumers around the world with one of the strongest portfolios of trusted, quality, leadership brands, including Always®, Ariel®, Gillette®, Head & Shoulders®, Herbal Essences®, Oral-B®, Pampers®, Pantene®, Tampax® and more. Our community includes operations in approximately 70 countries worldwide. Visit to know more. We are an equal opportunity employer and value diversity at our company. We do not discriminate against individuals on the basis of race, color, gender, age, national origin, religion, sexual orientation, gender identity or expression, marital status, citizenship, disability, HIV/AIDS status, or any other legally protected factor.
"At P&G, the hiring journey is personalized every step of the way, thereby ensuring equal opportunities for all, with a strong foundation of Ethics & Corporate Responsibility guiding everything we do. All the available job opportunities are posted either on our website - pgcareers.com, or on our official social media pages, for the convenience of prospective candidates, and do not require them to pay any kind of fees towards their application."
Job Schedule
Full time
Job Number
R000135017
Job Segmentation
Experienced Professionals
Data Engineer
Posted 4 days ago
Job Description
NCR Atleos, headquartered in Atlanta, is a leader in expanding financial access. Our dedicated 20,000 employees optimize the branch, improve operational efficiency and maximize self-service availability for financial institutions and retailers across the globe.
Data is at the heart of our global financial network. In fact, the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth.
Data Engineers at NCR Atleos experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next-generation technologies and solutions. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers.
We are looking for Data Engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Design is an iterative process, whether for UX, services, or infrastructure. Our goal is to modernize and improve application capabilities.
**An ideal candidate would have:**
+ BA/BS in Computer Science or equivalent practical experience
+ Overall 3+ years of experience on Data Analytics or Data Warehousing projects.
+ At least 2 years of cloud experience on AWS/Azure/GCP (Azure preferred).
+ Hands on experience in Microsoft Fabric or Databricks
+ Programming in Python and PySpark, with experience using pandas, ML libraries, etc.
+ Orchestration frameworks like ADF, AirFlow
+ Experience in various data modelling techniques, such as ER, Hierarchical, Relational, or NoSQL modelling.
+ Excellent design, development, and tuning experience with SQL (OLTP and OLAP) databases.
+ Good understanding of data security and compliance, and related architecture
+ Experience with devops tools like Git, Maven, Jenkins, GitHub Actions, Azure DevOps
+ Experience with Agile development concepts and related tools.
+ Ability to tune and troubleshoot performance issues across the codebase and database queries.
+ Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative data solutions.
+ Excellent written and strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
+ Passion for learning with a proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.
**Good to have Skills:**
+ Leveraging machine learning and AI techniques to operationalize data pipelines and build data products.
+ Provide data services using APIs.
Offers of employment are conditional upon passage of screening criteria applicable to the job.
**EEO Statement**
NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law.
**Statement to Third Party Agencies**
To ALL recruitment agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.
Data Engineer

Posted 5 days ago
Job Description
Come join us to create what's next. Let's define tomorrow, together.
**Description**
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Find your future at United! We're reinventing what our industry looks like, and what an airline can be - from the planes we fly to the people who fly them. When you join us, you're joining a global team of 100,000+ connected by a shared passion with a wide spectrum of experience and skills to lead the way forward.
Achieving our ambitions starts with supporting yours. Evolve your career and find your next opportunity. Get the care you need with industry-leading health plans and best-in-class programs to support your emotional, physical, and financial wellness. Expand your horizons with travel across the world's biggest route network. Connect outside your team through employee-led Business Resource Groups.
Create what's next with us. Let's define tomorrow together.
**Job overview and responsibilities**
The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus.
+ The Data Engineer will partner with various teams to define and execute data acquisition, transformation, and processing, making data actionable for operational and analytics initiatives that create sustainable revenue and share growth
+ Design, develop, and implement streaming and near-real time data pipelines that feed systems that are the operational backbone of our business
+ Execute unit tests and validate expected results to ensure the accuracy and integrity of data and applications through analysis, coding, clear documentation, and problem resolution
+ This role will also drive the adoption of data processing and analysis within the Hadoop environment and help cross-train other members of the team
+ Leverage strategic and analytical skills to understand and solve customer and business centric questions
+ Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies and partners
+ Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business
+ Develop and implement innovative solutions leading to automation
+ Use Agile methodologies to manage projects
+ Mentor and train junior engineers
**This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded.**
**Qualifications**
**What's needed to succeed (Minimum Qualifications):**
+ BS/BA in Computer Science or a related STEM field
+ 2+ years of IT experience in software development
+ 2+ years of development experience using Java, Python, or Scala
+ 2+ years of experience with Big Data technologies such as PySpark, Hadoop, Hive, HBase, Kafka, and NiFi
+ 2+ years of experience with relational database systems such as MS SQL Server, Oracle, and Teradata
+ Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights
+ Individuals who have a natural curiosity and desire to solve problems are encouraged to apply
+ Must be legally authorized to work in India for any employer without sponsorship
+ Must be fluent in English and Hindi (written and spoken)
+ Successful completion of interview required to meet job qualification
+ Reliable, punctual attendance is an essential function of the position
**What will help you propel from the pack (Preferred Qualifications):**
+ Master's degree in Computer Science or a related STEM field
+ Experience with cloud-based systems such as AWS, Azure, or Google Cloud
+ Certified Developer / Architect on AWS
+ Strong experience with continuous integration & delivery using Agile methodologies
+ Data engineering experience with transportation/airline industry
+ Strong problem-solving skills
+ Strong knowledge in Big Data
Data Engineer

Posted 5 days ago
Job Description
Come join us to create what's next. Let's define tomorrow, together.
**Description**
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
**Our Values:** At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation.
With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.
**Job overview and responsibilities**
Our Digital Operations Center is constantly working to enhance the experience of our customers across our Digital Channels, based on data-driven analytics and timely and accurate reports. We are seeking a Senior Developer with deep expertise in building and maintaining cloud-native data platforms and pipelines using AWS and modern development practices. The ideal candidate is a hands-on engineer with experience in serverless compute, streaming data architectures, and DevOps automation, who thrives in a collaborative, fast-paced environment. This role will be instrumental in designing high-performance, scalable systems leveraging AWS services and the Well-Architected Framework.
- Design, build, and maintain scalable and efficient ETL/ELT pipelines using tools such as AWS Fargate, S3, Kinesis, and Flink, or custom scripts.
- Integrate data from various sources including APIs, cloud services, databases, and flat files into a centralized data warehouse (e.g., Postgres, BigQuery, Redshift).
- Design and optimize SQL database schemas and queries, primarily on Amazon Aurora (MySQL/PostgreSQL), ensuring high performance and data integrity across workloads.
- Monitor, troubleshoot, and resolve data pipeline and infrastructure issues.
- Build CI/CD pipelines using GitHub Actions, driving automation in deployment and testing.
- Apply best practices for monitoring, alerting, cost optimization, and security - in line with AWS's Well-Architected Framework.
- Collaborate with cross-functional teams including product, analytics, and DevOps to design end-to-end solutions.
United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law.
**Required**
· Bachelor's degree in Computer Science, Computer Engineering, Electrical Engineering, Management Information Systems or related field.
· 2-5 years of experience in data engineering or a similar role.
· Strong SQL skills and experience working with relational databases (e.g., PostgreSQL, MySQL).
· Experience with cloud platforms like AWS, GCP, or Azure (e.g., Fargate, S3, Lambda, Aurora, Redis)
· Familiarity with data modeling and building data warehouses/lakes.
· Experience designing real-time or near-real-time data streaming pipelines.
· Proficiency in programming languages like Python or Scala.
· CI/CD knowledge, particularly using GitHub Actions or similar tools.
· Solid understanding of performance tuning, cost-effective cloud resource management, and data architecture principles.
· Understanding of data governance, quality, and security best practices.
· Must be legally authorized to work in India for any employer without sponsorship
**Qualifications**
**Preferred**
· MBA preferred
· AWS certifications (e.g., Developer Associate, Solutions Architect).
· Knowledge of modern data stack tools like dbt, Fivetran, Snowflake, or Looker.
· Exposure to containerization and orchestration tools (e.g., Docker, Kubernetes).
· Experience with data lake architectures or hybrid transactional/analytical processing systems.
· Familiarity with Agile development practices and cloud-native observability tools.
· Experience working with Quantum Metric and/or Dynatrace a plus