856 Data Engineer jobs in New Delhi
Data Engineer

Posted 8 days ago
Job Description
NCR Atleos, headquartered in Atlanta, is a leader in expanding financial access. Our dedicated 20,000 employees optimize the branch, improve operational efficiency and maximize self-service availability for financial institutions and retailers across the globe.
Data is at the heart of our global financial network. In fact, the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth.
Data Engineers at NCR Atleos experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next-generation technologies and solutions. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers.
We are looking for Data Engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Design is an iterative process, whether for UX, services or infrastructure. Our goal is to modernize and improve application capabilities.
**An ideal candidate would have:**
+ BA/BS in Computer Science or equivalent practical experience
+ 3+ years of overall experience on data analytics or data warehousing projects.
+ At least 2 years of cloud experience on AWS, Azure, or GCP (Azure preferred).
+ Hands-on experience with Microsoft Fabric or Databricks
+ Programming in Python and PySpark, with experience using pandas, ML libraries, etc.
+ Experience with orchestration frameworks such as ADF and Airflow
+ Experience in various data modelling techniques, such as ER, Hierarchical, Relational, or NoSQL modelling.
+ Excellent design, development, and tuning experience with SQL (OLTP and OLAP) databases.
+ Good understanding of data security and compliance, and related architecture
+ Experience with devops tools like Git, Maven, Jenkins, GitHub Actions, Azure DevOps
+ Experience with Agile development concepts and related tools.
+ Ability to tune and troubleshoot performance issues across the codebase and database queries.
+ Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative data solutions.
+ Excellent written and strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
+ Passion for learning with a proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.
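The orchestration frameworks named above (ADF, Airflow) share one core idea: a pipeline is a directed acyclic graph of tasks, and each task runs only after its dependencies complete. A minimal, stdlib-only Python sketch of that idea (the task names and graph are hypothetical, purely for illustration; real frameworks add scheduling, retries, and monitoring on top):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how Airflow/ADF model dependencies between activities.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform"},
}

def run_pipeline(graph):
    """Run tasks in dependency order; return the execution order."""
    order = list(TopologicalSorter(graph).static_order())
    for task in order:
        print(f"running {task}")  # a real framework would invoke the task here
    return order

order = run_pipeline(dag)
```

Both extract tasks are free to run first in either order; the sorter only guarantees that `transform` follows them and `load_warehouse` comes last.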
**Good to have Skills:**
+ Experience leveraging machine learning and AI techniques to operationalize data pipelines and build data products.
+ Experience providing data services via APIs.
Offers of employment are conditional upon passage of screening criteria applicable to the job.
**EEO Statement**
NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law.
**Statement to Third Party Agencies**
To ALL recruitment agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.
Data Engineer
Posted 4 days ago
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realise your potential amongst cutting edge leaders, and organisations shaping the future of the region, and indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
As a member of the Operations Transformations team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative, globally aware individuals.
We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.
Your work profile
Job Title: Data Engineer
Experience: 3+ Years
Key Responsibilities
- Design, develop, and maintain efficient and scalable ETL/ELT data pipelines using Python or PySpark.
- Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into technical solutions.
- Perform data cleansing, transformation, and validation to ensure data quality and integrity.
- Optimize and troubleshoot performance issues in data processing jobs.
- Implement data integration solutions for various sources including databases, APIs, and file systems.
- Participate in code reviews, testing, and deployment processes.
- Maintain proper documentation for data workflows, systems, and best practices.
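The responsibilities above describe a standard extract, transform, validate, load flow. As a rough stdlib-only sketch of that shape (the field names and validation rule are hypothetical; in practice the transform step would be a PySpark or SQL job and the load target a warehouse, not JSON strings):

```python
import json

# Hypothetical raw records, standing in for rows read from a source system.
raw = [
    {"id": "1", "amount": " 100.5 ", "country": "in"},
    {"id": "2", "amount": "abc", "country": "IN"},  # bad amount -> rejected
]

def transform(record):
    """Cleanse and normalize one record; return None if it fails validation."""
    try:
        amount = float(record["amount"].strip())
    except ValueError:
        return None  # data-quality rejection
    return {"id": int(record["id"]), "amount": amount,
            "country": record["country"].upper()}

def run_etl(records):
    """Minimal ETL: transform each record, keep valid rows, 'load' as JSON lines."""
    clean = [t for r in records if (t := transform(r)) is not None]
    return [json.dumps(row, sort_keys=True) for row in clean]

loaded = run_etl(raw)
```

The rejection path is the part interviewers tend to probe: invalid rows are dropped here for brevity, but a production pipeline would route them to a quarantine table for inspection rather than discard them silently.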
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3 to 5 years of hands-on experience as a Data Developer
- Proficient in Python and/or PySpark for data processing.
- Experience working with big data platforms such as Hadoop, Spark, or Databricks.
- Strong understanding of relational databases and SQL.
- Familiarity with data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery) is a plus.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is an advantage.
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Engineer
Posted 6 days ago
Job Description
Role: Data Engineer
Total years of experience: 9+ Years
Location: PAN India (Any HCL office)
Relevant experience for engagement: 5 Years
Note: We are conducting a virtual discussion this coming Saturday, 13th September 2025, from 9:30 am to 6:30 pm. If interested, please feel free to share your profile.
Job Description:
Relevant Skills:
Maintain architecture principles, guidelines and standards
Data Warehousing
Programming Language: Python/Java
Big Data
Data Analytics
GCP Services
Experience in designing & implementing solution in mentioned areas:
Strong Google Cloud Platform data components – BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
Professional Summary
- Technical Data Engineer who is strong in data warehousing, big data, and data analytics
- Experience developing software in one or more languages such as Java and Python
- Strong command of Google Cloud Platform data components – BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
- Demonstrate extensive skills and success in the implementation of technology projects within a professional environment, with a particular focus on data engineering
- Hands-on experience with data projects in the cloud, specifically GCP
- Sound knowledge of data-related concepts
- Demonstrated excellent communication, presentation, and problem-solving skills.
Good to have: experience and knowledge of MPP systems, database systems, ETL and ELT systems, and data-flow compute
- Should be able to advise on best-of-breed options for client solutions
Skills needed:
- Coaches junior data engineering personnel, bringing them up to speed and helping them gain a better understanding of the overall data ecosystem
- Prior experience developing, building and deploying on GCP
- Experience working on solution decks, IP builds, and client meetings for requirements gathering
Certifications:
Google Professional Cloud Architect (good to have)
Data Engineer
Posted 6 days ago
Job Description
Job Title: Data Engineer
Location: Noida
Experience: 5+ years
Job Description: We are seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with a focus on PySpark, Python, and SQL. Experience with Azure Databricks is a plus.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and systems.
- Work closely with data scientists and analysts to ensure data quality and availability.
- Implement data integration and transformation processes using PySpark and Python.
- Optimize and maintain SQL databases and queries.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Monitor and troubleshoot data pipeline issues to ensure data integrity and performance.
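On the "optimize and maintain SQL databases and queries" point, a common first step is adding an index so the engine stops scanning the whole table. A small self-contained sqlite3 illustration (the table, column, and index names are made up; production engines differ, but the inspect-the-plan-then-index workflow is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO events (user_id, payload) VALUES (?, ?)",
                 [(i % 100, "x") for i in range(1000)])

def plan(sql):
    """Return SQLite's query plan as a single string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column is the plan detail text

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)   # with the index: a search on idx_events_user

count = conn.execute(query).fetchone()[0]
```

Reading the plan before and after the change, rather than guessing, is the habit that scales to Spark query plans and warehouse `EXPLAIN` output as well.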
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering.
- Proficiency in PySpark, Python, and SQL.
- Experience with Azure Databricks is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
Preferred Qualifications:
- Experience with cloud platforms such as Azure, AWS, or Google Cloud.
- Knowledge of data warehousing concepts and technologies.
- Familiarity with ETL tools and processes.
How to Apply: Apart from Easy Apply on LinkedIn, please also click on this link.