684 Data Engineer jobs in Noida
Data Engineer
Posted 3 days ago
Job Description
Job Title: Data Engineer
Location: Noida
Experience: 5+ years
Job Description: We are seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with a focus on PySpark, Python, and SQL. Experience with Azure Databricks is a plus.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and systems.
- Work closely with data scientists and analysts to ensure data quality and availability.
- Implement data integration and transformation processes using PySpark and Python.
- Optimize and maintain SQL databases and queries.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Monitor and troubleshoot data pipeline issues to ensure data integrity and performance.
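The pipeline responsibilities above follow the standard extract-transform-load pattern. As an illustration only (this is not from the posting), here is a minimal stdlib sketch using `sqlite3` with hypothetical table and column names; a production pipeline of the kind described would do the same steps in PySpark against a warehouse:

```python
import sqlite3

# Hypothetical example: land raw order rows, clean them, and load a
# reporting table -- the same extract/transform/load shape a PySpark
# pipeline would implement at scale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "north"), (2, None, "south"), (2, None, "south"), (3, 80.0, "north")],
)

# Transform: drop rows with missing amounts and deduplicate.
conn.execute(
    """
    CREATE TABLE clean_orders AS
    SELECT DISTINCT order_id, amount, region
    FROM raw_orders
    WHERE amount IS NOT NULL
    """
)

# A downstream analyst query against the cleaned table.
total_by_region = dict(
    conn.execute("SELECT region, SUM(amount) FROM clean_orders GROUP BY region")
)
print(total_by_region)  # {'north': 200.0}
```

The same filter-and-deduplicate step in PySpark would be a `dropna` plus `dropDuplicates` on a DataFrame; the SQL shape carries over directly.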
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering.
- Proficiency in PySpark, Python, and SQL.
- Experience with Azure Databricks is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
Preferred Qualifications:
- Experience with cloud platforms such as Azure, AWS, or Google Cloud.
- Knowledge of data warehousing concepts and technologies.
- Familiarity with ETL tools and processes.
How to Apply: In addition to Easy Apply on LinkedIn, please also click on this link.
Data Engineer
Posted 4 days ago
Job Description
Role: Data Engineer
Total years of experience: 9+ Years
Location: PAN India (Any HCL office)
Relevant experience for engagement: 5 Years
Note: We are conducting a virtual discussion this coming Saturday, 13th September 2025, from 9:30 am to 6:30 pm. If interested, please feel free to share your profile.
Job Description:
Relevant Skills:
Maintain architecture principles, guidelines and standards
Data Warehousing
Programming Language: Python/Java
Big Data
Data Analytics
GCP Services
Experience in designing and implementing solutions in the areas mentioned:
Strong Google Cloud Platform data components – BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
Professional Summary
- Technical Data Engineer with strong skills in Data Warehousing, Big Data, and Data Analytics
- Experience with developing software code in one or more languages such as Java and Python.
- Strong Google Cloud Platform data components – BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
- Demonstrate extensive skills and success in the implementation of technology projects within a professional environment, with a particular focus on data engineering
- Hands on experience with Data Projects in cloud specifically GCP
- Sound knowledge of Data related concepts
- Demonstrated excellent communication, presentation, and problem-solving skills.
Good to have: experience and knowledge of MPP systems, database systems, ETL and ELT systems, and Dataflow compute.
- Should be able to advise on best-of-breed solutions for clients.
Skills Needed:
- The Data Engineer coaches junior data engineering personnel, bringing them up to speed and helping them gain a better understanding of the overall data ecosystem.
- Prior experience developing, building, and deploying on GCP.
- Working on solution decks, IP builds, and client meetings for requirement gathering.
Certifications:
Google Professional Cloud Architect (good to have)
Data Engineer
Posted 4 days ago
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realise your potential amongst cutting edge leaders, and organisations shaping the future of the region, and indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
As a member of the Operations Transformations team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative, and globally aware individuals.
We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.
Your work profile
Job Title: Database Engineer
Experience: 3+ Years
Skills
- Design, develop, and maintain efficient and scalable ETL/ELT data pipelines using Python or PySpark.
- Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into technical solutions.
- Perform data cleansing, transformation, and validation to ensure data quality and integrity.
- Optimize and troubleshoot performance issues in data processing jobs.
- Implement data integration solutions for various sources including databases, APIs, and file systems.
- Participate in code reviews, testing, and deployment processes.
- Maintain proper documentation for data workflows, systems, and best practices.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3 to 5 years of hands-on experience as a Data Developer.
- Proficient in Python and/or PySpark for data processing.
- Experience working with big data platforms such as Hadoop, Spark, or Databricks.
- Strong understanding of relational databases and SQL.
- Familiarity with data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery) is a plus.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is an advantage.
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Engineer
Posted 16 days ago
Job Description
About the Job
Job title: Data Engineer (Immediate Joiner with Healthcare or Pharma client experience)
Location: Noida, India Onsite
Duration: Fulltime
Note: We need an immediate joiner with strong experience in Healthcare/Pharma.
Job description:
- Work closely with data scientists, analysts, and business stakeholders to understand data needs
- Ensure data quality, governance, and compliance with regulatory standards (e.g., GxP)
- Proficiency with SQL, Python, Spark, or related tools
- Experience with cloud platforms (AWS, Azure, or GCP)
- Pharma industry experience strongly preferred
- Excellent communication and problem-solving skills
Data Engineer
Posted today
Job Description
Job Responsibilities
- Design, build & maintain scalable data pipelines for ingestion, processing & storage.
- Collaborate with data scientists, analysts, and product teams to deliver high-quality data solutions.
- Optimize data systems for performance, reliability, scalability, and cost-efficiency.
- Implement data quality checks ensuring accuracy, completeness, and consistency.
- Work with structured & unstructured data from diverse sources.
- Develop & maintain data models, metadata, and documentation.
- Automate & monitor workflows using tools like Apache Airflow (or similar).
- Ensure data governance & security best practices are followed.
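The data-quality responsibility above usually reduces to a few assertable rules per dataset. As a minimal sketch (hypothetical field names, not from the posting), here are completeness, accuracy, and consistency checks a pipeline stage could run before loading:

```python
# Hypothetical record-level quality checks: completeness (required fields
# present), accuracy (values in a valid range), and consistency (no
# duplicate keys). A real pipeline would emit metrics and alerts rather
# than return a plain list of failures.
def check_batch(records):
    failures = []
    seen_ids = set()
    for rec in records:
        rid = rec.get("id")
        if rid is None or rec.get("amount") is None:
            failures.append((rid, "incomplete"))
            continue
        if rec["amount"] < 0:
            failures.append((rid, "out_of_range"))
        if rid in seen_ids:
            failures.append((rid, "duplicate_id"))
        seen_ids.add(rid)
    return failures

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # incomplete
    {"id": 3, "amount": -5.0},   # out of range
    {"id": 1, "amount": 10.0},   # duplicate key
]
print(check_batch(batch))
# [(2, 'incomplete'), (3, 'out_of_range'), (1, 'duplicate_id')]
```

In an orchestrated pipeline, a failing check would typically fail or quarantine the batch rather than let bad rows flow downstream.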
Required Skills & Qualifications
- Bachelor’s/Master’s degree in Computer Science, Engineering, or related field.
- 3–5 years of experience in data engineering, ETL development, or backend data systems.
- Proficiency in SQL & Python/Scala.
- Experience with big data tools (Spark, Hadoop, Kafka, etc.).
- Hands-on with cloud data platforms (AWS Redshift, GCP BigQuery, Azure Data Lake).
- Familiar with orchestration tools (Airflow, Luigi, etc.).
- Experience with data warehousing & data modeling.
- Strong analytical & problem-solving skills; ability to work independently & in teams.
Preferred Qualifications
- Experience with containerization (Docker, Kubernetes).
- Knowledge of CI/CD processes & Git version control.
- Understanding of data privacy regulations (GDPR, CCPA, etc.).
- Exposure to machine learning pipelines / MLOps is a plus.
Data Engineer
Posted today
Job Description
Position: Software Engineer (Data)
Location: Remote, India
At Incept Labs, we believe the future of education and research lies in humans and AI working together side by side. AI brings the ability to process knowledge at scale, while people contribute imagination, values, and lived experience. When combined, they create a partnership where each strengthens the other, opening new ways to discover, adapt, and grow. We are a small team of scientists, engineers, and builders who are passionate about building domain-specific, next-generation AI solutions to enhance education and research.
About This Role
We're looking for a Software Engineer with deep expertise in large-scale data processing for LLM development. Data engineering is critical to successful model training and evaluation. You'll work directly with researchers to accelerate experiments, develop new datasets, improve infrastructure efficiency, and enable key insights across our data assets. You'll join a high-impact, compact team responsible for both the architecture and scaling of Incept's data and model development infrastructure, and work with highly complex, multi-modal data.
Responsibilities
- Design, build, and operate scalable, fault-tolerant data infrastructure to support distributed computing and data orchestration for LLM research.
- Develop and maintain high-throughput systems for data ingestion, processing, and transformation to support LLM model development.
- Develop synthetic datasets using state-of-the-art solutions.
- Collaborate with research teams to deliver critical data assets for model development and evaluation.
- Implement and maintain monitoring and alerting to support platform reliability and performance.
- Build systems for traceability, reproducibility, and robust quality control to ensure adherence to industry compliance standards.
Required Qualifications
- 5+ years of experience in data infrastructure, ideally supporting high-scale applications or research platforms.
- Fluent in distributed computing frameworks.
- Deeply familiar with cloud infrastructure, data storage architectures, and batch and streaming pipelines.
- Experience with specialized hardware (GPUs, TPUs) and GPU clusters.
- Strong knowledge of databases, storage systems, and how architecture choices impact performance at scale.
- Familiar with microservices architectures, containerization and orchestration, and both synchronous and asynchronous processing.
- Extensive experience with performance optimization and memory management in high-volume data systems.
- Proactive about documentation, automation, testing, and empowering your teammates with good tooling.
This role is fully remote, India-based. Compensation and benefits will vary based on background, skills, and experience levels.
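Fault tolerance in ingestion systems like those described above often starts with bounded retries and backoff around each source call. A minimal sketch, with a hypothetical `fetch` callable standing in for a real source client (none of these names come from the posting):

```python
import time

# Hypothetical sketch: wrap a flaky source call in bounded retries with
# exponential backoff -- a basic building block of fault-tolerant ingestion.
def ingest_with_retry(fetch, max_attempts=4, base_delay=0.01):
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # surface to monitoring/alerting after the final attempt
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated source that fails twice before succeeding.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source error")
    return ["record-1", "record-2"]

result = ingest_with_retry(flaky_fetch)
print(result)  # ['record-1', 'record-2']
```

Production systems layer dead-letter queues, idempotent writes, and alerting on top of this, so that exhausted retries become an operator signal rather than silent data loss.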