260 Data Engineer jobs in Delhi
Data Engineer
Posted 4 days ago
Job Viewed
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region and, indeed, the world beyond.
At Deloitte, you bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
As a member of the Operations Transformations team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative, and globally aware individuals.
We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.
Your work profile
Job Title: Database Engineer
Experience: 3+ Years
Skills
- Design, develop, and maintain efficient and scalable ETL/ELT data pipelines using Python or PySpark.
- Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into technical solutions.
- Perform data cleansing, transformation, and validation to ensure data quality and integrity.
- Optimize and troubleshoot performance issues in data processing jobs.
- Implement data integration solutions for various sources including databases, APIs, and file systems.
- Participate in code reviews, testing, and deployment processes.
- Maintain proper documentation for data workflows, systems, and best practices.
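The cleansing and validation duties above can be sketched in plain Python. This is an illustrative outline only, not the employer's actual pipeline code; the field names ("id", "amount") are hypothetical, and a real pipeline would use PySpark DataFrames and read from a database or API.

```python
# Illustrative cleanse -> validate steps; all field names are hypothetical.

def cleanse(records):
    """Drop rows missing required fields and normalise values."""
    cleaned = []
    for row in records:
        if not row.get("id") or row.get("amount") is None:
            continue  # data cleansing: skip incomplete rows
        cleaned.append({"id": str(row["id"]).strip(),
                        "amount": float(row["amount"])})
    return cleaned

def validate(records):
    """Data validation: amounts must be non-negative."""
    bad = [r for r in records if r["amount"] < 0]
    if bad:
        raise ValueError(f"{len(bad)} rows failed validation")
    return records

raw = [{"id": " A1 ", "amount": "10.5"},
       {"id": None, "amount": 3},   # dropped: missing id
       {"id": "A2", "amount": 0}]
print(validate(cleanse(raw)))  # two rows survive cleansing
```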
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3 to 5 years of hands-on experience as a Data Developer.
- Proficient in Python and/or PySpark for data processing.
- Experience working with big data platforms such as Hadoop, Spark, or Databricks.
- Strong understanding of relational databases and SQL.
- Familiarity with data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery) is a plus.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is an advantage.
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe is solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and learn some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Engineer
Posted today
Job Viewed
Job Description
Role: Data Engineer
Location: Remote
Shift Timing: 2:00 PM to 11:00 PM
Experience: 2 to 4 years of relevant experience only (this is a junior position with us)
Must-have skills:
GCP: minimum 2 years of working experience
Python and PySpark: 2 years
SQL: 2 years
Excellent communication
Experience working with global stakeholders
Who we are:
Randstad Sourceright’s global talent solutions provide instant access to experienced recruitment and contingent workforce management support by combining technology, analytics, and deep global and local expertise. Our operations consist of client-aligned service delivery teams operating across RPO, MSP, and Blended Workforce Solutions. We have been certified as a “great place to work” for the last three consecutive years and are recognized as a best place to work by Glassdoor.
Group Objective
The mission of the business intelligence team is to create a data-driven culture that empowers leaders to integrate data into daily decisions and strategic planning. We aim to provide visibility, transparency, and guidance regarding the quantity and quality of results, activities, financial KPIs, and leading indicators, making it easier to identify trends and support data-based decision-making.
Position Objective
As a Senior Data Engineer, you will be responsible for designing, architecting, and implementing robust data solutions in a cloud-based environment (GCP). You will partner with other data engineers and technical teams to ensure the availability, reliability, and performance of our data systems.
Position Summary
Programming & Code Writing
- Architect and build complex data pipelines using advanced cloud data technologies
- Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
- Define industry best practices for building data pipelines
- Ensure data security, compliance, and governance standards are met
- Partner with leadership team to define and implement agile and DevOps methodologies
Consulting & Partnership
- Serve as subject matter expert and define data architecture and infrastructure requirements
- Partner with business analysts to plan project execution including appropriate product and technical specifications, direction, resources, and establishing realistic completion times
- Understand data technology trends and identify opportunities to implement new technologies and provide forward-thinking recommendations
- Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes
Troubleshooting & Continuous Improvement
- Design and implement a robust data observability process
- Resolve escalated reporting requests and communicate proactively and in a timely manner
- Troubleshoot and provide technical guidance to resolve issues related to misaligned or inaccurate data, data fields, or new customer requirements
- Maintain new release, migration, and sprint schedules for software upgrades, enhancements, and fixes to aid with product evolution
- Write QA/QC Scripts to conduct first round of testing and partner with BA team for test validation for new developments prior to moving to production
- Use industry knowledge & feedback to aid in the development of technology roadmap and future product(s) vision
- Document standard ways of working via QRGs, intranet pages, and video series
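The "QA/QC scripts" bullet above can be illustrated with a minimal first-round check comparing a source extract against the transformed target before BA-team validation. This is a hedged sketch, not the team's actual tooling; all names are invented.

```python
# Hypothetical first-round QA/QC checks on a pipeline's output.

def qa_checks(source_rows, target_rows, key="id"):
    source_keys = {r[key] for r in source_rows}
    target_keys = {r[key] for r in target_rows}
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "no_null_keys": all(r.get(key) is not None for r in target_rows),
        "keys_match": source_keys == target_keys,
    }

src = [{"id": 1}, {"id": 2}]
tgt = [{"id": 2}, {"id": 1}]  # same keys, different order
print(qa_checks(src, tgt))    # all three checks pass
```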
Senior activities
- Drive day-to-day development activities of development team in close collaboration with on-site and off-shore resources, scrum masters and product owners
- Bootstrap a data engineering team at an early stage in the team’s evolution
- Provide technical leadership in difficult situations, facilitate contentious discussions, and report up when necessary
- Guide, mentor and coach offshore resources
- Provide input in forming a long-term data strategy
Education
- Master’s degree in Computer Science / Information Technology or related field, highly preferred
Experience
- Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
- Extensive experience with data lakes, ETL, and data warehouses
- Advanced experience in building data pipelines
- Passion for building quality BI software
- Project Management and/or process improvement experience highly preferred
Knowledge, Skills, and Abilities
- Polyglot coder with expert-level skills in multiple languages and technologies, including Python, R, Java, SQL, relational databases, ERP, and DOMO or other data visualization tools (e.g., Tableau)
- Advanced and proven experience with Google Cloud Platform (GCP) is preferred, but experience with Microsoft Azure or AWS will be considered
- Any exposure to Kafka, Spark, and Scala will be an added advantage
- Should demonstrate a strong understanding of OOP concepts and methodologies
- Expert level understanding of data engineering
- Intrinsic motivation and problem-solving
- Proactive leadership, project management, time management, and problem-solving skills
- Demonstrated continuous improvement, process documentation, and workflow skills
- Extensive experience with data analysis, modeling, and data pipelining, including data cleaning, standardizing, scaling, tuning, scheduling, and deployment
- Experience composing detailed technical documentation and procedures for data models
- Ability to prioritize and manage multiple projects, tasks, and meeting deadlines while maintaining quality
- Strong drive and commitment for delivering outstanding results
- Strong follow-up and service orientation
Supervisory Responsibility
Provides guidance, leadership, or training to junior employees
Directly responsible for supervising non-exempt, clerical, or office administrative personnel
Directly responsible for supervising exempt, professional, or technical employees
Directly responsible for supervising supervisory/managerial employees
Organizational Structure:
Job title this position reports to: Manager, Data Engineering
Data Engineer
Posted today
Job Viewed
Job Description
Experience: Minimum 5 years.
Work Location: Client office in Delhi. Remote working options are not available.
We are seeking a skilled Data Engineer with at least 5 years of experience to join our data analytics team, focusing on building robust data pipelines and systems to support the creation of dynamic dashboards. The role involves designing, building, and optimizing data architecture, enabling real-time data flow for visualization and analytics. The Data Engineer will be responsible for managing ETL processes, ensuring data quality, and supporting the scalable integration of various data sources into our analytics platform.
The ideal candidate should have extensive experience in working with complex data architectures, managing ETL workflows, and ensuring seamless data integration across platforms. They should also have a deep understanding of cloud technologies and database management.
Key Responsibilities:
•Data Pipeline Development
o Design, build, and maintain scalable ETL (Extract, Transform, Load) processes for collecting, storing, and processing structured and unstructured data from multiple sources.
o Develop workflows to automate data extraction from APIs, databases, and external sources.
o Ensure data pipelines are optimized for performance and handle large data volumes with minimal latency.
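The extraction-automation and large-volume bullets above can be sketched as a batched extract step. The `fetch_page()` function here is a made-up stand-in for an API or database call; batching keeps memory bounded when volumes are large.

```python
# Sketch of batched extraction from a paged source; fetch_page() is
# hypothetical and stands in for an API or database query.

def fetch_page(offset, limit):
    data = list(range(10))  # stand-in for the remote dataset
    return data[offset:offset + limit]

def extract_all(batch_size=4):
    offset = 0
    while True:
        page = fetch_page(offset, batch_size)
        if not page:
            break  # empty page signals the end of the source
        yield from page
        offset += batch_size

print(list(extract_all()))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```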
•Data Integration and Management
o Integrate data from various sources (e.g., databases, APIs, cloud storage) into the centralized data warehouse or data lake to support real-time dashboards.
o Ensure smooth data flow and seamless integration with analytics tools like Power BI and Tableau.
o Manage and maintain data storage solutions, including relational (SQL-based) and NoSQL databases.
•Data Quality and Governance
o Implement data validation checks and quality assurance processes to ensure data accuracy, consistency, and integrity.
o Develop monitoring systems to identify and troubleshoot data inconsistencies, duplications, or errors during ingestion and processing.
o Ensure compliance with data governance policies and standards, including data protection regulations such as the Digital Personal Data Protection (DPDP) Act.
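The validation-check and duplicate-detection bullets above could look something like the following quality gate, which flags duplicate keys and out-of-range values at ingestion time so they can be quarantined rather than loaded. All record and field names are invented for illustration.

```python
from collections import Counter

# Hypothetical ingestion-time data quality gate.

def quality_report(rows, key="record_id"):
    keys = [r[key] for r in rows]
    duplicates = [k for k, n in Counter(keys).items() if n > 1]
    invalid = [r[key] for r in rows if r.get("value", 0) < 0]
    return {"duplicates": duplicates, "invalid": invalid}

rows = [{"record_id": "r1", "value": 5},
        {"record_id": "r1", "value": 7},   # duplicate key
        {"record_id": "r2", "value": -3}]  # fails the range check
print(quality_report(rows))  # {'duplicates': ['r1'], 'invalid': ['r2']}
```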
•Database Management and Optimization
o Design and manage both relational and NoSQL databases, ensuring efficient storage, query performance, and reliability.
o Optimize database performance, ensuring fast query execution times and efficient data retrieval for dashboard visualization.
o Implement data partitioning, indexing, and replication strategies to support large-scale data operations.
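As a small illustration of the indexing strategy mentioned above, the SQLite snippet below creates an index and inspects the query plan for a dashboard-style aggregation. The `events` table and its columns are invented; a production system would apply the same idea in its actual warehouse engine.

```python
import sqlite3

# Illustrative only: indexing a (made-up) events table in SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, "north" if i % 2 else "south", i * 1.5) for i in range(1000)],
)
conn.execute("CREATE INDEX idx_events_region ON events (region)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT SUM(amount) FROM events WHERE region = 'north'"
).fetchall()
print(plan)  # the plan should reference idx_events_region
conn.close()
```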
•Data Security and Compliance
o Ensure that all data processes adhere to security best practices, including encryption, authentication, and access control.
o Implement mechanisms for secure data storage and transmission, especially for sensitive government or public sector data.
o Conduct regular audits of data pipelines and storage systems to ensure compliance with relevant data protection regulations.
• Cloud Infrastructure and Deployment
o Deploy and manage cloud-based data solutions using AWS, Azure, or GCP, including data lakes, data warehouses, and cloud-native ETL tools.
o Set up cloud infrastructure to support high availability, fault tolerance, and scalability of data systems.
o Monitor cloud usage and optimize costs for data storage, processing, and retrieval.
•Performance Monitoring and Troubleshooting
o Continuously monitor data pipeline performance and data ingestion times to identify bottlenecks and areas for improvement.
o Troubleshoot and resolve any data flow issues, ensuring high availability and reliability of data for dashboards and analytics.
o Implement logging and alerting mechanisms to detect and address any operational issues proactively.
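The logging-and-alerting bullet above can be sketched as a stage timer that warns when a pipeline step exceeds a latency budget. The stage names and the one-second budget are made up; a real system would route warnings to an alerting backend rather than standard logging.

```python
import logging
import time

# Hedged sketch: time each pipeline stage, warn past a (made-up) budget.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def timed_stage(name, fn, budget_s=1.0):
    start = time.monotonic()
    result = fn()
    elapsed = time.monotonic() - start
    if elapsed > budget_s:
        log.warning("%s exceeded budget: %.2fs", name, elapsed)
    else:
        log.info("%s completed in %.2fs", name, elapsed)
    return result

print(timed_stage("transform", lambda: sum(range(1000))))  # 499500
```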
Qualifications:
•Education: Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field. A Master’s degree is a plus.
•Experience: At least 5 years of hands-on experience as a Data Engineer, preferably in a data analytics or dashboarding environment.