19,496 Data Engineer jobs in India

Data Engineer

Varanasi, Uttar Pradesh Stier Solutions Inc


Job Description

Job Title: Junior Data Engineer

Company: Stier Solutions

Duration: Full-Time

Location: India (Remote)

Notice Period: Immediate or within 15 days

Experience: 0–2 years


About the Role:

We are hiring a Junior Data Engineer to join our growing data team. This role is ideal for recent graduates or early-career professionals who want to gain hands-on experience in building and maintaining data pipelines, working with cloud technologies, and supporting real-time data analytics projects.


Key Responsibilities:

  • Assist in building and maintaining reliable ETL/data pipelines
  • Work with structured and unstructured data from various sources
  • Write SQL queries and Python scripts for data transformation and processing (see the brief sketch after this list)
  • Support cloud-based data infrastructure (e.g., GCP, AWS, Azure)
  • Help integrate data from different platforms and systems
  • Monitor pipeline performance and troubleshoot data issues
  • Collaborate with analysts, engineers, and business teams to understand data needs
  • Maintain technical documentation of data workflows and processes
  • Ensure data accuracy, quality, and consistency
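
Purely as an illustration of the SQL/Python transformation work listed above (the file paths and column names below are hypothetical, not part of this posting), a minimal batch pipeline step might look like:

```python
# Minimal, hypothetical ETL step: read a raw CSV, clean it, write Parquet.
import pandas as pd

def run_daily_orders_etl(raw_path: str = "raw/orders.csv",
                         out_path: str = "curated/orders.parquet") -> None:
    # Extract: structured data from a flat file
    orders = pd.read_csv(raw_path, parse_dates=["order_date"])

    # Transform: basic cleaning plus a derived column
    orders = orders.dropna(subset=["order_id", "customer_id"])
    orders["order_total"] = orders["quantity"] * orders["unit_price"]

    # Load: a columnar format (Parquet) for downstream analytics
    orders.to_parquet(out_path, index=False)

if __name__ == "__main__":
    run_daily_orders_etl()
```

In practice the same step would be scheduled and monitored by whatever orchestrator the team uses.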


Required Skills:

  • Bachelor’s degree in Computer Science, IT, Engineering, or related field
  • Basic knowledge of SQL and Python
  • Familiarity with data formats like CSV, JSON, Parquet
  • Understanding of databases and data structures
  • Exposure to cloud platforms such as Google Cloud, AWS, or Azure
  • Strong problem-solving and analytical skills
  • Good communication and team collaboration skills

Job No Longer Available

This position is no longer listed on WhatJobs. The employer may be reviewing applications, filled the role, or has removed the listing.

However, we have similar jobs available for you below.


Senior Data Engineer / Data Engineer

Gurugram, Haryana Invokhr

Posted today


Job Description

Desired Experience: 3-8 years

Salary: Best-in-industry

Location: Gurgaon (5 days onsite)


Overview:

You will act as a key member of the Data consulting team, working directly with clients' partners and senior stakeholders to design and implement big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solving attitude.

What is in it for you:

Opportunity to work with a world class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques

Fast track career growth in a highly entrepreneurial work environment

Best-in-industry remuneration package

Essential Technical Skills:

Technical expertise with emerging Big Data technologies, such as: Python, Spark, Hadoop, Clojure, Git, SQL and Databricks; and visualization tools: Tableau and PowerBI

Experience with cloud, container and micro service infrastructures

Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Hands-on experience with data modelling, query techniques and complexity analysis

Desirable Skills:

Experience/Knowledge of working in an agile environment and experience with agile methodologies such as Scrum

Experience of working with development teams and product owners to understand their requirements

Certifications on any of the above areas will be preferred.

Your duties will include:

Develop data solutions within a Big Data Azure and/or other cloud environments

Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Build and design Data Architectures using Azure Data Factory, Databricks, Data Lake, Synapse (see the brief sketch after this list)

Liaising with CTO, Product Owners and other Operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions

Perform data mapping activities to describe source data, target data and the high-level or detailed transformations that need to occur;

Assist Data Analyst team in developing KPIs and reporting in tools viz. Power BI, Tableau

Data Integration, Transformation, Modelling

Maintaining all relevant documentation and knowledge bases

Research and suggest new database products, services and protocols
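
To make the Azure Data Factory / Databricks duty above concrete, a minimal PySpark cell in a Databricks notebook might look like the sketch below; the storage paths and column name are invented, and in a real setup Azure Data Factory (or another orchestrator) would trigger the notebook.

```python
# Hypothetical Databricks notebook cell: read raw files from the data lake,
# apply a simple transformation, and write a curated Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"          # illustrative
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/"  # illustrative

sales = (spark.read.option("header", "true").csv(raw_path)
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount").isNotNull()))

sales.write.format("delta").mode("overwrite").save(curated_path)
```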

Essential Personal Traits:

You should be able to work independently and communicate effectively with remote teams.

Timely communication/escalation of issues/dependencies to higher management.

Curiosity to learn and apply emerging technologies to solve business problems


** Interested candidates, please send their resume to - and **


Senior Data Engineer / Data Engineer

Kochi, Kerala Invokhr

Posted today


Job Description

Looking for immediate joiners or candidates with a 15-day notice period; this is a work-from-home opportunity.

Position: Senior Data Engineer / Data Engineer

Desired Experience: 3-8 years

Salary: Best-in-industry

You will act as a key member of the Data consulting team, working directly with clients' partners and senior stakeholders to design and implement big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solving attitude.

What is in it for you:

Opportunity to work with a world class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques

Fast track career growth in a highly entrepreneurial work environment

Best-in-industry remuneration package

Essential Technical Skills:

Technical expertise with emerging Big Data technologies, such as: Python, Spark, Hadoop, Clojure, Git, SQL and Databricks; and visualization tools: Tableau and PowerBI

Experience with cloud, container and micro service infrastructures

Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Hands-on experience with data modelling, query techniques and complexity analysis

Desirable Skills:

Experience/Knowledge of working in an agile environment and experience with agile methodologies such as Scrum

Experience of working with development teams and product owners to understand their requirements

Certifications on any of the above areas will be preferred.

Your duties will include:

Develop data solutions within a Big Data Azure and/or other cloud environments

Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Build and design Data Architectures using Azure Data Factory, Databricks, Data Lake, Synapse

Liaising with CTO, Product Owners and other Operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions

Perform data mapping activities to describe source data, target data and the high-level or detailed transformations that need to occur

Assist Data Analyst team in developing KPIs and reporting in tools viz. Power BI, Tableau

Data Integration, Transformation, Modelling

Maintaining all relevant documentation and knowledge bases

Research and suggest new database products, services and protocols

Essential Personal Traits:

You should be able to work independently and communicate effectively with remote teams.

Timely communication/escalation of issues/dependencies to higher management.

Curiosity to learn and apply emerging technologies to solve business problems


Data Engineer- Lead Data Engineer

Bengaluru, Karnataka Paytm

Posted today


Job Description

Role Overview



We are seeking an experienced Lead Data Engineer to join our Data Engineering team at Paytm, India's leading digital payments and financial services platform. This is a critical role responsible for designing, building, and maintaining large-scale, real-time data streams that process billions of transactions and user interactions daily. Data accuracy and stream reliability are essential to our operations, as data quality issues can result in financial losses and impact customer trust.

As a Lead Data Engineer at Paytm, you will be responsible for building robust data systems that support India's largest digital payments ecosystem. You'll architect and implement reliable, real-time data streaming solutions where precision and data correctness are fundamental requirements. Your work will directly support millions of users across merchant payments, peer-to-peer transfers, bill payments, and financial services, where data accuracy is crucial for maintaining customer confidence and operational excellence.


This role requires expertise in designing fault-tolerant, scalable data architectures that maintain high uptime standards while processing peak transaction loads during festivals and high-traffic events. We place the highest priority on data quality and system reliability, as our customers depend on accurate, timely information for their financial decisions. You'll collaborate with cross-functional teams including data scientists, product managers, and risk engineers to deliver data solutions that enable real-time fraud detection, personalized recommendations, credit scoring, and regulatory compliance reporting.


Key technical challenges include maintaining data consistency across distributed systems with demanding performance requirements, implementing comprehensive data quality frameworks with real-time validation, optimizing query performance on large datasets, and ensuring complete data lineage and governance across multiple business domains. At Paytm, reliable data streams are fundamental to our operations and our commitment to protecting customers' financial security and maintaining India's digital payments infrastructure.


Key Responsibilities


Data Stream Architecture & Development

Design and implement reliable, scalable data streams handling high-volume transaction data with strong data integrity controls

Build real-time processing systems using modern data engineering frameworks (Java/Python stack) with excellent performance characteristics

Develop robust data ingestion systems from multiple sources with built-in redundancy and monitoring capabilities

Implement comprehensive data quality frameworks, ensuring the 4 C's (Completeness, Consistency, Conformity, and Correctness) and data reliability that supports sound business decisions

Design automated data validation, profiling, and quality monitoring systems with proactive alerting capabilities

Infrastructure & Platform Management

Manage and optimize distributed data processing platforms with high availability requirements to ensure consistent service delivery

Design data lake and data warehouse architectures with appropriate partitioning and indexing strategies for optimal query performance

Implement CI/CD processes for data engineering workflows with comprehensive testing and reliable deployment procedures

Ensure high availability and disaster recovery for critical data systems to maintain business continuity
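
As a hedged sketch of the streaming ingestion and basic completeness checks described above (the broker address, topic name, schema, and paths are invented; this is not Paytm's actual stack), a Spark Structured Streaming job reading a Kafka topic could look like:

```python
# Illustrative only: consume a hypothetical "transactions" Kafka topic,
# parse JSON payloads, drop records failing a completeness check,
# and write the valid stream to a Delta table.
from pyspark.sql import SparkSession, functions as F, types as T

spark = SparkSession.builder.appName("txn-stream-sketch").getOrCreate()

schema = T.StructType([                       # hypothetical payload schema
    T.StructField("txn_id", T.StringType()),
    T.StructField("amount", T.DoubleType()),
    T.StructField("event_time", T.TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
       .option("subscribe", "transactions")                # placeholder topic
       .load())

parsed = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("t"))
             .select("t.*"))

# Completeness: required fields must be present before a record moves on.
valid = parsed.filter(F.col("txn_id").isNotNull() & F.col("amount").isNotNull())

query = (valid.writeStream.format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/txn")  # placeholder path
         .start("/tmp/tables/transactions"))                    # placeholder path
query.awaitTermination()
```

The remaining quality dimensions (consistency, conformity, correctness) would be added as further predicates or as a separate validation stage before the write.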


Performance & Optimization

Monitor and optimize streaming performance with focus on latency reduction and operational efficiency

Implement efficient data storage strategies including compression, partitioning, and lifecycle management with cost considerations

Troubleshoot and resolve complex data streaming issues in production environments with effective response protocols

Conduct proactive capacity planning and performance tuning to support business growth and data volume increases


Collaboration & Leadership

Work closely with data scientists, analysts, and product teams to understand important data requirements and service level expectations

Mentor junior data engineers with emphasis on data quality best practices and a customer-focused approach

Participate in architectural reviews and help establish data engineering standards that prioritize reliability and accuracy

Document technical designs, processes, and operational procedures with focus on maintainability and knowledge sharing


Required Qualifications


Experience & Education

Bachelor's or Master's degree in Computer Science, Engineering, or related technical field

7+ years (Senior) of hands-on data engineering experience

Proven experience with large-scale data processing systems (preferably in fintech/payments domain)

Experience building and maintaining production data streams processing TB/PB scale data with strong performance and reliability standards


Technical Skills & Requirements

Programming Languages: Expert-level proficiency in both Python and Java; experience with Scala preferred


Big Data Technologies: Apache Spark (PySpark, Spark SQL, Spark with Java), Apache Kafka, Apache Airflow (a minimal Airflow DAG sketch appears after this list)

Cloud Platforms: AWS (EMR, Glue, Redshift, S3, Lambda) or equivalent Azure/GCP services

Databases: Strong SQL skills, experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Redis)

Data Quality Management: Deep understanding of the 4 C's framework - Completeness, Consistency, Conformity, and Correctness

Data Governance: Experience with data lineage tracking, metadata management, and data cataloging

Data Formats & Protocols: Parquet, Avro, JSON, REST APIs, GraphQL

Containerization & DevOps: Docker, Kubernetes, Git, GitLab/GitHub with CI/CD pipeline experience

Monitoring & Observability: Experience with Prometheus, Grafana, or similar monitoring tools

Data Modeling: Dimensional modeling, data vault, or similar methodologies

Streaming Technologies: Apache Flink, Kinesis, or Pulsar experience is a plus

Infrastructure as Code: Terraform, CloudFormation (preferred)

Java-specific: Spring Boot, Maven/Gradle, JUnit for building robust data services
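
The list above names Apache Airflow for orchestration. Purely as an illustration (the DAG id, task names, and schedule are invented), a minimal DAG wiring an ingest step ahead of a validation step could look like:

```python
# Minimal, hypothetical Airflow DAG: ingest, then validate, once per day.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull data from source")                  # placeholder logic

def validate():
    print("run completeness/consistency checks")    # placeholder logic

with DAG(
    dag_id="daily_ingest_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",        # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task
```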


Preferred Qualifications


Domain Expertise

Previous experience in fintech, payments, or banking industry with solid understanding of regulatory compliance and financial data requirements

Understanding of financial data standards, PCI DSS compliance, and data privacy regulations where compliance is essential for business operations

Experience with real-time fraud detection or risk management systems where data accuracy is crucial for customer protection


Advanced Technical Skills (Preferred)


Experience building automated data quality frameworks covering all 4 C's dimensions

Knowledge of machine learning stream orchestration (MLflow, Kubeflow)

Familiarity with data mesh or federated data architecture patterns

Experience with change data capture (CDC) tools and techniques


Leadership & Soft Skills

Strong problem-solving abilities with experience debugging complex distributed systems in production environments

Excellent communication skills with ability to explain technical concepts to diverse stakeholders while highlighting business value

Experience mentoring team members and leading technical initiatives with focus on building a quality-oriented culture

Proven track record of delivering projects successfully in dynamic, fast-paced financial technology environments


What We Offer


Opportunity to work with cutting-edge technology at scale

Competitive salary and equity compensation

Comprehensive health and wellness benefits

Professional development opportunities and conference attendance

Flexible working arrangements

Chance to impact millions of users across India's digital payments ecosystem


Application Process


Interested candidates should submit:

Updated resume highlighting relevant data engineering experience with emphasis on real-time systems and data quality

Portfolio or GitHub profile showcasing data engineering projects, particularly those involving high-throughput streaming systems

Cover letter explaining interest in fintech/payments domain and understanding of data criticality in financial services

References from previous technical managers or senior colleagues who can attest to your data quality standards








Data Engineer

Bangalore, Karnataka Thermo Fisher Scientific

Posted today


Job Description

**Work Schedule**
Standard (Mon-Fri)
**Environmental Conditions**
Office
**Job Description**
**Job Summary:**
We are seeking a skilled and detail-oriented **Data Engineer** to join our data team. The ideal candidate will be responsible for building and maintaining scalable data pipelines, extracting data from diverse sources including APIs, databases, and flat files, and ensuring high data quality and reliability. You will work closely with analysts, data scientists, and engineers to power data-driven decision-making across the organization.
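
As a hedged example of the API-to-warehouse extraction described in this summary (the endpoint, fields, and staging table below are hypothetical), a small ingestion step might look like:

```python
# Illustrative API extraction: pull JSON records from a hypothetical REST
# endpoint, normalize them with pandas, and append them to a staging table.
import pandas as pd
import requests
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/measurements"   # placeholder endpoint
DB_URL = "sqlite:///staging.db"                        # placeholder database

def ingest_measurements() -> int:
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    records = response.json()            # assumes a JSON array of objects
    df = pd.json_normalize(records)

    engine = create_engine(DB_URL)
    df.to_sql("stg_measurements", engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    print(f"loaded {ingest_measurements()} rows")
```

A production connector would add pagination, retries, and schema checks on top of this skeleton.
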
**Key Responsibilities:**
+ Design, develop, and maintain scalable and robust data pipelines for both batch and real-time processing.
+ Extract, transform, and load (ETL) data from a wide variety of structured and unstructured data sources including:
+ RESTful and SOAP APIs
+ Databases (SQL, NoSQL)
+ Cloud storage (e.g., S3, Google Cloud Storage)
+ File formats (e.g., JSON, CSV, XML, Parquet)
+ Web scraping tools where appropriate
+ Build reusable data connectors and integration solutions to automate data ingestion.
+ Collaborate with internal stakeholders to understand data requirements and ensure accessibility and usability.
+ Monitor and optimize pipeline performance and troubleshoot data flow issues.
+ Ensure data governance, security, and quality standards are applied across all pipelines.
+ Experience with data manipulation and analysis libraries such as Pandas, Polars, or Dask for handling large datasets efficiently.
+ Design and create data flow and architecture diagrams to visually represent data pipelines, system integrations, and data models, ensuring clarity and alignment among technical and non-technical stakeholders.
**Requirements:**
**Technical Skills:**
+ Proficiency in SQL and at least one programming language (Python, Java, Scala).
+ Experience with data pipeline and workflow tools (e.g., Apache Airflow, AWS Data Pipeline).
+ Knowledge of relational and non-relational databases (e.g., Oracle, SQL Server, MongoDB).
+ Strong data modeling and data warehousing skills.
**Education & Experience:**
+ Bachelor's degree in Computer Science, Engineering, Information Systems, or related field (Master's a plus).
+ 5+ years of experience in a data engineering or similar role.
**Soft Skills:**
+ Strong analytical and problem-solving abilities.
+ Excellent communication and collaboration skills.
+ Detail-oriented and proactive mindset.
Thermo Fisher Scientific is an EEO/Affirmative Action Employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability or any other legally protected status.

Data Engineer

Bangalore, Karnataka NTT America, Inc.

Posted 1 day ago


Job Description

**Req ID:** 338011
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Data Engineer to join our team in Bangalore, Karnātaka (IN-KA), India (IN).
**Key Skills & Competencies**
+ Advanced SQL development (joins, CTEs, window functions, optimization); see the short example after this list
+ Experience with ETL/ELT processes and tools
+ Data modeling (dimensional and normalized)
+ Familiarity with version control (e.g., Git) and CI/CD practices
+ Understanding of DWH architectures and data integration patterns
+ Ability to work with large datasets and performance-tune queries
+ Platform-agnostic mindset with readiness to adapt to Azure or AWS
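
To make the SQL line above concrete, here is a small self-contained sketch (using Python's built-in sqlite3 module and an invented orders table) of the kind of CTE-plus-window-function query involved:

```python
# Self-contained illustration of a CTE combined with a window function.
# The orders table and its rows are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 120.0), (1, '2024-02-10', 80.0), (2, '2024-01-20', 50.0);
""")

query = """
WITH monthly AS (                        -- CTE: aggregate to month grain
    SELECT customer_id,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS month_total
    FROM orders
    GROUP BY customer_id, month
)
SELECT customer_id,
       month,
       month_total,
       SUM(month_total) OVER (           -- window function: running total
           PARTITION BY customer_id ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY customer_id, month;
"""

for row in conn.execute(query):
    print(row)
```
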
**About NTT DATA**
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form.
NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.


Data Engineer

CAI

Posted 1 day ago


Job Description

Data Engineer
**Req number:**
R6008
**Employment type:**
Full time
**Worksite flexibility:**
Remote
**Who we are**
CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right-whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.
**Job Summary**
As a Data Engineer, you will build data products using Databricks and related technologies.
**Job Description**
We are seeking a motivated **Data Engineer** who has experience building data products using Databricks and related technologies. This is a **Full-time** and **Remote** position.
**What You'll Do**
+ Analyze and understand existing data warehouse implementations to support migration and consolidation efforts
+ Reverse-engineer legacy stored procedures (PL/SQL, SQL) and translate business logic into scalable Spark SQL code within Databricks notebooks (see the brief sketch after this list)
+ Design and develop data lake solutions on AWS using S3 and Delta Lake architecture, leveraging Databricks for processing and transformation
+ Build and maintain robust data pipelines using ETL tools with ingestion into S3 and processing in Databricks
+ Collaborate with data architects to implement ingestion and transformation frameworks aligned with enterprise standards
+ Evaluate and optimize data models (Star, Snowflake, Flattened) for performance and scalability in the new platform
+ Document ETL processes, data flows, and transformation logic to ensure transparency and maintainability
+ Perform foundational data administration tasks including job scheduling, error troubleshooting, performance tuning, and backup coordination
+ Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and backlog grooming
+ Triage, debug and fix technical issues related to Data Lakes
+ Maintain and manage Code repositories like Git
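
As an illustration of turning legacy stored-procedure logic into Spark SQL inside a Databricks notebook (as described above), the aggregation below is hypothetical and not taken from any client code; a PL/SQL procedure that summarized daily sales becomes a set-based statement over Delta tables:

```python
# Hypothetical translation: a legacy daily-sales summary procedure
# re-expressed as a single Spark SQL statement writing a Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # provided on Databricks

spark.sql("""
    CREATE OR REPLACE TABLE curated.daily_sales_summary    -- illustrative names
    USING DELTA AS
    SELECT order_date,
           store_id,
           SUM(amount)     AS total_amount,
           COUNT(order_id) AS order_count
    FROM raw.sales_orders
    GROUP BY order_date, store_id
""")
```

Row-by-row cursor loops in the original procedure typically collapse into set-based SQL like this, which is where most of the scalability comes from.
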
**What You'll Need**
Required:
+ 5+ years of experience working with **Databricks** , including Spark SQL and Delta Lake implementations
+ 3+ years of experience in designing and implementing data lake architectures on Databricks
+ Strong SQL and PL/SQL skills with the ability to interpret and refactor legacy stored procedures
+ Hands-on experience with data modeling and warehouse design principles
+ Proficiency in at least one programming language (Python, Scala, Java)
+ Bachelor's degree in Computer Science, Information Technology, Data Engineering, or related field
+ Experience working in Agile environments and contributing to iterative development cycles, with a good understanding of Agile methodology in general
+ Exposure to enterprise data governance and metadata management practices
Preferred:
+ Databricks cloud certification is a big plus
**Physical Demands**
+ This role involves mostly sedentary work, with occasional movement around the office to attend meetings, etc.
+ Ability to perform repetitive tasks on a computer, using a mouse, keyboard, and monitor
**Reasonable accommodation statement**
If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to or (888) 824 - 8111.

Data Engineer

Pune, Maharashtra Red Hat

Posted 2 days ago


Job Description

The Data Engineer exercises judgment when following general instructions and works with minimal instruction to support the integration and automation of data solutions. This role focuses on data massaging, reconciliation, and analysis, resolving routine to semi-routine issues. Responsibilities include creating optimized SQL queries, managing data pipelines, and collaborating with cross-functional teams to ensure data accuracy and availability.
**What will you do:**
+ Write optimized and scalable complex SQL queries
+ Automate data processing tasks using Python, focusing on cleaning and merging datasets.
+ Manage data pipelines, including scheduling, monitoring, and debugging workflows.
+ Collaborate with data engineers and IT teams to maintain data accessibility for stakeholders.
+ Assist in developing automated tests to ensure the accuracy and integrity of data.
+ Participate in version control and CI/CD processes for deploying and testing pipeline changes across environments.
+ Work cross-functionally with analysts, engineers, and operations.
+ Data stewardship including: data governance, data compliance, data transformation, data cleanliness, data validation, data audit/maintenance.
+ Write complex, highly optimized SQL queries across large datasets; perform SQL query tuning and provide tuning recommendations
+ Apply hands-on data analytics experience with Python libraries such as NumPy and Pandas
+ Use Python to massage and clean data and to automate data extracts and loads
+ Convert raw data to processed data by merging datasets and identifying outliers, errors, trends, missing values and distributions (see the brief sketch after this list)
+ Create, debug, schedule and monitor jobs using Airflow; resolve performance-tuning issues and queries
+ Foster collaboration among data engineers, IT and other business groups to ensure data is accessible to the FP&A team
+ Schedule a regular hot backup process and participate in backup activities
+ Apply strong analytical and problem-solving skills, with the ability to represent complex algorithms in software
+ Develop automated unit tests, end-to-end tests, and integration tests to assist in quality assurance (QA) procedures
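
A brief sketch of the kind of merging, profiling, and outlier flagging referred to above (the column names and the IQR rule are illustrative assumptions, not project specifics):

```python
# Illustrative pandas step: merge two extracts, report missing values,
# and flag simple IQR-based outliers in a numeric column.
import pandas as pd

def profile_and_clean(payments: pd.DataFrame, accounts: pd.DataFrame) -> pd.DataFrame:
    merged = payments.merge(accounts, on="account_id", how="left")

    # Missing-value profile per column
    print(merged.isna().sum())

    # Simple IQR rule to flag outliers in the amount column
    q1, q3 = merged["amount"].quantile([0.25, 0.75])
    iqr = q3 - q1
    merged["is_outlier"] = ~merged["amount"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
    return merged
```
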
**What will you bring:**
+ Bachelor's or Master's degree in Computer Science, IT, Engineering or equivalent
+ 5+ years of experience as a Data Engineer, BI Engineer, Systems Analyst in a company with large, complex data sources
+ Working knowledge of DBT, Snowflake, Fivetran, Git and SQL or Python programming skills for data querying, cleaning, and presentation
+ Experience building highly available, reliable and secure API solutions, including REST API design and implementation
+ Working knowledge of relational databases (PostgreSQL, MSSQL, etc.), experience with AWS services including S3, Redshift, EMR and RDS
+ Ability to manage multiple projects at the same time in a fast-paced team environment, across time zones, and with different cultures, while maintaining ability to work as part of a team
+ Good troubleshooting skills and the ability to think through issues and problems in a logical manner; planning knowledge would be an added advantage
+ Detail-oriented and enthusiastic, with a focused and diligent approach to delivering results
**About Red Hat**
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.
**Inclusion at Red Hat**
Red Hat's culture is built on the open source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. When this is realized, it empowers people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation. Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated. We hope you will join our celebration, and we welcome and encourage applicants from all the beautiful dimensions that compose our global village.
**Equal Opportunity Policy (EEO)**
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law.
**Red Hat does not seek or accept unsolicited resumes or CVs from recruitment agencies. We are not responsible for, and will not pay, any fees, commissions, or any other payment related to unsolicited resumes or CVs except as required in a written contract between Red Hat and the recruitment agency or party requesting payment of a fee.**
**Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email . General inquiries, such as those regarding the status of a job application, will not receive a reply.**

Data Engineer

Gurugram, Haryana United Airlines

Posted 5 days ago


Job Description

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network.
Come join us to create what's next. Let's define tomorrow, together.
**Description**
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Find your future at United! We're reinventing what our industry looks like, and what an airline can be - from the planes we fly to the people who fly them. When you join us, you're joining a global team of 100,000+ connected by a shared passion with a wide spectrum of experience and skills to lead the way forward.
Achieving our ambitions starts with supporting yours. Evolve your career and find your next opportunity. Get the care you need with industry-leading health plans and best-in-class programs to support your emotional, physical, and financial wellness. Expand your horizons with travel across the world's biggest route network. Connect outside your team through employee-led Business Resource Groups.
Create what's next with us. Let's define tomorrow together.
**Job overview and responsibilities**
The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus.
+ The Data Engineer will partner with various teams to define and execute data acquisition, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth
+ Design, develop, and implement streaming and near-real time data pipelines that feed systems that are the operational backbone of our business
+ Execute unit tests and validate expected results to ensure the accuracy and integrity of data and applications through analysis, coding, clear documentation and problem resolution
+ This role will also drive the adoption of data processing and analysis within the Hadoop environment and help cross train other members of the team
+ Leverage strategic and analytical skills to understand and solve customer and business centric questions
+ Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies and partners
+ Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business
+ Develop and implement innovative solutions leading to automation
+ Use of Agile methodologies to manage projects
+ Mentor and train junior engineers
**This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded.**
**Qualifications**
**What's needed to succeed (Minimum Qualifications):**
+ BS/BA in computer science or related STEM field
+ 2+ years of IT experience in software development
+ 2+ years of development experience using Java, Python, Scala
+ 2+ years of experience with Big Data technologies like PySpark, Hadoop, Hive, HBASE, Kafka, Nifi
+ 2+ years of experience with relational database systems like MS SQL Server, Oracle, Teradata
+ Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights
+ Individuals who have a natural curiosity and desire to solve problems are encouraged to apply
+ Must be legally authorized to work in India for any employer without sponsorship
+ Must be fluent in English and Hindi (written and spoken)
+ Successful completion of interview required to meet job qualification
+ Reliable, punctual attendance is an essential function of the position
**What will help you propel from the pack (Preferred Qualifications):**
+ Masters in computer science or related STEM field
+ Experience with cloud based systems like AWS, AZURE or Google Cloud
+ Certified Developer / Architect on AWS
+ Strong experience with continuous integration & delivery using Agile methodologies
+ Data engineering experience with transportation/airline industry
+ Strong problem-solving skills
+ Strong knowledge in Big Data