21,544 Data Engineer jobs in India

Data Engineer

Pune, Maharashtra SoftClouds


Job Description

Job Title: Oracle Data Integrator

Job Location: Remote/ India


Job Description:

·   Maintains and develops integrations in ODI.

·   Should be able to analyze PL/SQL and SQL query-related issues.

·   Should be able to write and optimize SQL queries (see the SQL sketch after this list).

·   Analyze data and metadata member update-related issues with source and boundary systems; should have exposure to mapping different data sources such as ERP, EPM, databases, files, etc.

·   Support downstream reporting applications (SAP, Business Objects/FMIS, Essbase, PBCS) by leveraging FDMEE, load rules, SQL routines, and other technology.

·   Assist with technical infrastructure issues.

·   Works with internal business stakeholders to analyze and document business requirements, processes, and related business rules.

·   Ability to work independently, be goal-oriented, and work as part of a team.

·   Define and manage process improvement implementations that support the optimal technology solution.

·   Standardize data naming, establish consistent data definitions, and monitor/audit data quality to ensure data is clean and credible to support operational and strategic decision-making.

·   Demonstrates an eye for technical detail, with strong analytical skills and the ability to communicate ideas to stakeholders and users of the system in non-technical terms to effectively meet their business needs.

·   Works as a consulting team member or an individual technical consultant in the development of technical solutions of moderate complexity within the consulting practice area.
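As a generic illustration of the SQL analysis and tuning work described in the list above (not SoftClouds' actual code), the sketch below replaces a row-by-row update with a single set-based statement. It uses Python's built-in sqlite3 purely so it runs anywhere; in ODI the same idea applies to Oracle SQL/PL-SQL, and the table names are hypothetical.

```python
# Illustrative only: a set-based rewrite of a row-by-row update, the kind of
# SQL tuning this role describes. sqlite3 is used purely so the sketch is
# self-contained; table and column names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_balance (member_id INTEGER, amount REAL);
    CREATE TABLE dim_member  (member_id INTEGER PRIMARY KEY, total REAL DEFAULT 0);
    INSERT INTO stg_balance VALUES (1, 10.0), (1, 5.0), (2, 7.5);
    INSERT INTO dim_member (member_id) VALUES (1), (2);
""")

# Instead of looping over staging rows in application code (slow and chatty),
# push the aggregation and the update down to the database as one statement.
cur.execute("""
    UPDATE dim_member
    SET total = (SELECT SUM(amount) FROM stg_balance s
                 WHERE s.member_id = dim_member.member_id)
    WHERE member_id IN (SELECT DISTINCT member_id FROM stg_balance)
""")
conn.commit()
print(cur.execute("SELECT * FROM dim_member ORDER BY member_id").fetchall())
```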


Education Qualification:

  • Bachelor’s Degree in Computer Science or equivalent

Job No Longer Available

This position is no longer listed on WhatJobs. The employer may be reviewing applications, may have filled the role, or may have removed the listing.

However, we have similar jobs available for you below.

Data Engineer- Lead Data Engineer

Bengaluru, Karnataka Paytm

Posted today


Job Description

Role Overview



We are seeking an experienced Lead Data Engineer to join our Data Engineering team at Paytm, India's leading digital payments and financial services platform. This is a critical role responsible for designing, building, and maintaining large-scale, real-time data streams that process billions of transactions and user interactions daily. Data accuracy and stream reliability are essential to our operations, as data quality issues can result in financial losses and impact customer trust.

As a Lead Data Engineer at Paytm, you will be responsible for building robust data systems that support India's largest digital payments ecosystem. You'll architect and implement reliable, real-time data streaming solutions where precision and data correctness are fundamental requirements. Your work will directly support millions of users across merchant payments, peer-to-peer transfers, bill payments, and financial services, where data accuracy is crucial for maintaining customer confidence and operational excellence.


This role requires expertise in designing fault-tolerant, scalable data architectures that maintain high uptime standards while processing peak transaction loads during festivals and high-traffic events. We place the highest priority on data quality and system reliability, as our customers depend on accurate, timely information for their financial decisions. You'll collaborate with cross-functional teams including data scientists, product managers, and risk engineers to deliver data solutions that enable real-time fraud detection, personalized recommendations, credit scoring, and regulatory compliance reporting.


Key technical challenges include maintaining data consistency across distributed systems with demanding performance requirements, implementing comprehensive data quality frameworks with real-time validation, optimizing query performance on large datasets, and ensuring complete data lineage and governance across multiple business domains. At Paytm, reliable data streams are fundamental to our operations and our commitment to protecting customers' financial security and maintaining India's digital payments infrastructure.


Key Responsibilities


Data Stream Architecture & Development

- Design and implement reliable, scalable data streams handling high-volume transaction data with strong data integrity controls
- Build real-time processing systems using modern data engineering frameworks (Java/Python stack) with excellent performance characteristics
- Develop robust data ingestion systems from multiple sources with built-in redundancy and monitoring capabilities
- Implement comprehensive data quality frameworks, ensuring the 4 C's (Completeness, Consistency, Conformity, and Correctness) and data reliability that supports sound business decisions (see the sketch below)
- Design automated data validation, profiling, and quality monitoring systems with proactive alerting capabilities

Infrastructure & Platform Management

- Manage and optimize distributed data processing platforms with high availability requirements to ensure consistent service delivery
- Design data lake and data warehouse architectures with appropriate partitioning and indexing strategies for optimal query performance
- Implement CI/CD processes for data engineering workflows with comprehensive testing and reliable deployment procedures
- Ensure high availability and disaster recovery for critical data systems to maintain business continuity
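To make the 4 C's concrete, here is a minimal, hedged sketch of rule-based checks over a transactions DataFrame. It is not Paytm's actual framework: it assumes pyspark is installed, and the column names, allowed values, and alerting behaviour are hypothetical.

```python
# A minimal, illustrative 4 C's check: Completeness, Consistency, Conformity,
# Correctness. Column names and value domains are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
txns = spark.createDataFrame(
    [("t1", "UPI", 120.0), ("t2", "CARD", -5.0), ("t3", None, 80.0)],
    ["txn_id", "channel", "amount"],
)

total = txns.count()
checks = {
    # Completeness: mandatory fields must be populated.
    "completeness_channel": txns.filter(F.col("channel").isNull()).count(),
    # Conformity: channel must come from the expected domain of values.
    "conformity_channel": txns.filter(~F.col("channel").isin("UPI", "CARD", "WALLET")).count(),
    # Correctness: transaction amounts must be positive.
    "correctness_amount": txns.filter(F.col("amount") <= 0).count(),
    # Consistency: txn_id must be unique within the batch.
    "consistency_txn_id": total - txns.select("txn_id").distinct().count(),
}
failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # In production this would raise an alert; here we simply report.
    print(f"Data quality violations out of {total} rows: {failed}")
```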


Performance & Optimization

- Monitor and optimize streaming performance with focus on latency reduction and operational efficiency
- Implement efficient data storage strategies including compression, partitioning, and lifecycle management with cost considerations
- Troubleshoot and resolve complex data streaming issues in production environments with effective response protocols
- Conduct proactive capacity planning and performance tuning to support business growth and data volume increases


Collaboration & Leadership

- Work closely with data scientists, analysts, and product teams to understand important data requirements and service level expectations
- Mentor junior data engineers with emphasis on data quality best practices and customer-focused approach
- Participate in architectural reviews and help establish data engineering standards that prioritize reliability and accuracy
- Document technical designs, processes, and operational procedures with focus on maintainability and knowledge sharing


Required Qualifications


Experience & Education

- Bachelor's or Master's degree in Computer Science, Engineering, or related technical field
- 7+ years (Senior) of hands-on data engineering experience
- Proven experience with large-scale data processing systems (preferably in fintech/payments domain)
- Experience building and maintaining production data streams processing TB/PB scale data with strong performance and reliability standards


Technical Skills & Requirements

Programming Languages: Expert-level proficiency in both Python and Java; experience with Scala preferred


Big Data Technologies: Apache Spark (PySpark, Spark SQL, Spark with Java), Apache Kafka, Apache Airflow

Cloud Platforms: AWS (EMR, Glue, Redshift, S3, Lambda) or equivalent Azure/GCP services

Databases: Strong SQL skills, experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Redis)

Data Quality Management: Deep understanding of the 4 C's framework - Completeness, Consistency, Conformity, and Correctness

Data Governance: Experience with data lineage tracking, metadata management, and data cataloging

Data Formats & Protocols: Parquet, Avro, JSON, REST APIs, GraphQL

Containerization & DevOps: Docker, Kubernetes, Git, GitLab/GitHub with CI/CD pipeline experience

Monitoring & Observability: Experience with Prometheus, Grafana, or similar monitoring tools

Data Modeling: Dimensional modeling, data vault, or similar methodologies

Streaming Technologies: Apache Flink, Kinesis, or Pulsar experience is a plus

Infrastructure as Code: Terraform, CloudFormation (preferred)

Java-specific: Spring Boot, Maven/Gradle, JUnit for building robust data services
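As a hedged illustration of the Airflow experience listed above (not the team's actual pipeline), the sketch below wires a hypothetical ingest → validate → publish batch hop; the task bodies are stubs and the DAG id is made up.

```python
# Illustrative Airflow 2.x DAG chaining ingestion, 4 C's validation, and
# publication. Task bodies are stubs; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest(**_):
    """Pull a micro-batch from Kafka / source systems (stub)."""

def validate(**_):
    """Run completeness, consistency, conformity, correctness checks (stub)."""

def publish(**_):
    """Write curated data to the warehouse / data lake (stub)."""

with DAG(
    dag_id="txn_quality_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # Airflow >= 2.4; older releases use schedule_interval
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_publish = PythonOperator(task_id="publish", python_callable=publish)
    t_ingest >> t_validate >> t_publish
```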


Preferred Qualifications


Domain Expertise

- Previous experience in fintech, payments, or banking industry with solid understanding of regulatory compliance and financial data requirements
- Understanding of financial data standards, PCI DSS compliance, and data privacy regulations where compliance is essential for business operations
- Experience with real-time fraud detection or risk management systems where data accuracy is crucial for customer protection


Advanced Technical Skills (Preferred)


- Experience building automated data quality frameworks covering all 4 C's dimensions
- Knowledge of machine learning stream orchestration (MLflow, Kubeflow)
- Familiarity with data mesh or federated data architecture patterns
- Experience with change data capture (CDC) tools and techniques


Leadership & Soft Skills

- Strong problem-solving abilities with experience debugging complex distributed systems in production environments
- Excellent communication skills with ability to explain technical concepts to diverse stakeholders while highlighting business value
- Experience mentoring team members and leading technical initiatives with focus on building a quality-oriented culture
- Proven track record of delivering projects successfully in dynamic, fast-paced financial technology environments


What We Offer


- Opportunity to work with cutting-edge technology at scale
- Competitive salary and equity compensation
- Comprehensive health and wellness benefits
- Professional development opportunities and conference attendance
- Flexible working arrangements
- Chance to impact millions of users across India's digital payments ecosystem


Application Process


Interested candidates should submit:

- Updated resume highlighting relevant data engineering experience with emphasis on real-time systems and data quality
- Portfolio or GitHub profile showcasing data engineering projects, particularly those involving high-throughput streaming systems
- Cover letter explaining interest in fintech/payments domain and understanding of data criticality in financial services
- References from previous technical managers or senior colleagues who can attest to your data quality standards







Senior Data Engineer / Data Engineer

Kochi, Kerala Invokhr

Posted today


Job Description

Looking for immediate joiners or candidates with a 15-day notice period. This is a work-from-home opportunity.

Position: Senior Data Engineer / Data Engineer

Desired Experience: 3-8 years

Salary: Best-in-industry

You will act as a key member of the Data consulting team, working directly with the partners and senior stakeholders of the clients, designing and implementing big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solution attitude.

What is in it for you:

Opportunity to work with a world-class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques

Fast-track career growth in a highly entrepreneurial work environment

Best-in-industry remuneration package

Essential Technical Skills:

Technical expertise with emerging Big Data technologies, such as Python, Spark, Hadoop, Clojure, Git, SQL and Databricks, and visualization tools such as Tableau and Power BI

Experience with cloud, container and microservice infrastructures

Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Hands-on experience with data modelling, query techniques and complexity analysis

Desirable Skills:

Experience/knowledge of working in an agile environment and experience with agile methodologies such as Scrum

Experience of working with development teams and product owners to understand their requirements

Certifications in any of the above areas will be preferred.

Your duties will include:

Develop data solutions within Big Data Azure and/or other cloud environments

Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Build and design Data Architectures using Azure Data Factory, Databricks, Data Lake, Synapse

Liaising with the CTO, Product Owners and other Operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions

Perform data mapping activities to describe source data, target data and the high-level or detailed transformations that need to occur (see the sketch after this list)

Assist the Data Analyst team in developing KPIs and reporting in tools such as Power BI and Tableau

Data Integration, Transformation, Modelling

Maintaining all relevant documentation and knowledge bases

Research and suggest new database products, services and protocols
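As referenced in the duties above, this is a minimal sketch, assuming a Databricks/PySpark environment, of a source-to-target mapping: renaming, casting, and cleaning columns according to a mapping table. The mapping, formats, and column names are hypothetical.

```python
# Illustrative source-to-target data mapping in PySpark; not a client pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
src = spark.createDataFrame(
    [("C001", "2024-01-05", "1,250.00")],
    ["CUST_NO", "TXN_DT", "AMT_TXT"],
)

# source column -> (target column, transformation to apply)
mapping = {
    "CUST_NO": ("customer_id", F.col("CUST_NO")),
    "TXN_DT": ("transaction_date", F.to_date("TXN_DT", "yyyy-MM-dd")),
    "AMT_TXT": ("amount", F.regexp_replace("AMT_TXT", ",", "").cast("decimal(18,2)")),
}

target = src.select([expr.alias(tgt) for tgt, expr in mapping.values()])
target.printSchema()  # documents the target schema implied by the mapping
```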

Essential Personal Traits:

You should be able to work independently and communicate effectively with remote teams.

Timely communication/escalation of issues/dependencies to higher management.

Curiosity to learn and apply emerging technologies to solve business problems


Senior Data Engineer / Data Engineer

Gurugram, Haryana Invokhr

Posted today


Job Description

Desired Experience: 3-8 years

Salary: Best-in-industry

Location: Gurgaon ( 5 days onsite)


Overview:

You will act as a key member of the Data consulting team, working directly with the partners and senior stakeholders of the clients, designing and implementing big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solution attitude.

What is in it for you:

Opportunity to work with a world class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques

Fast track career growth in a highly entrepreneurial work environment

Best-in-industry remuneration package

Essential Technical Skills:

Technical expertise with emerging Big Data technologies, such as: Python, Spark, Hadoop, Clojure, Git, SQL and Databricks; and visualization tools: Tableau and PowerBI

Experience with cloud, container and micro service infrastructures

Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Hands-on experience with data modelling, query techniques and complexity analysis

Desirable Skills:

Experience/Knowledge of working in an agile environment and experience with agile methodologies such as Scrum

Experience of working with development teams and product owners to understand their requirements

Certifications on any of the above areas will be preferred.

Your duties will include:

Develop data solutions within a Big Data Azure and/or other cloud environments

Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Build and design Data Architectures using Azure Data Factory, Databricks, Data Lake, Synapse

Liaising with CTO, Product Owners and other Operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions

Perform data mapping activities to describe source data, target data and the high-level or detailed transformations that need to occur;

Assist Data Analyst team in developing KPIs and reporting in tools viz. Power BI, Tableau

Data Integration, Transformation, Modelling

Maintaining all relevant documentation and knowledge bases

Research and suggest new database products, services and protocols

Essential Personal Traits:

You should be able to work independently and communicate effectively with remote teams.

Timely communication/escalation of issues/dependencies to higher management.

Curiosity to learn and apply emerging technologies to solve business problems


** Interested candidates, please send their resume to - and **


Data Engineer

Gurugram, Haryana United Airlines

Posted today


Job Description

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network.
Come join us to create what's next. Let's define tomorrow, together.
**Description**
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Find your future at United! We're reinventing what our industry looks like, and what an airline can be - from the planes we fly to the people who fly them. When you join us, you're joining a global team of 100,000+ connected by a shared passion with a wide spectrum of experience and skills to lead the way forward.
Achieving our ambitions starts with supporting yours. Evolve your career and find your next opportunity. Get the care you need with industry-leading health plans and best-in-class programs to support your emotional, physical, and financial wellness. Expand your horizons with travel across the world's biggest route network. Connect outside your team through employee-led Business Resource Groups.
Create what's next with us. Let's define tomorrow together.
**Job overview and responsibilities**
Data Engineering organization is responsible for driving data driven insights & innovation to support the data needs for commercial and operational projects with a digital focus.
· The Data Engineer will be responsible for partnering with various teams to define and execute data acquisition, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth
· Design, develop, and implement streaming and near-real time data pipelines that feed systems that are the operational backbone of our business
· Execute unit tests and validate expected results to ensure accuracy & integrity of data and applications through analysis, coding, writing clear documentation and problem resolution (see the sketch after this list).
· This role will also drive the adoption of data processing and analysis within the Hadoop environment and help cross train other members of the team.
· Leverage strategic and analytical skills to understand and solve customer and business centric questions
· Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies and partners
· Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business
· Develop and implement innovative solutions leading to automation
· Use of Agile methodologies to manage projects
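A hedged example of the unit-testing habit described in the responsibilities above: a small, pure transformation function plus a pytest case that pins its expected result. The function and field names are illustrative, not United's codebase.

```python
# Run with: pytest test_enrich_fare.py  (names are illustrative)

def enrich_fare(record: dict) -> dict:
    """Return a copy of the record with a derived total_fare field."""
    return {**record, "total_fare": round(record["base_fare"] + record["taxes"], 2)}

def test_enrich_fare():
    out = enrich_fare({"base_fare": 5000.0, "taxes": 650.55})
    assert out["total_fare"] == 5650.55
```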
**This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.**
**Qualifications**
**What's needed to succeed (Minimum Qualifications):**
· BS/BA in Computer Science or a related STEM field.
· 2+ years in IT/software development with hands-on expertise in Java, Python, Scala.
· 2+ years with Big Data technologies (PySpark, Hadoop, Hive, Kafka, Airflow).
· 2+ years with RDBMS and Data Warehouse (SQL Server, Oracle, Redshift).
· Experience/understanding of LLMs and LangChain/LangGraph frameworks.
· Problem-solving mindset with curiosity, creativity, and attention to detail.
· Legally authorized to work in India without sponsorship.
· Fluent in English and Hindi (written and spoken).
**What will help you propel from the pack (Preferred Qualifications):**
· Master's degree in computer science or related STEM field.
· Experience with **cloud platforms** (AWS, Azure, GCP); AWS certification preferred.
· Understanding of ML frameworks and LLM-based solutions.
· Proficiency in SQL, including performance tuning and troubleshooting.
· Experience in **transportation/airline industry** data engineering.
· Proven success with Agile practices, CI/CD, and scalable architectures.
· Strong problem-solving skills with deep knowledge of Big Data ecosystems.

Data Engineer

Hyderabad, Telangana Kyndryl

Posted today


Job Description

**Who We Are**
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
**The Role**
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset - a true data alchemist.
Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
**Who You Are**
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
Required Technical and Professional Expertise
+ 6-8 years of experience working as a Data Engineer or in Azure cloud modernization
+ Good experience in Power BI for data visualization and dashboard development
+ Strong experience in developing data pipelines and using tools such as AWS Glue, Azure Databricks, Synapse, or Google Dataproc
+ Proficient in working with both relational and NoSQL databases, including PostgreSQL, DB2, and MongoDB
+ Excellent problem-solving, analytical, and critical thinking skills
+ Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail
+ Expertise in data mining, data storage, and Extract-Transform-Load (ETL) processes
Preferred Technical and Professional Experience
+ Experience in Data Modelling, to create conceptual model of how data is connected and how it will be used in business processes
+ Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
+ Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
+ Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
+ Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
**Being You**
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
**What You Can Expect**
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
**Get Referred!**
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer

Bangalore, Karnataka Kyndryl

Posted today


Job Description

**Who We Are**
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
**The Role**
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset - a true data alchemist.
Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
**Who You Are**
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
Required Technical and Professional Expertise
+ 6-8 years of experience working as a Data Engineer or in Azure cloud modernization
+ Good experience in Power BI for data visualization and dashboard development
+ Strong experience in developing data pipelines and using tools such as AWS Glue, Azure Databricks, Synapse, or Google Dataproc
+ Proficient in working with both relational and NoSQL databases, including PostgreSQL, DB2, and MongoDB
+ Excellent problem-solving, analytical, and critical thinking skills
+ Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail
+ Expertise in data mining, data storage, and Extract-Transform-Load (ETL) processes
Preferred Technical and Professional Experience
+ Experience in Data Modelling, to create conceptual model of how data is connected and how it will be used in business processes
+ Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
+ Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
+ Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
+ Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
**Being You**
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
**What You Can Expect**
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
**Get Referred!**
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer

Pune, Maharashtra Cummins Inc.

Posted today


Job Description

**DESCRIPTION**
**Note: Although the role category specified in the GPP is Remote, the requirement is for a hybrid working model from the Cummins Pune office.**
**Job Summary:**
Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale.
**Key Responsibilities:**
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users.
- Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Develops physical data models and implements data storage architectures as per design guidelines.
- Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
- Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycles, for data-driven applications.
**RESPONSIBILITIES**
**Competencies:**
System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
Customer focus - Building strong customer relationships and delivering customer-centric solutions.
Decision quality - Making good and timely decisions that keep the organization moving forward.
Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies.
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making.
Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.
**Education, Licenses, Certifications:**
College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.
**Experience:**
4-5 years of experience.
Relevant experience preferred such as working in a temporary student employment, intern, co-op, or other extracurricular team activities.
Knowledge of the latest technologies in data engineering is highly preferred and includes:
- Exposure to open-source Big Data technologies
- Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Familiarity developing applications requiring large file movement for a Cloud-based environment
- Exposure to Agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology
**QUALIFICATIONS**
1) Work closely with business Product Owner to understand product vision.
2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards.
4) Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP) with guidance and help from senior data engineers.
6) Take part in evaluation of new data tools, POCs with guidance and help from senior data engineers.
7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
8) Assist to resolve issues that compromise data accuracy and usability.
1. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
2. Database Management: Intermediate level expertise in SQL and NoSQL databases.
3. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
4. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
5. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
6. API: Working knowledge of APIs to consume data from ERP and CRM systems (see the sketch below)
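As referenced in item 6, here is a minimal sketch of consuming CRM data via a REST API and landing it as JSON lines before any transformation. It assumes the requests library; the endpoint, token, pagination scheme, and field names are hypothetical.

```python
# Illustrative API ingestion into a raw landing file; endpoint is hypothetical.
import json

import requests

def fetch_crm_accounts(base_url: str, token: str, page_size: int = 500):
    """Yield account records page by page from a paginated REST API."""
    page = 1
    while True:
        resp = requests.get(
            f"{base_url}/accounts",
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page, "page_size": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        records = resp.json().get("items", [])
        if not records:
            break
        yield from records
        page += 1

if __name__ == "__main__":
    # Land the raw payload as JSON lines in the data lake landing zone.
    with open("crm_accounts.jsonl", "w") as fh:
        for rec in fetch_crm_accounts("https://crm.example.com/api", token="***"):
            fh.write(json.dumps(rec) + "\n")
```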
**Job** Systems/Information Technology
**Organization** Cummins Inc.
**Role Category** Remote
**Job Type** Exempt - Experienced
**ReqID**
**Relocation Package** Yes

Data Engineer

Chennai, Tamil Nadu Citigroup

Posted today


Job Description

**The Role**
The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team.
**Responsibilities**
+ Developing and supporting scalable, extensible, and highly available data solutions
+ Deliver on critical business priorities while ensuring alignment with the wider architectural vision
+ Identify and help address potential risks in the data supply chain
+ Follow and contribute to technical standards
+ Design and develop analytical data models
**Required Qualifications & Work Experience**
+ First Class Degree in Engineering/Technology (4-year graduate course)
+ 3 to 4 years' experience implementing data-intensive solutions using agile methodologies
+ Experience of relational databases and using SQL for data querying, transformation and manipulation
+ Experience of modelling data for analytical consumers
+ Ability to automate and streamline the build, test and deployment of data pipelines
+ Experience in cloud native technologies and patterns
+ A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
+ Excellent communication and problem-solving skills
**T** **echnical Skills (Must Have)**
+ **ETL:** Hands on experience of building data pipelines. Proficiency in at least one of the data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
+ **Big Data** :Exposure to 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
+ **Data Warehousing & Database Management** : Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
+ **Data Modeling & Design** : Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
+ **Languages** : Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
+ **DevOps** : Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
**Technical Skills (Valuable)**
+ **Ab Initio** : Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows
+ **Cloud** : Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
+ **Data Quality & Controls** : Exposure to data validation, cleansing, enrichment and data controls
+ **Containerization** : Fair understanding of containerization platforms like Docker, Kubernetes
+ **File Formats** : Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
+ **Others** : Basics of Job scheduler like Autosys. Basics of Entitlement management
Certification on any of the above topics would be an advantage.
---
**Job Family Group:**
Technology
---
**Job Family:**
Digital Software Engineering
---
**Time Type:**
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.

Data Engineer

Pune, Maharashtra Citigroup

Posted today


Job Description

**The Role**
The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team.
**Responsibilities**
+ Developing and supporting scalable, extensible, and highly available data solutions
+ Deliver on critical business priorities while ensuring alignment with the wider architectural vision
+ Identify and help address potential risks in the data supply chain
+ Follow and contribute to technical standards
+ Design and develop analytical data models
**Required Qualifications & Work Experience**
+ First Class Degree in Engineering/Technology (4-year graduate course)
+ 5 to 8 years' experience implementing data-intensive solutions using agile methodologies
+ Experience of relational databases and using SQL for data querying, transformation and manipulation
+ Experience of modelling data for analytical consumers
+ Ability to automate and streamline the build, test and deployment of data pipelines
+ Experience in cloud native technologies and patterns
+ A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
+ Excellent communication and problem-solving skills
**T** **echnical Skills (Must Have)**
+ **ETL:** Hands on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
+ **Big Data** : Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
+ **Data Warehousing & Database Management** : Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
+ **Data Modeling & Design** : Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
+ **Languages** : Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
+ **DevOps** : Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
**Technical Skills (Valuable)**
+ **Ab Initio** : Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows
+ **Cloud** : Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
+ **Data Quality & Controls** : Exposure to data validation, cleansing, enrichment and data controls
+ **Containerization** : Fair understanding of containerization platforms like Docker, Kubernetes
+ **File Formats** : Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
+ **Others** : Basics of Job scheduler like Autosys. Basics of Entitlement management
+ Certification on any of the above topics would be an advantage.
---
**Job Family Group:**
Technology
---
**Job Family:**
Digital Software Engineering
---
**Time Type:**
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.
 
