21,558 Cloud Data Engineer jobs in India

Cloud Data Engineer

CAI

Posted 3 days ago

Job Description

Cloud Data Engineer
**Req number:**
R5934
**Employment type:**
Full time
**Worksite flexibility:**
Remote
**Who we are**
CAI is a global technology services firm with over 8,500 associates worldwide and annual revenue of more than $1 billion. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right, whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.
**Job Summary**
We are seeking a motivated Cloud Data Engineer who has experience building data products using Databricks and related technologies. This is a full-time, remote position.
**Job Description**
**What You'll Do**
+ Analyze and understand existing data warehouse implementations to support migration and consolidation efforts.
+ Reverse-engineer legacy stored procedures (PL/SQL, SQL) and translate business logic into scalable Spark SQL code within Databricks notebooks.
+ Design and develop data lake solutions on AWS using S3 and Delta Lake architecture, leveraging Databricks for processing and transformation.
+ Build and maintain robust data pipelines using ETL tools with ingestion into S3 and processing in Databricks.
+ Collaborate with data architects to implement ingestion and transformation frameworks aligned with enterprise standards.
+ Evaluate and optimize data models (Star, Snowflake, Flattened) for performance and scalability in the new platform.
+ Document ETL processes, data flows, and transformation logic to ensure transparency and maintainability.
+ Perform foundational data administration tasks including job scheduling, error troubleshooting, performance tuning, and backup coordination.
+ Work closely with cross-functional teams to ensure smooth transition and integration of data sources into the unified platform.
+ Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and backlog grooming.
+ Triage, debug, and fix technical issues related to data lakes.
+ Maintain and manage code repositories such as Git.
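Reverse-engineering legacy PL/SQL into Spark SQL, as described above, usually begins with mechanical rewrites of Oracle-specific functions before any logic is restructured. A minimal, hypothetical helper sketching that first step (the mapping and function are illustrative, not CAI tooling; real migrations must also handle argument semantics, cursors, and procedural control flow):

```python
import re

# Illustrative mapping of a few common Oracle idioms to Spark SQL equivalents.
ORACLE_TO_SPARK = {
    "NVL": "coalesce",            # NVL(a, b) -> coalesce(a, b)
    "SYSDATE": "current_timestamp()",
    "SUBSTR": "substring",
}

def translate_expression(expr: str) -> str:
    """Rewrite Oracle function names in a SQL expression to Spark SQL names.

    Naive token-level rewrite only; word boundaries (\\b) keep it from
    mangling longer identifiers such as SUBSTRING or NVL2.
    """
    for oracle_name, spark_name in ORACLE_TO_SPARK.items():
        expr = re.sub(rf"\b{oracle_name}\b", spark_name, expr, flags=re.IGNORECASE)
    return expr

print(translate_expression("SELECT NVL(name, 'unknown'), SUBSTR(code, 1, 3) FROM customers"))
```

The rewritten expression can then be pasted into a Databricks notebook cell and validated against the legacy procedure's output.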
**What You'll Need**
+ 5+ years of experience working with **Databricks**, including Spark SQL and Delta Lake implementations.
+ 3+ years of experience in designing and implementing data lake architectures on Databricks.
+ Strong SQL and PL/SQL skills with the ability to interpret and refactor legacy stored procedures.
+ Hands-on experience with data modeling and warehouse design principles.
+ Proficiency in at least one programming language (Python, Scala, Java).
+ Bachelor's degree in Computer Science, Information Technology, Data Engineering, or related field.
+ Experience working in Agile environments and contributing to iterative development cycles.
+ Databricks cloud certification is a big plus.
+ Exposure to enterprise data governance and metadata management practices.
**Physical Demands**
+ This role involves mostly sedentary work, with occasional movement around the office to attend meetings, etc.
+ Ability to perform repetitive tasks on a computer, using a mouse, keyboard, and monitor.
**Reasonable accommodation statement**
If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to or (888) 824 - 8111.
This advertiser has chosen not to accept applicants from your region.

Cloud Data Engineer

Chennai, Tamil Nadu Giggso

Posted 4 days ago

Job Description

Key Responsibilities:

• Design, develop, and maintain cloud-based solutions on Azure or AWS.

• Implement and manage real-time data streaming and messaging systems using Kafka.

• Develop scalable applications and services using Java and Python.

• Deploy, manage, and monitor containerized applications using Kubernetes.

• Build and optimize big data processing pipelines using Databricks.

• Manage and maintain databases, including SQL Server and Snowflake, and write complex SQL scripts.

• Work with Unix/Linux commands to manage and monitor system operations.

• Collaborate with cross-functional teams to ensure seamless integration of cloud-based solutions.


Key Skills:

• Expertise in Azure or AWS cloud platforms.

• Proficiency in Kafka, Java, Python, and Kubernetes.

• Hands-on experience with Databricks for big data processing.

• Strong database management skills with SQL Server, Snowflake, and advanced SQL scripting.

• Solid understanding of Unix/Linux commands.
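The real-time Kafka work listed above typically delivers messages at-least-once, so consumers must tolerate redelivery. A broker-free toy sketch of the core idea, offset-based deduplication (plain Python standing in for a real Kafka client; the class and its behavior are illustrative only):

```python
class IdempotentConsumer:
    """Process messages at-least-once while skipping already-seen offsets.

    Stands in for a real Kafka consumer: (partition, offset) pairs identify
    messages, and redeliveries after a rebalance are ignored.
    """

    def __init__(self):
        self.committed = {}   # partition -> highest processed offset
        self.results = []

    def handle(self, partition: int, offset: int, value: str) -> bool:
        # Skip duplicates: anything at or below the committed offset.
        if offset <= self.committed.get(partition, -1):
            return False
        self.results.append(value)
        self.committed[partition] = offset
        return True

consumer = IdempotentConsumer()
consumer.handle(0, 0, "a")
consumer.handle(0, 1, "b")
consumer.handle(0, 1, "b")   # redelivery after a rebalance: ignored
consumer.handle(0, 2, "c")
print(consumer.results)      # duplicates filtered out
```

This sketch assumes in-order offsets per partition; production consumers would persist the committed offsets (for example, back to Kafka itself) rather than keep them in memory.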


General Requirements for Both Off-Shore Roles:

• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

• 5+ years of experience in cloud and data engineering roles.

• Strong problem-solving and analytical skills.

• Excellent communication and collaboration abilities.

• Proven ability to work in a fast-paced, agile environment.


Cloud Data Engineer

Punjabi Bagh, Delhi Confidential

Posted today

Job Description

Key Responsibilities:

Design and Development:

  • Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.).
  • Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP).
  • Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse.
  • Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others.
  • Develop and optimize data processing jobs using Spark Scala.

Data Integration and Management:

  • Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems into the data lake.
  • Ensure data quality and integrity through rigorous testing and validation.
  • Perform data extraction from SAP or ERP systems when necessary.
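"Rigorous testing and validation" of ingested records often starts with simple schema checks before load. A hypothetical stdlib-only validator (illustrative, not this employer's actual framework; the field names are made up):

```python
def validate_record(record: dict, required: dict) -> list:
    """Return a list of validation errors for one ingested record.

    `required` maps field name -> expected Python type; order of errors
    follows the schema's field order.
    """
    errors = []
    for field, expected_type in required.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors

# Hypothetical schema and records for illustration.
schema = {"id": int, "amount": float, "currency": str}
good = {"id": 1, "amount": 9.99, "currency": "USD"}
bad = {"id": "1", "currency": "USD"}

print(validate_record(good, schema))   # empty list: record is clean
print(validate_record(bad, schema))    # type and missing-field errors
```

Records failing validation would typically be routed to a quarantine table or dead-letter location rather than loaded into the lake.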

Performance Optimization:

  • Monitor and optimize the performance of data pipelines and ETL processes.
  • Implement best practices for data management, including data governance, security, and compliance.

Collaboration and Communication:

  • Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Collaborate with cross-functional teams to design and implement data solutions that meet business needs.

Documentation and Maintenance:

  • Document technical solutions, processes, and workflows.
  • Maintain and troubleshoot existing ETL pipelines and data integrations.

Education:

  • Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus.

Experience:

  • 7+ years of experience as a Data Engineer or in a similar role.
  • Proven experience with cloud platforms: AWS, Azure, and GCP.
  • Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.
  • Experience with other ETL tools like Informatica, SAP Data Intelligence, etc.
  • Experience in building and managing data lakes and data warehouses.
  • Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse.
  • Experience with data extraction from SAP or ERP systems is a plus.
  • Strong experience with Spark and Scala for data processing.

Skills:

  • Strong programming skills in Python, Java, or Scala.
  • Proficient in SQL and query optimization techniques.
  • Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts.
  • Knowledge of data governance, security, and compliance best practices.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.

Preferred Qualifications:

  • Experience with other data tools and technologies such as Apache Spark, or Hadoop.
  • Certifications in cloud platforms (AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate).
  • Experience with CI/CD pipelines and DevOps practices for data engineering.
  • Selected applicant will be subject to a background investigation, which will be conducted and the results of which will be used in compliance with applicable law.

Skills Required
Apache Spark, Hadoop, DevOps, SQL, Programming Skills

Cloud Data Engineer

Hyderabad, Telangana Confidential

Posted today

Job Description

Key Responsibilities:
  • Design and develop scalable ETL/ELT pipelines using Dataflow, Dataproc, Apache Beam, Cloud Composer (Airflow), and other GCP services.
  • Build and maintain data lakes and data warehouses using BigQuery, Cloud Storage, and Cloud SQL/Spanner.
  • Implement and optimize data ingestion from a variety of structured and unstructured sources using GCP-native tools and APIs.
  • Work with Pub/Sub, Cloud Functions, and Eventarc for real-time data processing and streaming pipelines.
  • Ensure data governance, quality, and security using best practices for IAM, encryption, data cataloging, and auditing.
  • Collaborate with stakeholders to gather requirements, define data models, and deliver insights via BI tools.
  • Automate workflows and monitoring using tools like Cloud Monitoring, Logging, and Terraform or Deployment Manager.
  • Stay current with the latest GCP tools, features, and trends in cloud data engineering.
Qualifications and Requirements:
  • Bachelor's degree in Computer Science, Data Engineering, or related field.
  • 3+ years of experience in data engineering, including at least 1–2 years on Google Cloud Platform.
  • Proficiency with SQL, Python, and/or Java for data processing and scripting.
  • Hands-on experience with GCP services like:
  • BigQuery
  • Cloud Storage
  • Dataflow / Dataproc
  • Pub/Sub
  • Composer
  • Cloud Functions
  • Solid understanding of data modeling, partitioning, performance tuning, and schema design.
  • Experience with DevOps practices and CI/CD pipelines for data projects.
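Streaming services such as Pub/Sub feeding Dataflow group events into time windows before aggregating. A toy event-time tumbling-window sum in plain Python, illustrating the concept behind Beam's FixedWindows + CombinePerKey without the Beam API (the event data is invented for the example):

```python
from collections import defaultdict

def tumbling_window_sums(events, window_seconds):
    """Group (timestamp, value) events into fixed-size windows and sum each.

    Returns {window_start: total}. Integer division maps every event
    timestamp onto the start of its enclosing window.
    """
    sums = defaultdict(int)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        sums[window_start] += value
    return dict(sums)

# Events as (epoch-second, value) pairs; 60-second tumbling windows.
events = [(0, 1), (30, 2), (65, 5), (119, 3), (120, 7)]
print(tumbling_window_sums(events, 60))  # {0: 3, 60: 8, 120: 7}
```

A real pipeline also needs watermarks and triggers to decide when a window is complete despite late data, which is exactly what Dataflow adds on top of this core grouping idea.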

Skills Required
AWS DevOps, BigQuery, Python, SQL, Java

Cloud data engineer

Gurugram, Haryana Epergne Solutions

Posted today

Job Description

Job Title: Cloud Data Engineer

Experience: 4 to 5 Years

Location: Noida / Gurgaon / Hyderabad / Bangalore / Pune

Notice Period: Immediate (up to 15 days)


Responsibilities:
  • Design, develop, and maintain automated data pipelines and ETL processes.
  • Create and optimize data tables and views in Snowflake.
  • Work with datasets across Azure, AWS, and various structured/unstructured formats.
  • Ensure data security, privacy, and compliance with industry and organizational standards (e.g., BHP standards).
  • Support and mentor junior team members by providing guidance on data modelling, data management, and data engineering best practices.


Required Skills & Experience:
  • Strong hands-on experience in data pipeline development, ETL scripting using Python, and handling data formats like JSON, CSV, etc.
  • Proficiency in AWS services such as:
  • AWS Glue
  • AWS Batch
  • AWS Step Functions
  • Experience with Azure services, including:
  • Azure Data Factory
  • Azure Logic Apps
  • Azure Functions
  • Azure Blob Storage
  • Solid understanding of:
  • Data management principles
  • Data structures & storage solutions
  • Data modeling techniques
  • Strong programming skills in:
  • Python (with OOP concepts)
  • PowerShell
  • Bash scripting
  • Advanced SQL skills, including writing and optimizing complex queries.
  • Working experience with Terraform and CI/CD pipelines for infrastructure and deployment automation.
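ETL scripting with JSON and CSV, as listed above, often reduces to flattening records and writing them out with a stable column order. A stdlib-only sketch (field names and sample data are invented for illustration):

```python
import csv
import io
import json

def json_to_csv(json_text: str, fields: list) -> str:
    """Convert a JSON array of objects into CSV text with the given columns.

    Missing keys become empty cells (restval="") and unexpected keys are
    dropped (extrasaction="ignore"), a common convention for ragged records.
    """
    records = json.loads(json_text)
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=fields, restval="", extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

raw = '[{"id": 1, "city": "Pune"}, {"id": 2}]'
print(json_to_csv(raw, ["id", "city"]))
```

In a pipeline, the resulting CSV text would be written to blob storage (for example S3 or Azure Blob Storage) for downstream loading.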
