2,667 Cloud Data jobs in India

Informatica Cloud Data Management Administrator

Mumbai, Maharashtra Confidential

Posted today


Job Description

We are seeking an experienced Informatica Cloud Data Management Administrator to manage, maintain, and support Informatica Cloud environments. The administrator will ensure the smooth operation of cloud data integration processes, monitor system performance, troubleshoot issues, and collaborate with cross-functional teams to implement data management solutions.
Key Responsibilities:
  • Administer and monitor Informatica Intelligent Cloud Services (IICS) environments to ensure optimal performance and availability.
  • Manage user roles, permissions, and security configurations within the Informatica Cloud platform.
  • Schedule, monitor, and troubleshoot data integration jobs and workflows.
  • Collaborate with developers and data architects to support the design and deployment of cloud data integration solutions.
  • Maintain and optimize data mappings, transformations, and workflows in Informatica Cloud.
  • Implement and enforce best practices for cloud data management, data quality, and metadata management.
  • Handle incident management and provide timely resolution for production issues.
  • Perform platform upgrades, patches, and configuration changes as needed.
  • Maintain documentation for system configurations, processes, and procedures.
  • Provide support and training to end-users and stakeholders.
  • Stay current with Informatica Cloud platform updates, features, and industry trends.
Required Skills & Qualifications:
  • 4 to 6 years of experience working with Informatica Cloud Data Management or Informatica Intelligent Cloud Services (IICS) administration.
  • Strong knowledge of cloud data integration concepts, ETL/ELT processes, and best practices.
  • Experience with Informatica Cloud user and security management.
  • Ability to monitor and troubleshoot data workflows and performance issues.
  • Familiarity with scheduling tools, job monitoring, and alert management.
  • Experience working with REST APIs, cloud connectors, and data sources such as databases, SaaS applications, and cloud storage.
  • Good understanding of data quality and metadata management principles.
  • Strong problem-solving and communication skills.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
  • Knowledge of scripting or automation tools is an advantage.
Preferred Skills:
  • Informatica Cloud certifications.
  • Experience with hybrid cloud environments (on-premise + cloud).
  • Knowledge of data governance and compliance requirements.
  • Familiarity with Agile development methodologies.
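The REST API and scripting requirements above typically translate into small automation scripts against the IICS platform API. Below is a minimal, hedged sketch that builds (but does not send) the v2 login request used to obtain a session id; the host, endpoint path, and JSON field names follow the publicly documented Informatica Cloud REST API v2 but should be treated as assumptions, and the credentials are placeholders.

```python
import json
import urllib.request

# Assumed IICS v2 login endpoint (US region host); verify against your pod's
# documented base URL before use.
IICS_LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def build_login_request(username: str, password: str) -> urllib.request.Request:
    """Build (but do not send) the JSON login request used to obtain a session id."""
    payload = json.dumps({"@type": "login", "username": username, "password": password})
    return urllib.request.Request(
        IICS_LOGIN_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        method="POST",
    )

# Placeholder credentials for illustration only.
req = build_login_request("admin@example.com", "not-a-real-password")
print(req.full_url)      # the assumed v2 login endpoint
print(req.get_method())  # POST
```

In practice the response's session id would be passed as a header on subsequent calls (for example, to poll job status), which is where the scheduling and alert-management duties listed above come in.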

Skills Required
AWS, Azure, Google Cloud, REST APIs, ETL, ELT
This advertiser has chosen not to accept applicants from your region.

Cloud Data Engineer

CAI

Posted 3 days ago


Job Description

Cloud Data Engineer
**Req number:**
R5934
**Employment type:**
Full time
**Worksite flexibility:**
Remote
**Who we are**
CAI is a global technology services firm with over 8,500 associates worldwide and annual revenue of over $1 billion. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right, whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.
**Job Summary**
We are seeking a motivated Cloud Data Engineer who has experience building data products using Databricks and related technologies. This is a full-time, remote position.
**Job Description**
**What You'll Do**
+ Analyze and understand existing data warehouse implementations to support migration and consolidation efforts.
+ Reverse-engineer legacy stored procedures (PL/SQL, SQL) and translate business logic into scalable Spark SQL code within Databricks notebooks.
+ Design and develop data lake solutions on AWS using S3 and Delta Lake architecture, leveraging Databricks for processing and transformation.
+ Build and maintain robust data pipelines using ETL tools with ingestion into S3 and processing in Databricks.
+ Collaborate with data architects to implement ingestion and transformation frameworks aligned with enterprise standards.
+ Evaluate and optimize data models (Star, Snowflake, Flattened) for performance and scalability in the new platform.
+ Document ETL processes, data flows, and transformation logic to ensure transparency and maintainability.
+ Perform foundational data administration tasks including job scheduling, error troubleshooting, performance tuning, and backup coordination.
+ Work closely with cross-functional teams to ensure smooth transition and integration of data sources into the unified platform.
+ Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and backlog grooming.
+ Triage, debug, and fix technical issues related to data lakes.
+ Maintain and manage code repositories in version-control systems such as Git.
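The stored-procedure refactoring described above usually means turning a row-by-row PL/SQL cursor loop into a single set-based statement, which is the form that ports directly to Spark SQL in a Databricks notebook. The sketch below illustrates that transformation with an invented schema, validated using Python's built-in sqlite3 so it runs anywhere; the table and column names are examples, not from any real system.

```python
import sqlite3

# Invented example: a legacy PL/SQL cursor loop that applies a 10% surcharge
# to each overdue invoice, refactored into one set-based UPDATE.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE invoices (id INTEGER PRIMARY KEY, amount REAL, overdue INTEGER);
    INSERT INTO invoices VALUES (1, 100.0, 1), (2, 200.0, 0), (3, 50.0, 1);
""")

# Legacy shape (pseudocode of the PL/SQL loop being retired):
#   FOR rec IN (SELECT id, amount FROM invoices WHERE overdue = 1) LOOP
#       UPDATE invoices SET amount = rec.amount * 1.1 WHERE id = rec.id;
#   END LOOP;
#
# Set-based rewrite: one statement, no per-row round trips. This is the shape
# a Spark SQL rewrite on Databricks would take.
conn.execute("UPDATE invoices SET amount = amount * 1.1 WHERE overdue = 1")

print(conn.execute("SELECT id, round(amount, 2) FROM invoices ORDER BY id").fetchall())
# -> [(1, 110.0), (2, 200.0), (3, 55.0)]
```

The set-based form also scales on Spark, where per-row updates inside a loop would defeat the engine's parallelism.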
**What You'll Need**
+ 5+ years of experience working with **Databricks**, including Spark SQL and Delta Lake implementations.
+ 3+ years of experience designing and implementing data lake architectures on Databricks.
+ Strong SQL and PL/SQL skills with the ability to interpret and refactor legacy stored procedures.
+ Hands-on experience with data modeling and warehouse design principles.
+ Proficiency in at least one programming language (Python, Scala, Java).
+ Bachelor's degree in Computer Science, Information Technology, Data Engineering, or related field.
+ Experience working in Agile environments and contributing to iterative development cycles.
+ Databricks cloud certification is a big plus.
+ Exposure to enterprise data governance and metadata management practices.
**Physical Demands**
+ This role involves mostly sedentary work, with occasional movement around the office to attend meetings, etc.
+ Ability to perform repetitive tasks on a computer, using a mouse, keyboard, and monitor.
**Reasonable accommodation statement**
If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to or (888) 824 - 8111.

Cloud Data Engineer

Chennai, Tamil Nadu Giggso

Posted 4 days ago


Job Description

Key Responsibilities:
• Design, develop, and maintain cloud-based solutions on Azure or AWS.
• Implement and manage real-time data streaming and messaging systems using Kafka.
• Develop scalable applications and services using Java and Python.
• Deploy, manage, and monitor containerized applications using Kubernetes.
• Build and optimize big data processing pipelines using Databricks.
• Manage and maintain databases, including SQL Server and Snowflake, and write complex SQL scripts.
• Work with Unix/Linux commands to manage and monitor system operations.
• Collaborate with cross-functional teams to ensure seamless integration of cloud-based solutions.

Key Skills:
• Expertise in Azure or AWS cloud platforms.
• Proficiency in Kafka, Java, Python, and Kubernetes.
• Hands-on experience with Databricks for big data processing.
• Strong database management skills with SQL Server, Snowflake, and advanced SQL scripting.
• Solid understanding of Unix/Linux commands.

General Requirements:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
• 5+ years of experience in cloud and data engineering roles.
• Strong problem-solving and analytical skills.
• Excellent communication and collaboration abilities.
• Proven ability to work in a fast-paced, agile environment.
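The Kafka responsibility above rests on keyed partitioning: records with the same key always land on the same partition, which is what preserves per-key ordering in a topic. The dependency-free sketch below illustrates the idea; note that real Kafka producers use a murmur2 hash, so crc32 here is a stand-in, and `NUM_PARTITIONS` and the record shapes are invented for the example.

```python
import json
import zlib

NUM_PARTITIONS = 6  # invented topic configuration for the example

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a record key to a partition, mimicking a keyed producer.

    crc32 stands in for Kafka's murmur2 hash so the sketch stays
    dependency-free; the same-key-same-partition property still holds.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

def serialize(key: str, value: dict) -> tuple:
    """Choose a partition and JSON-encode the payload, as a producer would."""
    return partition_for(key), json.dumps(value).encode("utf-8")

# Every event for device "sensor-42" goes to the same partition,
# so its readings stay ordered for the consumer of that partition.
p1, _ = serialize("sensor-42", {"temp": 21.5})
p2, _ = serialize("sensor-42", {"temp": 21.7})
assert p1 == p2
```

With a real client library (e.g. confluent-kafka or kafka-python), the same guarantee comes from simply passing `key=` when producing; the sketch only shows why that parameter matters.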

This advertiser has chosen not to accept applicants from your region.

Cloud Data Architect

Bengaluru, Karnataka SJ Group

Posted 4 days ago


Job Description

Job Description: Cloud Data Architect


Job Overview


We are seeking a highly skilled and motivated Data Architect with 5–7 years of experience in data architecture, modeling, and enterprise data solutions. The ideal candidate will be familiar with Microsoft Fabric or equivalent modern data platforms, and possess hands-on expertise in designing scalable data models and systems. Prior consulting experience and a strong understanding of data integration, governance, and analytics design are highly desirable.


Key Responsibilities


- Design, implement, and maintain enterprise data architectures for analytics and operational use.

- Develop and optimize conceptual, logical, and physical data models that support business reporting and analytics needs.

- Lead end-to-end data solution designs using Microsoft Fabric, Azure Synapse, or similar platforms.

- Translate business and functional requirements into technical data architecture specifications.

- Collaborate with stakeholders, data engineers, and business teams to design scalable, secure, and high-performance data environments.

- Provide data integration strategies for structured data across multiple systems.

- Ensure adherence to data governance, data quality, and security standards in all architecture designs.

- Participate in architecture reviews, technical design sessions, and consulting engagements with internal or external clients.

- Support modernization efforts by guiding migration to cloud-native data platforms.


Required Qualifications


- 5–7 years in data architecture, data engineering, or analytics solution delivery.

- Hands-on experience with Microsoft Fabric, Azure Data Services (Data Lake, Synapse, Data Factory), or equivalent platforms such as Snowflake or Databricks.

- Strong data modeling (dimensional and normalized), metadata management, and data lineage understanding.

- Proficiency in Python or R, SQL, DAX, Power BI, and data pipeline tools.

- Solid understanding of modern data warehouse/lakehouse architectures and enterprise data integration patterns.

- Ability to manage stakeholder expectations, translate business needs into technical requirements, and deliver value through data solutions.

- Strong communication and documentation skills.

- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.
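To make the dimensional-modeling requirement concrete, here is a minimal star-schema sketch with invented fact and dimension names, built and queried with Python's built-in sqlite3 so it runs anywhere; the same shape applies in Synapse, Fabric, or Databricks SQL, and the join-and-aggregate query is the pattern BI tools such as Power BI generate against such a model.

```python
import sqlite3

# Minimal star schema (names invented for illustration): one fact table
# keyed to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (20240101, 2024), (20250101, 2025);
    INSERT INTO dim_product VALUES (1, 'Gadget'), (2, 'Widget');
    INSERT INTO fact_sales  VALUES (20240101, 1, 100.0),
                                   (20250101, 1, 150.0),
                                   (20250101, 2, 75.0);
""")

# The canonical star-schema query: join the fact to its dimensions,
# then aggregate by the dimension attributes.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)
# -> [(2024, 'Gadget', 100.0), (2025, 'Gadget', 150.0), (2025, 'Widget', 75.0)]
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what keeps such models both query-friendly and scalable, which is the core of the conceptual/logical/physical modeling work described above.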


Preferred Skills


- Experience with tools such as Microsoft Fabric, Azure Data Factory, Databricks, or similar platforms.

- Knowledge of Python or R for data manipulation or ML integration.

- Familiarity with DevOps or CI/CD for BI deployments.

- Exposure to data governance tools (e.g., Purview) and practices.

- Experience working with Agile/Scrum methodologies.
