1,558 Data Cloud jobs in India
Job No Longer Available
This position is no longer listed on WhatJobs. The employer may be reviewing applications, filled the role, or has removed the listing.
However, we have similar jobs available for you below.
Data Cloud Architect
Posted 20 days ago
Job Description
People Tech Group is a global technology and consulting leader with offices across the U.S., India, Canada, and the Middle East, serving clients in more than 10 countries.
Our organization operates through key business units:
- People Tech Technologies – Enterprise IT, Cloud, Data, ERP, and Salesforce
- People Media – Digital content and marketing solutions
- People Infra – Infrastructure, IoT, and managed services
- People Enterprises – Business consulting and enterprise transformation
With 20+ global data centers, we deliver secure, high-performance, and scalable technology solutions to top-tier clients.
In a recent development, the company has been acquired by Quest Global, one of the world's largest engineering solutions providers, with more than 20,000 employees, 70+ global delivery centers, and headquarters in Singapore. Going forward, we are all part of Quest Global.
---
Job Description:
Experience: 10+ years
Notice Period: Immediate to 15 days
Location: Bangalore
We are looking for a hands-on AWS Data Warehouse Migration Architect to lead the modernization of an on-premises Oracle-based legacy data warehouse system to AWS GovCloud. This critical role will ensure full adherence to security, compliance, and performance standards, leveraging AWS-native services and secure cloud design patterns.
Required Skills and Experience:
- Strong experience in enterprise data warehouse architecture and implementation.
- Strong experience with Oracle databases, PL/SQL, SQL, partitioning, and tuning.
- Deep hands-on experience with AWS data services: Redshift, Glue, DMS, Athena, S3, and IAM.
- Direct experience with AWS GovCloud and knowledge of its compliance posture is a plus.
- Proven background in FedRAMP, NIST 800-53, and ITAR/DFARS regulatory compliance.
- Strong skills in Terraform or CloudFormation for infrastructure as code.
- Experience with CI/CD pipelines for data and automated validation frameworks.
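As an illustrative sketch of one concern an Oracle-to-Redshift migration architect handles, translating legacy DDL column types into Redshift equivalents, here is a minimal pure-Python helper. The mapping below is a hypothetical subset for illustration only; real migrations (e.g. via AWS SCT or DMS) cover far more types and edge cases.

```python
# Hypothetical subset of an Oracle -> Redshift type mapping used when
# translating legacy DDL during a warehouse migration.
ORACLE_TO_REDSHIFT = {
    "NUMBER": "NUMERIC(38, 0)",
    "VARCHAR2": "VARCHAR",
    "DATE": "TIMESTAMP",
    "CLOB": "VARCHAR(65535)",
}

def translate_column(name, oracle_type, length=None):
    """Render one Redshift column definition from an Oracle column spec."""
    target = ORACLE_TO_REDSHIFT.get(oracle_type.upper())
    if target is None:
        raise ValueError(f"no mapping for Oracle type {oracle_type!r}")
    if target == "VARCHAR" and length:
        target = f"VARCHAR({length})"
    return f"{name} {target}"

print(translate_column("customer_name", "VARCHAR2", 100))
# customer_name VARCHAR(100)
```

In practice such a table would be generated from the Oracle data dictionary and validated column-by-column as part of the automated validation frameworks the role calls for.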
Data Cloud Engineer
Posted today
Job Description
Data Cloud SME
Responsible for setting up Data Cloud; knows data ingestion techniques, data harmonization, and integration with other systems; knows how to perform segmentation and derive analytics from it.
Experience with CDPs, preferably Salesforce CDP, and proven experience working on 4-5 or more CDP projects.
Experience range: 6-8 years.
Proficiency with other Salesforce clouds is beneficial.
Proficiency with data lakes, SQL techniques, data transformation, and data governance is a must.
Sr. Data Cloud Architect
Posted 2 days ago
Job Description
We are seeking an experienced Data Architect to design and develop a scalable, secure, and efficient data platform. The successful candidate will have a strong background in cloud computing and in software architecture and implementation, with a passion for technology innovation and cross-functional collaboration. The Data Architect will be responsible for designing and implementing the data architecture platform, ensuring seamless integration with our existing systems, and providing technical leadership to the development team.
GE Healthcare is a leading global medical technology and digital solutions innovator. Our mission is to improve lives in the moments that matter. Unlock your ambition, turn ideas into world-changing realities, and join an organization where every voice makes a difference, and every difference builds a healthier world.
**Job Description**
Roles and Responsibilities
**In this role, you will:**
+ In-depth knowledge and hands-on experience using Python (PySpark) with real-time data streaming technologies: Kafka, Spark, Lambda
+ Experience building data processing pipelines on AWS using AWS MSK, EMR (Spark Streaming), DynamoDB, Lambda, Glue, and Athena
+ Knowledge of device data ingestion and processing using AWS IoT Core, IoT rules, and EventBridge
+ Design, implement, and optimize Kafka- and Spark-based near-real-time (NRT) data processing pipelines
+ Expertise in building reusable, cloud-native, scalable, and reliable frameworks and tools
+ Design and implement reusable, cost-effective solutions that meet functional and non-functional requirements such as availability, latency, and fault tolerance
**Architectural & Design Skills**
+ Designing scalable data pipelines for IoT telemetry
+ Real-time vs batch processing architecture
+ Data governance, security, and compliance
+ Cost optimization strategies on AWS
**Technical Skill Set**
**Cloud & Infrastructure (AWS)**
+ Amazon EMR - for big data processing using Spark/Hadoop
+ AWS Lambda, Step Functions - for serverless workflows
+ S3, DynamoDB, RDS - for data storage and management
+ IAM, KMS, CloudWatch, CloudTrail - for security and monitoring
+ AWS IoT Core - for IoT device integration
**Big Data & Analytics**
+ Apache Spark & PySpark - for distributed data processing
+ Data ingestion using Kinesis, Kafka, or AWS IoT Analytics
+ ETL pipeline design and optimization
+ Data lake architecture using S3 + Glue + Athena
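To illustrate the data lake layout the list above refers to, here is a minimal sketch of Hive-style partition keys as used with S3 + Glue + Athena. The bucket name, table name, and file name are hypothetical; the point is the `year=/month=/day=` key structure that lets Glue register partitions and Athena prune them at scan time.

```python
from datetime import datetime

def partition_key(bucket, table, event_time, filename):
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so that
    Glue can register the partitions and Athena can prune them on scan."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={event_time.year:04d}/month={event_time.month:02d}/day={event_time.day:02d}/"
        f"{filename}"
    )

key = partition_key("example-telemetry", "device_events",
                    datetime(2024, 7, 9), "part-0001.parquet")
print(key)
# s3://example-telemetry/device_events/year=2024/month=07/day=09/part-0001.parquet
```

Choosing partition columns that match common query predicates (typically event date for IoT telemetry) is what keeps Athena scan costs low, one of the AWS cost-optimization strategies the role mentions.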
**Programming & Scripting**
+ Python - core language for scripting, automation, and data processing
+ Boto3 - AWS SDK for Python
+ SQL - for querying structured data
+ Shell scripting - for automation on EMR or EC2
**Education Qualification**
Bachelor's degree in engineering with a minimum of 5 years of experience in relevant technologies.
**Desired Characteristics**
**Technical Expertise:**
+ Excellent knowledge of software design and coding principles
+ Experience working in an Agile environment
+ Familiarity with versatile implementation options
+ Demonstrates knowledge of technical topics such as caching, APIs, data transfer, scalability, and security
+ Experience in building and managing big data solutions, Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions on the Cloud (AWS)
+ Experience in architecting and implementing data mesh and data fabric solutions specifically leveraging AWS services, including designing domain-oriented data architectures, data products, and data access patterns in a multi-tenant environment.
+ Expertise in API integration, Subscription based APIs, Multi tenancy, enabling efficient data exchange and synchronization between various applications and platforms.
+ Familiarity with advanced data management principles and best practices within AWS environments, including data as a service, data modelling, data lineage, data cataloguing, and metadata management.
+ Develop and maintain data models, schemas, and databases while ensuring high performance, security, and reliability in a global context.
+ Expertise in data modelling, database design principles, and best practices for data management within a global context.
**Business Acumen:**
+ Demonstrates the initiative to explore alternate technology and approaches to solving problems
+ Skilled in breaking down problems, documenting problem statements and estimating efforts
+ Has the ability to analyze impact of technology choices
+ Skilled in negotiation to align stakeholders and communicate a single synthesized perspective to the scrum team. Balances value propositions for competing stakeholders.
+ Demonstrates knowledge of the competitive environment
+ Demonstrates knowledge of technologies in the market to help make buy vs build recommendations, scope MVPs, and to drive market timing decisions
**Leadership:**
+ Influences through others; builds direct and "behind the scenes" support for ideas. Pre-emptively sees downstream consequences and effectively tailors influencing strategy to support a positive outcome.
+ Able to verbalize what is behind decisions and downstream implications. Continuously reflecting on success and failures to improve performance and decision-making.
+ Understands when change is needed. Participates in technical strategy planning.
**Personal Attributes:**
+ Able to effectively direct and mentor others in critical thinking skills.
+ Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking and analysis skills and best practices.
+ Finds important patterns in seemingly unrelated information.
+ Influences and energizes others toward the common vision and goal.
+ Maintains enthusiasm for a process and drives toward new ways of meeting the goal, even when odds and setbacks render one path impassable.
+ Innovates and integrates new processes and/or technology to significantly add value to GE Healthcare.
+ Identifies how the cost of change weighs against the benefits and advises accordingly.
+ Proactively learns new solutions and processes to address seemingly unanswerable problems.
**Inclusion and Diversity**
GE Healthcare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus, and drive ownership - always with unyielding integrity.
Our **total rewards** are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you'd expect from an organization with global strength and scale, and you'll be surrounded by career opportunities in a culture that fosters care, collaboration and support.
#LI-RS1
#Hybrid
**Additional Information**
**Relocation Assistance Provided:** No
Technical Architect - Data & Cloud
Posted today
Job Description
Role: Technical Architect - Big Data & Cloud
Location: Indore, Bangalore, Noida, Gurgaon, Pune, Hyderabad
Job Description:
- BTech degree in computer science, engineering, or a related field of study, or 12+ years of related work experience
- 7+ years of design and implementation experience with large-scale, data-centric distributed applications
- Professional experience architecting and operating cloud-based solutions, with a good understanding of core disciplines such as compute, networking, storage, security, and databases
- Good understanding of data engineering concepts such as storage, governance, cataloging, data quality, and data modeling
- Good understanding of architecture patterns such as data lake, data lakehouse, and data mesh
- Good understanding of data warehousing concepts, with hands-on experience in tools such as Hive, Redshift, Snowflake, and Teradata
- Experience migrating or transforming legacy customer solutions to the cloud
- Experience working with services such as AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, and DataZone
- Thorough understanding of Big Data ecosystem technologies such as Hadoop, Spark, Hive, and HBase, along with other related tools and technologies
- Experience designing analytical solutions leveraging AWS cognitive services such as Textract, Comprehend, and Rekognition, in combination with SageMaker, is good to have
- Experience working with modern development workflows such as Git, continuous integration/continuous deployment pipelines, static code analysis tooling, and infrastructure as code
- Experience with a programming or scripting language: Python, Java, or Scala
- AWS Professional/Specialty certification or equivalent cloud expertise
Roles & Responsibilities
- Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries.
- Capable of leading a technology team, instilling an innovative mindset, and enabling fast-paced delivery.
- Able to adapt to new technologies, learn quickly, and manage high ambiguity.
- Ability to work with business stakeholders and to attend or drive architectural, design, and status calls with multiple stakeholders.
- Exhibit good presentation skills, with a high degree of comfort speaking with executives, IT management, and developers.
- Drive technology/software sales or pre-sales consulting discussions.
- Ensure end-to-end ownership of all assigned tasks.
- Ensure high-quality software development with complete documentation and traceability.
- Fulfil organizational responsibilities, such as sharing knowledge and experience with other teams and groups.
- Conduct technical trainings/sessions and write whitepapers, case studies, blogs, etc.
Big Data Cloud Engineer
Posted 20 days ago
Job Description
Role: Big Data Engineer
Required Experience: 3 – 6 years
Job Location: Hyderabad
Job Description:
Must-have skills: Spark, Python/Scala, AWS/Azure, Snowflake, Databricks, SQL Server/NoSQL.
Key Responsibilities:
- Design and implement data pipelines for batch and real-time data processing.
- Optimize data storage solutions for efficiency and scalability.
- Collaborate with analysts and business teams to meet data requirements.
- Monitor data pipeline performance and troubleshoot issues.
- Ensure compliance with data security and privacy policies.
Skills Required:
- Proficiency in Python, SQL, and ETL frameworks.
- Experience with big data tools (Spark, Hadoop).
- Strong knowledge of cloud services and databases.
- Familiarity with data modeling and warehousing concepts.
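As a minimal sketch of the first responsibility above, a batch pipeline stage that validates and transforms records, here is a plain-Python version. The field names and rules are hypothetical; in practice this logic would run inside Spark (e.g. as a DataFrame transformation on Databricks) rather than a Python loop.

```python
def transform_batch(records):
    """One pipeline stage: drop records missing required fields,
    normalize the ID, and add a derived field."""
    out = []
    for rec in records:
        if not rec.get("id") or rec.get("amount") is None:
            continue  # data-quality gate: skip incomplete records
        amount = float(rec["amount"])
        out.append({
            "id": str(rec["id"]).strip().lower(),
            "amount": amount,
            "is_large": amount >= 1000.0,  # derived flag for downstream filters
        })
    return out

rows = [{"id": " A17 ", "amount": "1500"}, {"id": None, "amount": 3}]
print(transform_batch(rows))
```

The same shape (validate, normalize, derive) carries over directly to a Spark job, where each step becomes a `filter` or `withColumn` call and the engine handles distribution and scale.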