GCP Architect

Posted 1 day ago
Job Description
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a GCP Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).
**GCP - Architect**
_Job Description_
**Role Summary**
As a **GCP Architect**, you will lead the design and implementation of scalable, secure, and high-performance data solutions using Google Cloud Platform. You will define architectural standards, guide engineering teams, and collaborate with stakeholders to align data strategies with business goals. Your role will focus on leveraging Google Cloud's analytics capabilities to build enterprise-grade data platforms that support advanced analytics, machine learning, and real-time data processing.
**Key Responsibilities:**
+ Design and Implementation: Develop and manage scalable cloud solutions using Google Cloud Platform (GCP) services, ensuring they meet performance and security standards.
+ Define and own the end-to-end architecture of data platforms built on Google Cloud, including ingestion, transformation, storage, and consumption layers.
+ Collaboration: Work closely with cross-functional teams, including developers, DevOps engineers, and business stakeholders, to align cloud solutions with business goals.
+ Assessment and Optimization: Evaluate existing systems, identify areas for improvement, and lead the migration of on-premises applications to the cloud.
+ Provide technical guidance, assistance and coaching on GCP best practices to engineering teams.
+ Develop and maintain documentation for cloud infrastructure, processes, and procedures.
+ Keep abreast of the latest GCP features and industry trends to continuously improve cloud architecture.
+ Proven expertise in implementing data governance and metadata management across workspaces and cloud environments.
+ Collaborate with data scientists, engineers, and business stakeholders to support ML model lifecycle management using MLflow.
+ Demonstrated experience or familiarity with integrating **AI systems** into data platforms.
+ Provide technical leadership in performance tuning, cost optimization, and cluster configuration.
+ Conduct architectural reviews, code audits, and mentoring sessions to ensure adherence to standards and scalability.
+ Stay current with GCP innovations and advocate for adoption of new features and capabilities.
**Skills / Qualifications:**
+ Bachelor's or master's degree in Computer Science, Software Engineering, Information Technology, or a related field is required.
+ 10+ years of experience in data architecture and engineering, with 5+ years in Google Cloud Platform.
+ Strong understanding of Lakehouse architecture, data mesh, and modern data stack principles.
+ Proven ability to design and implement secure, governed, and highly available data platforms.
+ Familiarity with cloud platforms (Azure, AWS) and their integration with GCP.
+ Experience with data modeling, dimensional modeling, and temporal data structures.
+ An understanding of E-R data models (conceptual, logical, and physical).
+ Understanding of advanced data warehouse concepts is required.
+ Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
+ Strong communication skills, both verbal and written. Capable of collaborating effectively across a variety of IT and business groups and regional roles, and able to interact effectively at all levels.
+ Strong problem-solving skills. Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.
#GenAINTT
**About NTT DATA**
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form.
_NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here._
GCP Architect
Posted 1 day ago
Job Description
Position: GCP Architect
Location: Chennai / Bangalore
Duration: Long term contract
Please share suitable GCP Architect profiles, refer below JD:
We need a GCP Infra + DevOps (C1) Architect.
· 10+ Years of experience
· GKE experience with design, implementation and support aspects.
· Experience with deploying and troubleshooting L7 and L4 gateways
· Experience with TLS, mTLS, and certificates.
· Troubleshoot network issues with GCP and GKE.
· Experience with OTEL, Keycloak, Splunk would be good.
· Hands-on experience with Infrastructure-as-Code technologies: Terraform and Helm
· DevOps – Experience with GitHub Actions and workflows
Preferred Qualifications:
· GCP Professional Cloud Architect certification
· Experience in cloud migration
· Working knowledge of DevOps and automation tools
GCP Developer
Posted 1 day ago
Job Description
About the Company
TCS
About the Role
GCP developer
Responsibilities
Required skills: GCP, Docker, Kubernetes, Git, Kafka, Java, Spring Boot, Cloud SQL
- Must have: Java 8 and above, Spring Boot microservices architecture, Spring Batch, RESTful web services
- Strong SQL and JDBC connection strings - Oracle and PostgreSQL DBs
- Jenkins & Maven
- Jira, Confluence
- Experience with the GCP cloud platform
NP: 30 days
Exp: 8 - 10 years
GCP Architect
Posted 1 day ago
Job Description
Greetings from TATA Consultancy Services!
Thank you for expressing your interest in exploring a career possibility with the TCS Family.
Hiring For: GCP Architect.
Location: Bangalore, Hyderabad, Pune, Kochi, Chennai.
Experience: 10 - 12 Years.
Mode Of Interview: Virtual Interview
Must Have:
- 5+ years of experience in cloud architecture and infrastructure design.
- 3+ years of hands-on experience specifically with GCP (Google Cloud Platform).
- Deep understanding of core GCP services: Compute Engine, GKE, Cloud Run, Cloud Functions, Cloud Storage, BigQuery, Pub/Sub, Cloud Spanner, etc.
- Strong knowledge of cloud networking (VPC, Cloud NAT, VPNs, Interconnect), IAM, and security best practices.
- Experience with IaC (Terraform, Deployment Manager), CI/CD tools (Jenkins, GitLab CI, Cloud Build).
- Knowledge of data engineering pipelines on GCP (Dataflow, Dataproc, BigQuery).
- Excellent communication, presentation, and stakeholder management skills.
- Google Professional Cloud Architect (Highly Preferred)
Good To Have:
- Experience with hybrid and multi-cloud environments (Oracle Cloud, AWS, Azure).
- Familiarity with compliance frameworks (HIPAA, SOC 2, GDPR, etc.)
- Google Professional Cloud Security Engineer Certification
- Google Professional DevOps Engineer Certification
- Familiarity with F&A domain data and logistics
Talent Acquisition Group
Tata Consultancy Services
GCP Data Engineer
Posted 1 day ago
Job Description
About the Company:
Brillio is one of the fastest growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into a competitive advantage through innovative digital adoption. Brillio, renowned for its world-class professionals, referred to as "Brillians", distinguishes itself through their capacity to seamlessly integrate cutting-edge digital and design thinking skills with an unwavering dedication to client satisfaction.
Brillio takes pride in its status as an employer of choice, consistently attracting the most exceptional and talented individuals due to its unwavering emphasis on contemporary, groundbreaking technologies, and exclusive digital projects. Brillio's relentless commitment to providing an exceptional experience to its Brillians and nurturing their full potential consistently garners them the Great Place to Work® certification year after year.
About the Role:
We’re hiring a Senior GCP Data Engineer to design and build scalable data solutions using Google Cloud’s advanced tools. This role demands hands-on expertise in BigQuery, Dataflow, Airflow, and Python, with a strong foundation in SQL and large-scale data architecture. You’ll work on high-impact analytics platforms, collaborating across teams to deliver clean, secure, and efficient data pipelines.
Required Skills:
- 4+ years of experience in the data engineering field is preferred
- 3+ years of hands-on experience with the GCP cloud data implementation suite, such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage
- Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms
- Strong hands-on experience in the technologies below:
1. BigQuery (GBQ)
2. Python
3. Apache Airflow
4. SQL (BigQuery preferred)
- Extensive hands-on experience working with data using SQL and Python
- Cloud Functions; comparable skills in AWS and other cloud Big Data engineering stacks are also considered
- Experience with agile development methodologies
- Excellent verbal and written communications skills with the ability to clearly present ideas, concepts, and solutions
Qualifications
- Bachelor's Degree in Computer Science, Information Technology, or closely related discipline
Responsibilities:
- Design, build, and maintain scalable data pipelines using GCP tools such as BigQuery, Dataflow (Apache Beam), Pub/Sub, and Airflow/Composer
- Architect and operationalize large-scale data warehouses, data lakes, and analytics platforms
- Write efficient, production-grade code in Python and SQL for data transformation and analysis
- Implement and manage Cloud Functions and other GCP-native services for automation and integration
- Collaborate with cross-functional teams to understand data requirements and deliver robust solutions
- Ensure data quality, security, and performance across all stages of the pipeline
- Participate in Agile ceremonies and contribute to iterative development cycles
- Communicate technical concepts clearly to stakeholders through documentation and presentations
Preferred Skills
- Comparable skills in AWS and other cloud Big Data engineering stacks are considered
To apply, click here:
GCP Data Engineer
Posted 1 day ago
Job Description
We are seeking an experienced GCP Data Engineer with 6+ years of experience to design, develop, and manage cloud-based data solutions on Google Cloud Platform (GCP). The ideal candidate will have expertise in BigQuery, Dataflow, Pub/Sub, Cloud Composer (Apache Airflow), and Terraform, along with strong experience in ETL/ELT pipelines, data modeling, and performance optimization.
Experience: 6-14 years
Locations: Bangalore, Mumbai, Pune, Chennai, Gurgaon, Noida
Key Responsibilities:
- Design & Implement Data Pipelines: Develop and optimize ETL/ELT pipelines using Dataflow, BigQuery, and Cloud Composer (Airflow).
- Data Integration: Work with structured and unstructured data sources, integrating data from on-premise and cloud-based systems.
- Data Warehousing & Modeling: Design high-performance data models in BigQuery, ensuring scalability and cost efficiency.
- Automation & Infrastructure as Code (IaC): Implement Terraform for provisioning GCP resources and automate deployments.
- Streaming & Batch Processing: Work with Pub/Sub, Dataflow (Apache Beam), and Kafka for real-time and batch data processing.
Required Skills & Qualifications:
- Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 6+ years of experience in data engineering, cloud data solutions, and pipeline development.
- GCP Expertise: Hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow), Vertex AI, and IAM Policies.
- Programming: Proficiency in Python, SQL, and Apache Beam (Java or Scala is a plus).
GCP Data Engineer
Posted 1 day ago
Job Description
Greetings from TCS Human Resources Team!
TCS is hiring for Pune, Bangalore, Delhi NCR location.
Skill: GCP services (Dataproc, BigQuery, Composer), PySpark
Years of experience: 6 to 10 Years
Notice Period: Immediate Joiners only
Roles and Responsibilities:
- Good knowledge of Google Cloud Platform components: BigQuery, GCS buckets, Composer, Dataproc
- 5+ years of experience developing ETL process, performing data aggregation and data analysis
- Proficiency in SQL and strong knowledge / practice in PySpark
- Business problem solving, critical thinking and information analysis
- Able to perform Unit testing, SIT and UAT
Interested professionals send your updated CV & the below details to
Full Name:
Email:
Contact Details:
Current Location:
Total Experience:
Current CTC:
Expected CTC:
Notice Period:
Current Company Name:
Education or career gap:
Reason for gap:
Highest Education qualification:
Highest Qualification Fulltime (Y/N):
University Name:
EP Reference Number (if already registered with TCS) :
GCP & GKE Engineer
Posted 1 day ago
Job Description
GCP + GKE + DevOps Engineer – Bangalore or Gurugram (Hybrid)
We are seeking a hands-on GCP + GKE + DevOps Engineer to join a major enterprise engineering program. This is a design and delivery–focused role requiring deep expertise across Google Cloud, Kubernetes, and modern CI/CD automation frameworks.
You will be responsible for building, deploying, and maintaining scalable infrastructure and DevOps pipelines in a cloud-native environment, collaborating with cross-functional engineering and platform teams.
Skills & Experience:
- Strong, hands-on experience with Google Cloud Platform (GCP)
- Expertise in Kubernetes and Google Kubernetes Engine (GKE)
- Proven experience implementing CI/CD pipelines using Jenkins, ArgoCD, HashiCorp tooling, and GitHub Actions (GHA)
- Solid understanding of containerization, orchestration, and infrastructure as code
- Experience designing and maintaining cloud-native applications and DevOps automation frameworks
- Familiarity with monitoring, logging, and security best practices in GCP
- Ability to troubleshoot complex deployment and scaling issues in distributed environments
- Strong collaboration skills, working within Agile and cross-functional engineering teams
General Responsibilities:
- Design, implement, and maintain cloud infrastructure on GCP and GKE
- Automate deployments, configuration, and scaling using DevOps tools
- Collaborate with engineering teams to streamline CI/CD workflows and delivery processes
- Drive best practices for reliability, scalability, and performance in a production environment
Interviews are taking place immediately — apply now to join a leading enterprise project.
GCP Data Engineer
Posted 1 day ago
Job Description
About the Organization-
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth.
Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad with over 3000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.
Locations- Bengaluru, & Gurgaon
Job Summary:
We are seeking experienced Data Engineering professionals with 4–6 years of hands-on expertise in Big Data technologies, with a focus on building scalable data solutions using Google Cloud Platform (GCP).
Key Skills & Experience:
- Proven expertise in PySpark (DataFrame and SparkSQL), Hadoop, and Hive
- Strong programming skills in Python and Bash
- Solid understanding of SQL and data warehousing concepts
- Demonstrated analytical and problem-solving abilities, particularly in data analysis and troubleshooting
- Innovative thinker with a passion for building efficient, scalable data solutions
- Excellent verbal and written communication skills, with the ability to work collaboratively across teams
Preferred/Good to Have:
- Experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and IAM
- Familiarity with Airflow or other orchestration tools
- Exposure to cloud migration projects (on-prem to GCP or cloud-to-cloud)
Roles & Responsibilities:
- Design and develop robust and scalable ETL pipelines on GCP to meet business needs
- Ensure code quality and performance by adhering to development best practices and standards
- Perform integration testing and troubleshoot pipeline issues across environments
- Estimate efforts for development, testing, and deployment tasks
- Participate in code reviews and provide feedback to maintain high development standards
- Where applicable, design cost-effective data pipelines leveraging GCP-native services
For Quick Response- Interested Candidates can directly share their resume along with the details like Notice Period, Current CTC and Expected CTC at