5,682 GCP jobs in India
Cloud Infrastructure Engineer – GCP
Posted today
Job Description
Key Responsibilities:
• Set up and manage GCP resources (VMs, buckets, VPCs, IAM)
• Automate provisioning using Terraform or Deployment Manager
• Manage cloud monitoring, alerting, and logging (Stackdriver)
• Implement GCP backup, patching, and lifecycle management
• Review and optimize billing and cost usage
• Configure monitoring and alerting through Terraform; create monitoring dashboards
• HA and DR setup and implementation
• Work on defect fixes during UAT across workstreams
• Document implementation steps/processes and "How to" articles
Skills Required:
• Google Professional Cloud Engineer certification preferred
• Proficient with gcloud CLI, Terraform, Bash scripting
• Familiarity with GCP security, audit logging, firewall rules
• Understanding of hybrid connectivity (Cloud VPN, Interconnect)
• Expertise in working with Terraform
• Understanding of networking concepts like IP segments, DNS
• Python and PowerShell scripting skills
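A minimal Terraform sketch of the provisioning and alerting duties listed above (the project ID, resource names, and CPU threshold are illustrative assumptions, not from the posting):

```hcl
# Hypothetical sketch: project, names, and thresholds are assumptions.
provider "google" {
  project = "my-example-project"
  region  = "asia-south1"
}

resource "google_compute_instance" "app_vm" {
  name         = "app-vm-1"
  machine_type = "e2-medium"
  zone         = "asia-south1-a"

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-12"
    }
  }

  network_interface {
    network = "default"
  }
}

# Alerting configured through Terraform, as the responsibilities describe.
resource "google_monitoring_alert_policy" "cpu_high" {
  display_name = "VM CPU above 80%"
  combiner     = "OR"
  conditions {
    display_name = "CPU utilization"
    condition_threshold {
      filter          = "metric.type=\"compute.googleapis.com/instance/cpu/utilization\" AND resource.type=\"gce_instance\""
      comparison      = "COMPARISON_GT"
      threshold_value = 0.8
      duration        = "300s"
    }
  }
}
```

In practice the alert policy would also reference a notification channel; it is omitted here for brevity.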
Lead/Architect Cloud Infrastructure - GCP
Posted today
Job Description
Lead/Architect Cloud Infrastructure
Job Summary:
We are seeking a highly skilled and experienced Lead Infrastructure Engineer to join our dynamic team. The ideal candidate will be passionate about building and maintaining complex systems, with a holistic approach to architecture. You will play a key role in designing, implementing, and managing cloud infrastructure, ensuring scalability, availability, security, and optimal performance. You will also provide technical leadership and mentorship to other engineers, and engage with clients to understand their needs and deliver effective solutions.
Responsibilities:
- Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Google Cloud Platform (GCP) and/or Amazon Web Services (AWS).
- Develop and maintain Infrastructure as Code (IaC) using Terraform for enterprise-scale deployments.
- Utilize Kubernetes deployment tools such as Helm/Kustomize for container orchestration and management.
- Design and implement CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling deployments, canaries, and blue/green deployments.
- Ensure auditability and observability of pipeline states.
- Implement security best practices, audit, and compliance requirements within the infrastructure.
- Provide technical leadership, mentorship, and training to engineering staff.
- Engage with clients to understand their technical and business requirements, and provide tailored solutions.
- If needed, lead agile ceremonies and project planning, including developing agile boards and backlogs.
- Troubleshoot and resolve complex infrastructure issues.
- Potentially participate in pre-sales activities and provide technical expertise to sales teams.
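The blue/green strategy named in the responsibilities can be sketched with Kubernetes manifests. This is a hedged illustration; the image path and labels are hypothetical:

```yaml
# Blue/green cutover sketch: two Deployments, one Service.
# Switching the Service selector's "track" label flips traffic atomically.
apiVersion: v1
kind: Service
metadata:
  name: web
spec:
  selector:
    app: web
    track: blue        # change to "green" to cut over
  ports:
    - port: 80
      targetPort: 8080
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-green
spec:
  replicas: 3
  selector:
    matchLabels: {app: web, track: green}
  template:
    metadata:
      labels: {app: web, track: green}
    spec:
      containers:
        - name: web
          image: gcr.io/my-example-project/web:v2   # assumed image
          ports: [{containerPort: 8080}]
```

Because cutover is a single Service-selector edit, rollback is equally atomic.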
Qualifications:
- 10+ years of experience in an Infrastructure Engineer or similar role.
- Extensive experience with Google Cloud Platform (GCP) and/or Amazon Web Services (AWS).
- Proven ability to architect for scale, availability, and high-performance workloads.
- Deep knowledge of Infrastructure as Code (IaC) with Terraform.
- Strong experience with Kubernetes and related tools (Helm, Kustomize).
- Solid understanding of CI/CD pipelines and deployment strategies.
- Experience with security, audit, and compliance best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to engage with both technical and non-technical stakeholders.
- Experience in technical leadership and mentoring.
- Experience with client relationship management and project planning.
Certifications:
- Relevant certifications (e.g., Certified Kubernetes Administrator, Google Cloud Professional Cloud Architect, Google Cloud networking and security certifications, etc.).
- Software development experience (e.g., Terraform, Python).
- Experience with machine learning infrastructure.
Education:
- Bachelor's degree in Computer Science, a related field, or equivalent experience.
GCP Engineer
Posted 4 days ago
Job Description
We are looking for a candidate with 4+ years of experience.
Skills:
- Expertise in infrastructure as code (IaC) using Hashicorp products like Terraform, Sentinel, and Vault
- Ability to write custom modules and providers for Terraform, as well as create and manage custom policies for governance and security purposes
- Understanding of networking concepts, such as VPCs, private service connections, and cross-cloud connectivity
- Knowledge of IAM controls, including custom roles and advanced IAM features like PAM
- Familiarity with logging, monitoring, cost optimization, performance, reliability, and disaster recovery
- Understanding of data warehousing concepts and the ability to support the Lumi data platform
- Knowledge of advanced compute capabilities, such as GPUs, TPUs, and their use in training and inference for AI/ML models
- Strong programming and problem-solving skills, as demonstrated through coding exercises
Job Description:
- Develop and manage infrastructure as code (IaC) using Hashicorp products like Terraform, Sentinel, and Vault
- Write custom modules and providers for Terraform, as well as create and manage custom policies for governance and security purposes
- Integrate and manage networking aspects, such as VPCs, private service connections, and cross-cloud connectivity
- Implement and manage IAM controls, including custom roles and advanced IAM features like PAM
- Oversee logging and monitoring, cost optimization, performance, reliability, and disaster recovery
- Understand data warehousing concepts and support the Lumi data platform
- Possess knowledge of advanced compute capabilities, such as GPUs, TPUs, and their use in training and inference for AI/ML models
GCP Engineer
Posted today
Job Description
We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.
In this role, you will:
- Proven experience as a Cloud Engineer or similar role, with a focus on Google Cloud Platform (GCP)
- Strong understanding of cloud architecture and services, including Compute Engine, Kubernetes Engine, Cloud Storage, BigQuery, and more.
- Set up and configure various GCP services such as Cloud SQL, Cloud Pub/Sub, and Cloud Datastore, and configure GKE clusters for multi-tenant environments.
- Design, deploy, and manage scalable, secure, and reliable cloud infrastructure on Google Cloud Platform (GCP)
- Collaborate with development teams to ensure applications are designed and optimized for cloud deployment.
- Implement and manage CI/CD pipelines using tools such as Google Cloud Build, Jenkins, or similar.
- Monitor and optimize cloud resources for cost, performance, and security.
- Automate infrastructure provisioning experience with Infrastructure as Code (IaC) tools such as Terraform, Google Cloud Deployment Manager, or similar.
Qualifications
To be successful in this role, you should meet the following requirements:
- Proficiency in scripting languages such as Python, Bash, or similar.
- Troubleshoot and resolve issues related to cloud infrastructure and services.
- Ensure compliance with security policies and best practices.
- Google Cloud certifications (e.g., Google Cloud Professional Cloud Architect) are a plus.
- Stay up to date with the latest GCP features, services, and best practices.
- Knowledge of other clouds such as AWS or Azure will be an added advantage.
GCP Engineer
Posted today
Job Description
We are looking for a candidate with 4+ years of experience.
Skills:
- Expertise in infrastructure as code (IaC) using Hashicorp products like Terraform, Sentinel, and Vault
- Ability to write custom modules and providers for Terraform, as well as create and manage custom policies for governance and security purposes
- Understanding of networking concepts, such as VPCs, private service connections, and cross-cloud connectivity
- Knowledge of IAM controls, including custom roles and advanced IAM features like PAM
- Familiarity with logging, monitoring, cost optimization, performance, reliability, and disaster recovery
- Understanding of data warehousing concepts and the ability to support the Lumi data platform
- Knowledge of advanced compute capabilities, such as GPUs, TPUs, and their use in training and inference for AI/ML models
- Strong programming and problem-solving skills, as demonstrated through coding exercises
Job Description:
- Develop and manage infrastructure as code (IaC) using Hashicorp products like Terraform, Sentinel, and Vault
- Write custom modules and providers for Terraform, as well as create and manage custom policies for governance and security purposes
- Integrate and manage networking aspects, such as VPCs, private service connections, and cross-cloud connectivity
- Implement and manage IAM controls, including custom roles and advanced IAM features like PAM
- Oversee logging and monitoring, cost optimization, performance, reliability, and disaster recovery
- Understand data warehousing concepts and support the Lumi data platform
- Possess knowledge of advanced compute capabilities, such as GPUs, TPUs, and their use in training and inference for AI/ML models
GCP Engineer
Posted today
Job Description
Job Requirements
Overview of the job
Data Engineer–Data Platforms
This role reports to the Director, India Data Platforms, P&G
About Data Platforms Team
We take pride in managing the company's most valuable asset in the digital world: data. Our vision is to deliver data as a competitive advantage for the Asia regional business by building unified data platforms, delivering customized BI tools for managers, and empowering insightful business decisions through AI in data. As a data solutions specialist, you'll work closely with business stakeholders, collaborating to understand their needs and developing solutions to problems in the areas of supply chain, sales & distribution, consumer insights, and market performance.
In this role, you'll be constantly learning, staying up to date with industry trends and emerging technologies in data solutions. You'll have the chance to work with a variety of tools and technologies, including big data platforms, machine learning frameworks, and data visualization tools, to build innovative and effective solutions.
So, if you're excited about the possibilities of data, and eager to make a real impact in the world of business, a career in data solutions might be just what you're looking for. Join us and become a part of the future of digital transformation.
About P&G IT
Digital is at the core of P&G's accelerated growth strategy. With this vision, IT in P&G is deeply embedded into every critical process across business organizations comprising 11+ category units globally creating impactful value through Transformation, Simplification & Innovation. IT in P&G is sub-divided into teams that engage strongly for revolutionizing the business processes to deliver exceptional value & growth - Digital GTM, Digital Manufacturing, Marketing Technologist, Ecommerce, Data Sciences & Analytics, Data Solutions & Engineering, Product Supply.
Responsibilities
Development of a cloud-based data and analytics platform, including integrating systems and implementing ELT/ETL jobs to fulfil business deliverables. Performing sophisticated data operations such as data orchestration, transformation, and visualization with large datasets. You will work with product managers to ensure superior product delivery to drive business value & transformation, demonstrating standard coding practices to ensure delivery excellence and reusability.
- Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments.
- Data Transformation: Implement data transformation processes, including data cleansing, normalization, and aggregation, to ensure data quality and consistency.
- Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval in Google Cloud platforms.
- Data Warehousing: Build data warehouses or data lakes using Google Cloud services such as BigQuery.
- Data Integration: Integrate data from multiple sources, both on-premises and cloud-based, using Cloud Composer or other relevant tools.
- Data Governance: Implement data governance practices, including data security, privacy, and compliance, to ensure data integrity and regulatory compliance.
- Performance Optimization: Optimize data pipelines and queries for improved performance and scalability in Google Cloud environments.
- Monitoring and Troubleshooting: Monitor data pipelines, identify and resolve performance issues, and troubleshoot data-related problems in collaboration with other teams.
- Data Visualization: Build BI reports to enable faster decision making.
- Collaboration: Work with product managers to ensure superior product delivery to drive business value & transformation
- Documentation: Document data engineering processes, data flows, and system configurations for future reference and knowledge sharing.
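As a toy illustration of the cleansing and aggregation steps described above, a self-contained Python sketch (field names and values are invented, not from any real dataset):

```python
from datetime import datetime

# Hypothetical raw records as they might arrive from a source system.
RAW_ROWS = [
    {"sku": " A-100 ", "units": "12", "sold_on": "2024-03-01"},
    {"sku": "a-100",   "units": "3",  "sold_on": "2024-03-01"},
    {"sku": "B-200",   "units": None, "sold_on": "2024-03-02"},
]

def cleanse(row):
    """Trim/normalize fields and drop rows with missing measures."""
    if row["units"] is None:
        return None  # incomplete record: excluded from the aggregate
    return {
        "sku": row["sku"].strip().upper(),
        "units": int(row["units"]),
        "sold_on": datetime.strptime(row["sold_on"], "%Y-%m-%d").date(),
    }

def aggregate(rows):
    """Sum units per (sku, day): a toy stand-in for a BigQuery GROUP BY."""
    totals = {}
    for row in filter(None, (cleanse(r) for r in rows)):
        key = (row["sku"], row["sold_on"])
        totals[key] = totals.get(key, 0) + row["units"]
    return totals

totals = aggregate(RAW_ROWS)
```

A production pipeline would express the same GROUP BY in BigQuery SQL or PySpark; only the scale differs, not the shape of the logic.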
Work Experience
Qualifications:
- Experience: Bachelor's or master's degree in computer science, data engineering, or a related field, along with 2+ years of work experience in data engineering and cloud platforms.
- Google Cloud Development: Strong proficiency in Google Cloud services such as Spanner, Cloud Composer, Looker Studio, etc.
- ETL Tools: Experience with ETL (Extract, Transform, Load) tools and frameworks, such as Spark and Cloud Composer/Airflow, for data integration and transformation.
- Programming: Proficiency in Python (including PySpark) and SQL for data manipulation, scripting, and automation.
- Data Modeling: Knowledge of data modeling techniques and experience with data modeling tools.
- Database Technologies: Familiarity with relational databases (e.g., Cloud SQL) for data storage and retrieval.
- Data Warehousing: Understanding of data warehousing concepts and dimensional modeling, and experience with data warehousing technologies such as BigQuery.
- Data Governance: Knowledge of data governance principles, data security, privacy regulations (e.g., GDPR, CCPA), and experience implementing data governance practices.
- Data Visualization: Experience working with Looker Studio to build semantic data models and BI reports/dashboards.
- Cloud Computing: Familiarity with cloud computing concepts and experience working with cloud platforms, particularly Google Cloud Platform.
- Problem-Solving: Strong analytical and problem-solving skills to identify and resolve data-related issues.
- Proficiency in DevOps and CI/CD tools (e.g., Terraform, GitHub)
- Familiarity with Azure, Databricks, and their relevant tech stacks would be an advantage in this role.
GCP Engineer
Posted today
Job Description
Req ID:
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a GCP Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).
GCP-Cloud Run
Exp- 5+ years
Notice Period- Immediate to 30 days
Job Description:
- Senior Application Developer with Google Cloud Platform experience in BigQuery, SQL, and Cloud Run.
- Need a Senior Application Developer with a GCP skill set for a project involving the re-design and re-platforming of a legacy Revenue Allocation system
- Mandatory Skills: GCP BigQuery, SQL, Cloud Run
- Desired Skills: Linux shell scripting is a huge plus; nice to have: Kafka, MQ Series, Oracle PL/SQL
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at
NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at .
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
GCP Engineer
Posted today
Job Description
Requirements
Description and Requirements
As a GCP Data Engineer, you will be responsible for managing, maintaining, and troubleshooting cloud data pipelines. The ideal candidate will have extensive experience in cloud data engineering, with in-depth knowledge of cloud platform services and data pipeline architecture, along with the ability to independently tackle problems and troubleshoot issues. Additionally, you will leverage your software engineering skills to optimize data pipelines and enhance their reliability through out-of-the-box thinking and a proactive approach.
Required skills:
- 5+ years of industry experience in the field of Data Engineering support and enhancement
- Proficient in any Cloud Platform services (GCP, Azure, AWS, etc.)
- Strong understanding of data pipeline architectures and ETL processes
- Excellent Python programming skills for data processing and automation
- Excellent SQL query-writing skills for data analysis, and experience with relational databases
- Familiarity with version control systems like Git
- Ability to analyze, troubleshoot, and resolve complex data pipeline issues
- Software engineering experience in optimizing data pipelines to improve performance and reliability
- Continuously optimize data pipeline efficiency and reduce operational costs and reduce the number of issues/failures
- Automate repetitive tasks in data processing and management
- Experience in monitoring and alerting for Data Pipelines
- Continuously improve data pipeline reliability through analysis and testing
- Perform SLA-oriented monitoring for critical pipelines, suggest improvements, and implement them for SLA adherence after business approval, if needed
- Monitor performance and reliability of data pipelines, Informatica ETL workflows, MDM and Control-M jobs.
- Maintain infrastructure reliability for data pipelines, Informatica ETL workflows, MDM and Control-M jobs.
- Conduct post-incident reviews and implement improvements for data pipelines
- Develop and maintain documentation for data pipeline systems and processes
- Experience with Data Visualization using Google Looker Studio, Tableau, Domo, Power BI, or similar tools is an added advantage
- Excellent communication and documentation skills
- Strong problem-solving and analytical skills
- Open to working in a 24x7 shift
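The SLA-oriented monitoring duty above can be sketched as a pure-Python check. Pipeline names, SLA windows, and run records here are assumptions for illustration only:

```python
from datetime import datetime, timedelta

# Illustrative pipeline SLAs (minutes between successful runs).
SLA_MINUTES = {"daily_sales_load": 60, "mdm_sync": 30}

def sla_breaches(runs, now):
    """Return names of pipelines whose latest successful run is older
    than the pipeline's SLA window: the core of an SLA-oriented check."""
    breaches = []
    for name, sla in SLA_MINUTES.items():
        last_ok = max((r["finished"] for r in runs
                       if r["pipeline"] == name and r["status"] == "success"),
                      default=None)
        if last_ok is None or now - last_ok > timedelta(minutes=sla):
            breaches.append(name)
    return sorted(breaches)

now = datetime(2024, 5, 1, 12, 0)
runs = [
    {"pipeline": "daily_sales_load", "status": "success",
     "finished": datetime(2024, 5, 1, 11, 30)},
    {"pipeline": "mdm_sync", "status": "failed",
     "finished": datetime(2024, 5, 1, 11, 55)},
]
result = sla_breaches(runs, now)  # mdm_sync has no recent success
```

A real implementation would read run metadata from Cloud Monitoring, Control-M, or the orchestrator's API rather than an in-memory list.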
Additional Job Description
Qualifications:
- Bachelor's degree in Computer Science or related technical field, or equivalent practical experience
- Holding any Cloud Professional Data Engineer certification is an added advantage
- Excellent verbal and written communication skills
EEO Statement
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.
Equal Opportunity Employer
At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants' qualifications, merits, competence and performance without regard to any characteristic related to diversity.
GCP Engineer
Posted today
Job Description
GCP Engineer
6+ years of experience
Pune/Bengaluru/Hyderabad
Looking for a workplace where people realize their full potential, are recognized for the impact they make, and enjoy the company of the peers they work with? Welcome to Zensar! Read on for more details about the role and about us.
Here's how you'll contribute:
- 5+ years of hands-on experience with web application deployment and server management.
- Proficient in server operating systems (Linux, Ubuntu, CentOS, etc.).
- Strong knowledge of web servers (Apache, Nginx) and application servers (PHP, Python, etc.).
- Proficiency with cloud platforms like AWS, Google Cloud, Azure, or DigitalOcean.
- Experience in CI/CD pipelines, automation, and deployment tools (Git, GitHub Actions, Jenkins, etc.).
- Familiarity with Docker, containers, and virtualization technologies.
- Experience with database migration (MySQL, PostgreSQL, MongoDB, etc.).
- Understanding of DNS management, SSL configuration, firewalls, and basic network security.
- Ability to troubleshoot and resolve performance or compatibility issues.
Advantage Zensar
We are a digital solutions and technology services company that partners with global organizations across industries to achieve digital transformation. With a strong track record of innovation, investment in digital solutions, and commitment to client success, at Zensar, you can help clients achieve new thresholds of performance. A subsidiary of RPG Group, Zensar has its HQ in India, and offices across the world, including Mexico, South Africa, UK and USA.
Zensar is all about celebrating individuality, creativity, innovation, and flexibility. We hire based on values, talent, and the potential necessary to fill a given job profile, irrespective of nationality, sexuality, race, color, and creed. We also put in policies to empower this assorted talent pool with the right environment for growth.
At Zensar, you Grow, Own, Achieve, Learn.
Learn more about our culture:
Ready to #ExperienceZensar?
Begin your application by clicking on the 'Apply Online' button below.
Be sure to have your resume handy
If you're having trouble applying, drop a line to
GCP Engineer
Posted today
Job Description
It's fun to work in a company where people truly BELIEVE in what they are doing
We're committed to bringing passion and customer focus to the business.
Location - Open
Position: Data Engineer (GCP) – Technology
If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.
*Responsibilities:*
- Be an integral part of large scale client business development and delivery engagements
- Develop the software and systems needed for end-to-end execution on large projects
- Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
- Build the knowledge base required to deliver increasingly complex technology projects
*Qualifications & Experience:*
- A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience
*Desired Technical Skills:*
Data Engineering and Analytics on Google Cloud Platform:
- Basic cloud computing concepts
- BigQuery, Google Cloud Storage, Cloud SQL
- Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI
- Python, Google Cloud Python SDK, SQL
- Experience in working with any NoSQL/columnar/MPP database
- Experience in working with any ETL tool (Informatica/DataStage/Talend/Pentaho, etc.)
- Strong knowledge of database concepts, data modeling in RDBMS vs. NoSQL, OLTP vs. OLAP, MPP architecture
*Other Desired Skills:*
- Excellent communication and coordination skills
- Problem understanding, articulation, and solutioning
- Quick learner, and adaptable with regard to new technologies
- Ability to research & solve technical issues
*Responsibilities:*
- Developing Data Pipelines (Batch/Streaming)
- Developing Complex data transformations
- ETL Orchestration
- Data Migration
- Develop and maintain data warehouses / data lakes
*Good To Have:*
- Experience in working with Apache Spark / Kafka
- Machine Learning concepts
- Google Cloud Professional Data Engineer Certification
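The batch-pipeline responsibilities above can be sketched as composed generators, the extract-transform-load shape that Dataflow and Composer tasks typically mirror. Source data and field names are invented for illustration:

```python
# A minimal batch-pipeline sketch: extract -> transform -> load.
def extract(source):
    for line in source:            # e.g. lines from a GCS object in a real job
        yield line.strip()

def transform(rows):
    for row in rows:
        if row:                    # drop blank rows (cleansing step)
            name, qty = row.split(",")
            yield {"name": name, "qty": int(qty)}

def load(records, sink):
    for rec in records:            # sink stands in for a BigQuery table
        sink.append(rec)
    return sink

source = ["widget,5", "", "gadget,2"]
table = load(transform(extract(source)), [])
```

In a real job, `source` would be a GCS line reader or a Pub/Sub subscription and `sink` a BigQuery writer; the composed-generator shape stays the same.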
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.