260 GCP Jobs in Mumbai

GCP Engineer

Mumbai, Maharashtra ₹14400 - ₹1500000 FullThrottle Labs Pvt Ltd

Posted 1 day ago

Job Description

We are looking for a skilled *GCP Engineer* with strong expertise in building and scaling data pipelines, APIs, and cloud-native solutions. The ideal candidate will have deep experience in *Google Cloud Platform (GCP)* services, excellent Python skills, and a solid foundation in DevOps practices. This role requires a mix of *data engineering expertise* and the ability to design *robust, scalable systems* for data processing and API integration.

*Key Responsibilities*
  • Design, develop, and maintain data pipelines and workflows using *GCP services* such as *BigQuery, Cloud Composer, Dataflow, Dataform, and Pub/Sub* (see the Composer sketch after this list).

  • Build, optimize, and manage scalable *ETL/ELT processes* and ensure efficient data flow across systems.

  • Develop and deploy *Python-based solutions*, scripts, and automation frameworks.

  • Implement *API development* and ensure high performance, reliability, and scalability of APIs and data pipelines.

  • Collaborate with cross-functional teams to integrate data solutions into business-critical applications.

  • Apply *DevOps practices*: manage source control (Git), automate deployments, and maintain CI/CD pipelines.

  • Monitor, troubleshoot, and optimize system performance, data reliability, and cost efficiency.

  • Contribute to architectural discussions on data infrastructure, scalability, and cloud adoption.
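
For orientation, here is a minimal sketch of the kind of Cloud Composer (Airflow) DAG these responsibilities describe: a daily GCS-to-BigQuery load followed by an aggregation query. The project, bucket, dataset, and column names are placeholders, not details from the posting.

```python
# Hypothetical Composer (Airflow) DAG: land daily CSV drops in BigQuery,
# then aggregate them into a mart table. All names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Append the day's files from the landing bucket into a raw table.
    load_raw = GCSToBigQueryOperator(
        task_id="gcs_to_bq_raw",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.raw.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
    )

    # Roll the raw rows up into a daily mart table.
    transform = BigQueryInsertJobOperator(
        task_id="aggregate_to_mart",
        configuration={
            "query": {
                "query": """
                    SELECT store_id, DATE('{{ ds }}') AS sales_date,
                           SUM(amount) AS total_amount
                    FROM `example-project.raw.sales`
                    WHERE load_date = '{{ ds }}'
                    GROUP BY store_id
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "mart",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

In Composer, a file like this is deployed simply by dropping it into the environment's dags/ bucket.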

*Required Qualifications*
  • *Hands-on experience with GCP services*: BigQuery, Cloud Composer, Dataflow, Dataform, Pub/Sub (a Pub/Sub sketch follows this list).

  • Strong proficiency in *Python* for data engineering and automation.

  • Experience with *DevOps tools and practices*: Git, CI/CD workflows.

  • Solid understanding of *data engineering concepts*: data modeling, pipeline orchestration, streaming & batch processing.

  • Experience with *API development* and building scalable integration layers.
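
The qualifications above lean on Pub/Sub, so here is a hedged round-trip sketch using the google-cloud-pubsub client. It assumes a topic `orders` and a subscription `orders-sub` already exist in a placeholder project.

```python
# Publish one message, then synchronously pull and ack a small batch.
# Topic, subscription, and project names are placeholders.
from google.cloud import pubsub_v1

PROJECT = "example-project"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "orders")

# Attributes ride alongside the payload and are handy for routing/filtering.
future = publisher.publish(topic_path, data=b'{"order_id": 42}', source="web")
print("published message id:", future.result())

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT, "orders-sub")

response = subscriber.pull(
    request={"subscription": subscription_path, "max_messages": 10}
)
for received in response.received_messages:
    print("got:", received.message.data)

if response.received_messages:
    subscriber.acknowledge(
        request={
            "subscription": subscription_path,
            "ack_ids": [m.ack_id for m in response.received_messages],
        }
    )
```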

*Nice to Have*
  • Experience with *Elasticsearch* (setup, optimization, and integration).

  • Background in *data infrastructure scalability* and performance optimization.

  • Familiarity with monitoring tools, observability, and cost management on GCP.

Job Type: Full-time

Pay: ₹1,200,000.00 - ₹1,500,000.00 per year

Work Location: In person

GCP Engineer

Mumbai, Maharashtra ₹1500000 - ₹2500000 Procter & Gamble

Posted 1 day ago

Job Description

Job Requirements
Overview of the job
Data Engineer–Data Platforms

This role reports to the Director, India Data Platforms, P&G

About Data Platforms Team
We take pride in managing the company's most valuable asset in the digital world: data. Our vision is to deliver data as a competitive advantage for the Asia regional business by building unified data platforms, delivering customized BI tools for managers, and empowering insightful business decisions through AI. As a data solutions specialist, you'll work closely with business stakeholders, collaborating to understand their needs and developing solutions to problems in the areas of supply chain, sales & distribution, consumer insights, and market performance.

In this role, you'll be constantly learning, staying up to date with industry trends and emerging technologies in data solutions. You'll have the chance to work with a variety of tools and technologies, including big data platforms, machine learning frameworks, and data visualization tools, to build innovative and effective solutions.

So, if you're excited about the possibilities of data, and eager to make a real impact in the world of business, a career in data solutions might be just what you're looking for. Join us and become a part of the future of digital transformation.

About P&G IT
Digital is at the core of P&G's accelerated growth strategy. With this vision, IT in P&G is deeply embedded in every critical process across business organizations comprising 11+ category units globally, creating impactful value through Transformation, Simplification & Innovation. IT in P&G is sub-divided into teams that engage strongly to revolutionize business processes and deliver exceptional value & growth: Digital GTM, Digital Manufacturing, Marketing Technologist, Ecommerce, Data Sciences & Analytics, Data Solutions & Engineering, Product Supply.

Responsibilities
Development of a cloud-based data and analytics platform, including integrating systems and implementing ELT/ETL jobs to fulfil business deliverables; performing sophisticated data operations such as data orchestration, transformation, and visualization with large datasets; working with product managers to ensure superior product delivery that drives business value and transformation; and demonstrating standard coding practices to ensure delivery excellence and reusability.

  • Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments.
  • Data Transformation: Implement data transformation processes, including data cleansing, normalization, and aggregation, to ensure data quality and consistency (see the PySpark sketch after this list).
  • Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval in Google Cloud platforms.
  • Data Warehousing: Build data warehouses or data lakes using Google Cloud services such as BigQuery.
  • Data Integration: Integrate data from multiple sources, both on-premises and cloud-based, using Cloud Composer or other relevant tools.
  • Data Governance: Implement data governance practices, including data security, privacy, and compliance, to ensure data integrity and regulatory compliance.
  • Performance Optimization: Optimize data pipelines and queries for improved performance and scalability in Google Cloud environments.
  • Monitoring and Troubleshooting: Monitor data pipelines, identify and resolve performance issues, and troubleshoot data-related problems in collaboration with other teams.
  • Data Visualization: Build BI reports to enable faster decision making.
  • Collaboration: Work with product managers to ensure superior product delivery to drive business value & transformation
  • Documentation: Document data engineering processes, data flows, and system configurations for future reference and knowledge sharing.
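
To make the transformation bullet concrete, here is a hedged PySpark sketch of the cleanse/normalize/aggregate steps; the GCS paths and column names are invented for illustration.

```python
# Illustrative cleanse -> normalize -> aggregate flow in PySpark.
# Paths and columns are placeholders, not values from this posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("shipment-cleanse").getOrCreate()

raw = spark.read.option("header", True).csv("gs://example-bucket/shipments/*.csv")

cleaned = (
    raw.dropDuplicates(["shipment_id"])                  # cleansing: drop repeats
       .withColumn("region", F.upper(F.trim("region")))  # normalization
       .withColumn("qty", F.col("qty").cast("int"))
       .na.fill({"qty": 0})
)

# Aggregation to the grain a downstream mart might expect.
daily = cleaned.groupBy("region", "ship_date").agg(
    F.sum("qty").alias("total_qty"),
    F.countDistinct("shipment_id").alias("shipments"),
)
daily.write.mode("overwrite").parquet("gs://example-bucket/marts/daily_shipments")
```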

Work Experience
Qualifications:

  • Experience: Bachelor's or master's degree in computer science, data engineering, or a related field, along with 2+ years of work experience in data engineering and cloud platforms.
  • Google Cloud Development: Strong proficiency in Google Cloud services such as Spanner, Cloud Composer, Looker Studio, etc.
  • ETL Tools: Experience with ETL (Extract, Transform, Load) tools and frameworks, such as Spark and Cloud Composer/Airflow for data integration and transformation.
  • Programming: Proficiency in Python (including PySpark) and SQL for data manipulation, scripting, and automation.
  • Data Modeling: Knowledge of data modeling techniques and experience with data modeling tools.
  • Database Technologies: Familiarity with relational databases (e.g., Cloud SQL) for data storage and retrieval.
  • Data Warehousing: Understanding of data warehousing concepts and dimensional modeling, and experience with data warehousing technologies such as BigQuery (see the sketch after this list).
  • Data Governance: Knowledge of data governance principles, data security, privacy regulations (e.g., GDPR, CCPA), and experience implementing data governance practices.
  • Data Visualization: Experience working with Looker Studio to build semantic data models and BI reports/dashboards.
  • Cloud Computing: Familiarity with cloud computing concepts and experience working with cloud platforms, particularly Google Cloud Platform.
  • Problem-Solving: Strong analytical and problem-solving skills to identify and resolve data-related issues.
  • Proficiency in DevOps and CI/CD tools (e.g., Terraform, GitHub)
  • Familiarity with Azure, Databricks, and their relevant tech stacks would be an advantage in the role.
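
As a small illustration of the BigQuery work this list implies, here is a parameterized query via the google-cloud-bigquery client; the dataset and table names are hypothetical.

```python
# Run a parameterized BigQuery query and print the rows.
# Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT sku, SUM(units) AS units
    FROM `example-project.sales.fact_orders`
    WHERE order_date BETWEEN @start AND @end
    GROUP BY sku
    ORDER BY units DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
    ]
)
for row in client.query(query, job_config=job_config).result():
    print(row.sku, row.units)
```

Query parameters keep dates and other inputs out of the SQL string itself, which matters once queries are generated from user-facing services.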

Senior GCP Engineer

Mumbai, Maharashtra Fractal

Posted today

Job Description

It's fun to work in a company where people truly BELIEVE in what they are doing!

Responsibilities

  • Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools.
  • Hands-on experience with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, Presto, Druid, and Airflow.
  • Deep understanding of BigQuery architecture, best practices, and performance optimization. 
  • Proficiency in LookML for building data models and metrics. 
  • Experience with Dataproc for running Hadoop/Spark jobs on GCP.
  • Knowledge of configuring and optimizing Dataproc clusters.
  • Offer system support as part of a support rotation with other team members.
  • Operationalize open source data-analytic tools for enterprise use.
  • Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification (a minimal quality-check example follows this list).
  • Understand and follow the company development lifecycle to develop, deploy and deliver the solutions.
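
One lightweight reading of the data-governance bullet is a rule-based quality gate run against each batch before publishing it. The field names and thresholds below are made up for illustration.

```python
# Minimal rule-based quality gate for a batch of records.
# Fields ("customer_id", "amount") and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def run_checks(rows: list[dict]) -> list[CheckResult]:
    results = []
    # Completeness: no nulls in the business key.
    missing = sum(1 for r in rows if not r.get("customer_id"))
    results.append(CheckResult("customer_id_not_null", missing == 0,
                               f"{missing} rows missing customer_id"))
    # Validity: amounts must be non-negative.
    bad = sum(1 for r in rows if r.get("amount", 0) < 0)
    results.append(CheckResult("amount_non_negative", bad == 0,
                               f"{bad} rows with negative amount"))
    # Volume: fail loudly if the batch is suspiciously small.
    results.append(CheckResult("min_row_count", len(rows) >= 100,
                               f"batch has {len(rows)} rows"))
    return results

if __name__ == "__main__":
    batch = [{"customer_id": "C1", "amount": 10.0}] * 120
    for check in run_checks(batch):
        print(("PASS" if check.passed else "FAIL"), check.name, "-", check.detail)
```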

  • Minimum Qualifications: 
    • Bachelor's degree in Computer Science, CIS, or related field 
    • Experience on project(s) involving the implementation of software development life cycles (SDLC)

GCP Data Engineer

Required Skills & Qualifications:

  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 7+ years of experience in data engineering, cloud data solutions, and pipeline development.
  • GCP Expertise: Hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow), Vertex AI, and IAM Policies.
  • Programming: Proficiency in Python, SQL, and Apache Beam (Java or Scala is a plus); see the Beam sketch after this list.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
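
Since Apache Beam appears in the skills list, here is a minimal pipeline that runs locally on the DirectRunner; pointing it at Dataflow is a matter of runner and project options. The inlined events stand in for a real source such as Pub/Sub.

```python
# Minimal Apache Beam pipeline: mean latency per user.
# Runs with the DirectRunner by default; data is inlined for the sketch.
import apache_beam as beam

events = [
    {"user": "a", "ms": 120},
    {"user": "b", "ms": 340},
    {"user": "a", "ms": 80},
]

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(events)
        | "KeyByUser" >> beam.Map(lambda e: (e["user"], e["ms"]))
        | "MeanLatency" >> beam.combiners.Mean.PerKey()
        | "Print" >> beam.Map(print)
    )
```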

    GCP Data Engineer

    Mumbai, Maharashtra Kyndryl

    Posted 1 day ago

    Job Description

    **Who We Are**
    At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
    **The Role**
    Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
    As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
    In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
    Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset: a true data alchemist.
    Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
    So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
    Your Future at Kyndryl
    Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
    **Who You Are**
    You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
    Required Skills and Experience
    + 7+ years of experience in data engineering, with a focus on GCP technologies.
    + Design, implement, and maintain scalable ETL/ELT pipelines using tools like Dataflow, Apache Beam, or Cloud Composer.
    + Develop and optimize data models in BigQuery for efficient analytics and reporting.
    + Utilize Google Cloud tools like Cloud Storage, BigQuery, Pub/Sub, and Dataproc for data processing and storage.
    + Optimize infrastructure for cost efficiency and performance (see the partitioning sketch after this list).
    + Core GCP services: BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
    + Terraform or Deployment Manager for infrastructure as code.
    + SQL and Python/Java/Scala for data processing and scripting.
    + Strong understanding of data warehousing concepts, star/snowflake schemas, and data modeling.
    + Experience with CI/CD pipelines and version control systems (e.g., Git, Jenkins).
    + Solid knowledge of networking, security, and compliance within a GCP environment.
    + Proven experience in managing and optimizing multi-terabyte datasets.
    + Familiarity with additional cloud platforms (AWS, Azure) and hybrid-cloud architectures.
    + Experience with BI tools such as Looker, Tableau, or Power BI.
    + Knowledge of machine learning workflows and integration with AI tools.
    + Expertise in data mining, data storage and Extract-Transform-Load (ETL) processes
    + Experience in data pipelines development and tooling, e.g., Glue, Databricks, Synapse, or Dataproc
    + Experience with both relational and NoSQL databases, e.g., PostgreSQL, DB2, MongoDB
    + Excellent problem-solving, analytical, and critical thinking skills
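
    As one concrete instance of the cost/performance point above, the sketch below creates a date-partitioned, clustered BigQuery table, which is usually the first lever for query cost control; every name in it is a placeholder.

```python
# Create a date-partitioned, clustered BigQuery table.
# Project, dataset, and schema are placeholders for illustration.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("payload", "JSON"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_date"
)
table.clustering_fields = ["customer_id"]
table.require_partition_filter = True  # force queries to prune partitions

client.create_table(table, exists_ok=True)
```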
    Preferred Skills and Experience
    + Experience working as a Data Engineer and/or in cloud modernization
    + Experience in Data Modelling, to create a conceptual model of how data is connected and how it will be used in business processes
    + Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
    + Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
    + Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
    + Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
    **Being You**
    Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
    **What You Can Expect**
    With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
    **Get Referred!**
    If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
    Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

    GCP Data Engineer

    Mumbai, Maharashtra ₹1200000 - ₹3600000 Mondelez

    Posted 1 day ago

    Job Description

    Are You Ready to Make It Happen at Mondelēz International?

    Join our Mission to Lead the Future of Snacking. Make It With Pride.

    You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

    How you will contribute

    You will:

    • Operationalize and automate activities for efficiency and timely production of data visuals
    • Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
    • Search for ways to get new data sources and assess their accuracy
    • Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
    • Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
    • Validate information from multiple sources.
    • Assess issues that might prevent the organization from making maximum use of its information assets

    What you will bring

    A desire to drive your future and accelerate your career and the following experience and knowledge:

    • Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data sources, etc., and experience setting up, testing, and maintaining new systems
    • Experience with a wide variety of languages and tools (e.g., scripting languages) to retrieve, merge, and combine data
    • Ability to simplify complex problems and communicate to a broad audience

    Are You Ready to Make It Happen at Mondelēz International?

    Join our Mission to Lead the Future of Snacking. Make It with Pride.

    In This Role

    As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

    Role & Responsibilities:

    • Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
    • Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
    • Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
    • Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
    • Collaborate and Innovate: Work closely with data teams, product owners, and stay updated with the latest cloud technologies and best practices.

    Technical Requirements:

    • Programming: Python
    • Database: SQL, PL/SQL, PostgreSQL, BigQuery, stored procedures/routines.
    • ETL & Integration: AecorSoft, Talend, dbt, Databricks (optional), Fivetran.
    • Data Warehousing: SCD (a Type 2 sketch follows this list), schema types, data marts.
    • Visualization: Power BI (optional), Tableau (optional), Looker.
    • GCP Cloud Services: BigQuery, GCS.
    • Supply Chain: IMS and shipment functional knowledge is good to have.
    • Supporting Technologies: Erwin, Collibra, Data Governance, Airflow.
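
    Because the list calls out SCD, here is a hedged sketch of a Slowly Changing Dimension Type 2 close-and-insert pattern as a BigQuery multi-statement job. The dim_customer/staging tables and the segment column are hypothetical.

```python
# SCD Type 2 sketch: close out changed rows, then insert new versions.
# Table and column names are hypothetical; both statements run as one job.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

scd2 = """
-- Step 1: expire current rows whose attributes changed in staging.
UPDATE `example-project.dw.dim_customer` d
SET valid_to = CURRENT_DATE(), is_current = FALSE
WHERE d.is_current
  AND EXISTS (
    SELECT 1 FROM `example-project.stg.customer` s
    WHERE s.customer_id = d.customer_id AND s.segment != d.segment
  );

-- Step 2: insert a fresh current row for new and changed customers.
INSERT INTO `example-project.dw.dim_customer`
  (customer_id, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.segment, CURRENT_DATE(), DATE '9999-12-31', TRUE
FROM `example-project.stg.customer` s
LEFT JOIN `example-project.dw.dim_customer` d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL OR d.segment != s.segment;
"""
client.query(scd2).result()  # BigQuery scripting executes both statements
```

    Closing changed rows first means the insert only has to look for customers with no remaining current row, which keeps the second statement simple.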

    Soft Skills:

    • Problem-Solving: The ability to identify and solve complex data-related challenges.
    • Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
    • Analytical Thinking: The capacity to analyze data and draw meaningful insights.
    • Attention to Detail: Meticulousness in data preparation and pipeline development.
    • Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

    GCP Cloud Engineer

    Mumbai, Maharashtra ₹1000000 - ₹1200000 KhushaTech IT Enterprises

    Posted 1 day ago

    Job Description

    Job Title:

    GCP Cloud Engineer / Cloud Resources Specialist – Malad (3–4 Years) | ₹10–12 LPA



    Job Description:

    We are looking for a skilled GCP Cloud Engineer to join our team in Malad. The ideal candidate will be responsible for designing, managing, optimizing, and securing Google Cloud Platform (GCP) infrastructure to ensure scalability, high availability, performance, and cost efficiency.

    If you are passionate about cloud technologies, automation, and DevOps practices, this role is for you.



    Responsibilities:

    Design, implement, and manage GCP services including Compute Engine, Cloud Storage, VPC, IAM, Pub/Sub, GKE, BigQuery, Cloud Functions, and Cloud SQL.

    Deploy and configure VMs, load balancers, storage buckets, and networking.

    Implement security best practices across IAM, firewalls, encryption, and service accounts (see the audit sketch below).

    Monitor system health, availability, and cost with Cloud Monitoring & Logging.

    Automate infrastructure provisioning using Terraform / IaC tools.

    Collaborate with development teams to design scalable, secure cloud-native solutions.

    Troubleshoot and resolve issues in compute, networking, and storage.

    Support DevOps CI/CD pipelines and containerized workloads (Docker, Kubernetes).
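
    As one concrete reading of the security responsibility above, here is a small audit that flags Cloud Storage buckets without uniform bucket-level access or enforced public-access prevention; the project name is a placeholder.

```python
# Flag buckets that drift from two common security baselines.
# Pure illustration; "example-project" is a placeholder.
from google.cloud import storage

client = storage.Client(project="example-project")

for bucket in client.list_buckets():
    cfg = bucket.iam_configuration
    problems = []
    if not cfg.uniform_bucket_level_access_enabled:
        problems.append("uniform bucket-level access disabled")
    if cfg.public_access_prevention != "enforced":
        problems.append("public access prevention not enforced")
    status = "WARN: " + "; ".join(problems) if problems else "OK"
    print(f"{bucket.name}: {status}")
```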



    Requirements:

    3–4 years of hands-on experience with GCP core services.

    Strong knowledge of VPC, IAM, networking, firewalls, load balancing, VPNs.

    Experience with Terraform / IaC and Linux/Unix administration.

    Familiarity with CI/CD pipelines, scripting (Python/Bash/PowerShell).

    Knowledge of Cloud Monitoring, Logging, and Security Command Center.



    Good to Have:

    Google Cloud Certification (Associate/Professional).

    Experience with multi-cloud or hybrid cloud setups.

    Knowledge of BigQuery, Dataflow, Pub/Sub.

    Exposure to Kubernetes / Anthos.



    Education:

    Bachelor's in Computer Science, IT, or equivalent hands-on experience.



    Salary: ₹10 – 12 LPA

    Location: Malad (Mumbai)

    Experience: 3–4 Years

    GCP Data Engineer

    Mumbai, Maharashtra ₹900000 - ₹1200000 Zediant Technologies

    Posted 1 day ago

    Job Description

    Job Title: GCP Data Engineer

    Experience: 5–8+ Years

    Work Mode: Onsite

    Location: Kandivali, Mumbai

    We are looking for a highly skilled and motivated GCP Data Engineer to join our onsite team in Kandivali, Mumbai. The ideal candidate will have solid experience with Google Cloud Platform (GCP) tools and strong expertise in Python, data pipelines, and scalable API development.

    Roles & Responsibilities:

    • Design, build, and manage scalable and reliable data pipelines using GCP services.
    • Work with tools like BigQuery, Cloud Composer, Dataflow, Dataform, and Pub/Sub to develop robust data workflows.
    • Develop and maintain Python-based data solutions.
    • Design and implement CI/CD pipelines, leveraging Git and automation tools for efficient DevOps practices.
    • Build, manage, and scale RESTful APIs and services for data access and processing (see the FastAPI sketch after this list).
    • Collaborate with cross-functional teams to integrate data solutions into broader applications.
    • Optimize performance and scalability of data systems and APIs.
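
    A minimal sketch of the REST layer described above, using FastAPI; the in-memory lookup stands in for a BigQuery-backed query, and all names are invented.

```python
# Tiny read-only API; run with: uvicorn main:app --reload
# The dict stands in for a real data store such as BigQuery.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="orders-api")

_ORDERS = {42: {"order_id": 42, "status": "shipped"}}  # placeholder data

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> dict:
    order = _ORDERS.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order
```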

    Must-Have Skills:

    • Strong hands-on experience with GCP: BigQuery, Cloud Composer, Dataflow, Dataform, Pub/Sub.
    • Python programming for data processing and automation.
    • Experience with DevOps tools: Git, CI/CD pipelines (Jenkins, GitHub Actions, etc.).
    • Solid understanding of data engineering principles and pipeline scalability.
    • Strong problem-solving skills and ability to work independently in an onsite environment.

    Nice to Have:

    • Experience with Elasticsearch
    • API performance optimization and monitoring experience

    GCP DevOps Engineer

    Mumbai, Maharashtra ₹2000000 - ₹2500000 Mondelez

    Posted 1 day ago

    Job Description

    Roles & Responsibilities

    • Sets out DevOps Platform strategy and roadmap in line with wider Business & IT strategy to facilitate effective software deployment across the organization.
    • Leads the design and implementation of processes, practices and tools enabling CI/CD, maximizing the speed and quality of software delivery and minimizing unit costs across digital platforms.
    • Manages build automation, release, deployment, and configuration activities of the DevOps platform, setting appropriate milestones & KPIs and working within an Agile framework.
    • Define CI/CD and automation security requirements (e.g., the least-privilege principle).
    • Define security testing requirements (such as static/dynamic analysis security testing, vulnerability scanning, penetration testing, etc.).
    • Promote shift-left security (secure coding conventions, unit tests, static analyzers), so that security is built into the product rather than applied to the finished product; a minimal CI gate sketch follows the qualifications list below.
    • Manages tool and vendor selection across development automation activities to ensure that tooling is fit for purpose for an enterprise-grade software deployment automation & DevOps platform.
    • Owns, leads and drives systems engineering, software delivery automation and CI/CD practices across all digital platforms.
    • Follows industry trends and developments to better meet wider business & IT goals and ensure that the DevOps platform ecosystem is at the cutting edge of enterprise-grade software development automation systems.
    • Bringing in the right DevOps mindset among the teams developing software, which would mean conducting training on tools and processes that makes software development and testing easier.
    • Drive increased maturity on the cloud DevOps capabilities including advanced CI/CD and metrics.

    Qualifications:

    • Bachelor's degree in Computer Science, Business, a related discipline, or equivalent work experience.
    • Good knowledge of DevOps CI/CD workflows, tools and integration points and experience integrating security into SDLC.
    • Proven track record of architecting/implementing end-to-end pipelines that cover the entire SDLC.
    • Experience of having Implemented CI / CD tools such as GitHub, Bamboo, Jenkins, Connect All and JFrog including the build of pipelines.
    • Drive continuous improvement for supported applications, in areas such as monitoring, operational task automation, continuous integration, deployments and performance tuning.
    • Investigate and resolve complex and multi-faceted issues, spanning the entire technology stack, which requires working across teams and technology boundaries.
    • Proactively improve key metrics, such as up-time, application performance, and other key operational SLOs, SLAs.
    • Organize and prioritize work to complete assignments in a timely, efficient manner.
    • Deep understanding of modern DevOps platform technologies incl. infrastructure-as-code and containers.
    • Good knowledge/experience of the Application Security space and tools.
    • Extensive experience with agile methodologies.
    • Extensive experience working in a cloud native environment.
    • Understanding of software engineering principles.
    • Knowledge of containerization tools and approaches would be desirable.
    • Ability to communicate and present technical concepts to both technical and non-technical audiences.
    • Ability to Influence others to achieve buy-in to proposed solutions / recommendations.
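
    One way to put the shift-left idea above into practice is a gate script the CI job runs before any deploy. The tool choices (pytest, bandit) and the src layout are assumptions for illustration, not requirements from the posting.

```python
# CI gate: run unit tests and static security analysis, fail fast on error.
# Tools and paths (pytest, bandit, src/) are illustrative assumptions.
import subprocess
import sys

STEPS = [
    ["pytest", "-q"],                # unit tests
    ["bandit", "-r", "src", "-ll"],  # static security scan, medium+ severity
]

for cmd in STEPS:
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"gate failed at: {' '.join(cmd)}")
print("all gates passed")
```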

    GCP DevOps Engineer

    Mumbai, Maharashtra ₹1200000 - ₹3600000 Mondelez

    Posted 1 day ago

    Job Description

    Are You Ready to Make It Happen at Mondelēz International?

    Join our Mission to Lead the Future of Snacking. Make It with Pride.

    You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

    How you will contribute

    You will:

    • Operationalize and automate activities for efficiency and timely production of data visuals
    • Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
    • Search for ways to get new data sources and assess their accuracy
    • Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
    • Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
    • Validate information from multiple sources.
    • Assess issues that might prevent the organization from making maximum use of its information assets

    What you will bring

    A desire to drive your future and accelerate your career and the following experience and knowledge:

    • Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data sources, etc., and experience setting up, testing, and maintaining new systems
    • Experience with a wide variety of languages and tools (e.g., scripting languages) to retrieve, merge, and combine data
    • Ability to simplify complex problems and communicate to a broad audience

    Are You Ready to Make It Happen at Mondelēz International?

    Join our Mission to Lead the Future of Snacking. Make It with Pride.

    In This Role

    As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

    Role & Responsibilities:

    • Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
    • Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
    • Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
    • Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
    • Collaborate and Innovate: Work closely with data teams, product owners, and stay updated with the latest cloud technologies and best practices.

    Technical Requirements:

    • Programming: Python
    • Database: SQL, PL/SQL, PostgreSQL, BigQuery, stored procedures/routines.
    • ETL & Integration: AecorSoft, Talend, dbt, Databricks (optional), Fivetran.
    • Data Warehousing: SCD, schema types, data marts.
    • Visualization: Power BI (optional), Tableau (optional), Looker.
    • GCP Cloud Services: BigQuery, GCS.
    • Supply Chain: IMS and shipment functional knowledge is good to have.
    • Supporting Technologies: Erwin, Collibra, Data Governance, Airflow.

    Soft Skills:

    • Problem-Solving: The ability to identify and solve complex data-related challenges.
    • Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
    • Analytical Thinking: The capacity to analyze data and draw meaningful insights.
    • Attention to Detail: Meticulousness in data preparation and pipeline development.
    • Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

    Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

    GCP Data Engineer

    Navi Mumbai, Maharashtra ₹1500000 - ₹2500000 Niveus Solution

    Posted 1 day ago

    Job Description

    • 5+ yrs of IT experience
    • Good understanding of analytics tools for effective analysis of data
    • Should have been part of production deployment team, Production Support team
    • Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
    • Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
    • Experience with DW tools like BigQuery, Redshift, Synapse, or Snowflake
    • Experience in ETL and Data Warehousing.
    • Experience and firm understanding of relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra etc.
    • Experience with cloud platforms like GCP
    • Experience with workflow management using tools like Apache Airflow

    Role & responsibilities

    • Develop high performance and scalable solutions using GCP that extract, transform, and load big data.
    • Designing and building production-grade data solutions from ingestion to consumption using Java / Python
    • Design and optimize data models on GCP cloud using GCP data stores such as BigQuery
    • Should be able to handle deployment process
    • Optimizing data pipelines for performance and cost for large scale data lakes.
    • Writing complex, highly optimized queries across large data sets to create data processing layers.
    • Closely interact with data engineers to identify the right tools to deliver product features by performing POCs
    • Collaborative team player that interacts with business, BAs and other Data/ML engineers
    • Research new use cases for existing data.

    Preferred candidate profile

    • Awareness of design best practices for OLTP and OLAP systems
    • Should be part of the team designing the DB and pipelines
    • Should have exposure to load-testing methodologies, debugging pipelines, and delta-load handling (see the delta-load sketch after this list)
    • Experience with heterogeneous migration projects
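
    For the delta-load point above, here is a hedged watermark pattern against BigQuery: read the target's high-water mark, then append only newer source rows. Table and column names are placeholders.

```python
# Incremental (delta) load via a watermark column.
# Project, dataset, and table names are placeholders for illustration.
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# 1. Current high-water mark in the target (fallback for the first run).
row = list(client.query(
    "SELECT MAX(updated_at) AS wm FROM `example-project.dw.orders`"
).result())[0]
wm = row.wm or datetime(1970, 1, 1, tzinfo=timezone.utc)

# 2. Append only source rows newer than the watermark.
job_config = bigquery.QueryJobConfig(
    destination="example-project.dw.orders",
    write_disposition="WRITE_APPEND",
    query_parameters=[bigquery.ScalarQueryParameter("wm", "TIMESTAMP", wm)],
)
client.query(
    "SELECT * FROM `example-project.staging.orders` WHERE updated_at > @wm",
    job_config=job_config,
).result()
```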