1,687 Google Cloud jobs in India

Google Cloud

Pune, Maharashtra Impetus

Posted 3 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services); see the illustrative sketch following this job description.

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.
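The posting lists GCP services (BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Cloud Functions, Composer) without examples. As a rough illustration of the "operating knowledge" being asked for, here is a minimal Python sketch that runs a BigQuery query and publishes the result to a Pub/Sub topic. The project, dataset, table, and topic names are hypothetical, and it assumes the google-cloud-bigquery and google-cloud-pubsub client libraries are installed with default application credentials configured; it is a sketch, not part of the job description.

```python
# Illustrative sketch only (not from the posting): query BigQuery and publish
# a summary to Pub/Sub. All resource names below are hypothetical.
from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-gcp-project"      # hypothetical project
TOPIC_ID = "daily-row-counts"      # hypothetical Pub/Sub topic


def publish_daily_row_count() -> None:
    bq_client = bigquery.Client(project=PROJECT_ID)

    # Count today's rows in a hypothetical events table.
    query = """
        SELECT COUNT(*) AS row_count
        FROM `my-gcp-project.analytics.events`
        WHERE DATE(event_ts) = CURRENT_DATE()
    """
    row = next(iter(bq_client.query(query).result()))

    # Publish the count so a downstream consumer (e.g. a Cloud Function) can react.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
    message_id = publisher.publish(topic_path, data=str(row.row_count).encode("utf-8")).result()
    print(f"Published message {message_id}: {row.row_count} rows")


if __name__ == "__main__":
    publish_daily_row_count()
```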

Google Cloud

Pune, Maharashtra ₹2,000,000 - ₹2,500,000 per year Impetus

Posted today

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.; see the illustrative sketch following this job description.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.
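The same requirements mention columnar formats (Parquet, ORC) and distributed computing frameworks. As a hedged illustration, the sketch below uses PySpark, as it might run on a Dataproc cluster, to read Parquet and ORC data from Cloud Storage and write an aggregate back out. The gs:// paths and column names are hypothetical, and it assumes a Spark environment with the Cloud Storage connector available (as Dataproc images normally provide).

```python
# Illustrative sketch only (not from the posting): reading columnar data
# (Parquet/ORC) with PySpark. The gs:// paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("columnar-formats-demo").getOrCreate()

# Parquet and ORC are both columnar formats; Spark reads them natively.
events = spark.read.parquet("gs://my-bucket/events/date=2024-01-01/")
users = spark.read.orc("gs://my-bucket/users/")

# Join the two sources and count events per country.
daily_counts = (
    events.join(users, on="user_id", how="inner")
          .groupBy("country")
          .agg(F.count("*").alias("event_count"))
)

daily_counts.write.mode("overwrite").parquet("gs://my-bucket/reports/daily_counts/")
spark.stop()
```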

Google Cloud

Pune, Maharashtra Impetus

Posted today

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks; see the illustrative sketch following this job description.
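The requirement about Linux command-line tools and shell/Python scripting is also left abstract. A minimal sketch of the kind of routine automation it presumably refers to is shown below: a Python script that compresses yesterday's application logs and copies them to Cloud Storage via the gsutil CLI. The log directory and bucket are hypothetical, and it assumes gsutil is installed and authenticated.

```python
# Illustrative sketch only (not from the posting): automate a routine task --
# compress yesterday's logs and upload them with gsutil. Paths are hypothetical.
import gzip
import shutil
import subprocess
from datetime import date, timedelta
from pathlib import Path

LOG_DIR = Path("/var/log/myapp")            # hypothetical log directory
BUCKET_URI = "gs://my-bucket/log-archive"   # hypothetical bucket


def archive_yesterdays_logs() -> None:
    yesterday = (date.today() - timedelta(days=1)).isoformat()
    for log_file in LOG_DIR.glob(f"*{yesterday}*.log"):
        gz_path = log_file.with_suffix(".log.gz")
        # Compress the log before upload.
        with log_file.open("rb") as src, gzip.open(gz_path, "wb") as dst:
            shutil.copyfileobj(src, dst)
        # Upload with the gsutil CLI; check=True raises if the copy fails.
        subprocess.run(["gsutil", "cp", str(gz_path), f"{BUCKET_URI}/"], check=True)
        print(f"Archived {log_file.name} to {BUCKET_URI}/")


if __name__ == "__main__":
    archive_yesterdays_logs()
```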

Google Cloud

Pune, Maharashtra Impetus

Posted today

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.

Google Cloud

Palakkad, Kerala Impetus

Posted 5 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.

Google Cloud

Alappuzha, Kerala Impetus

Posted 5 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.

Google Cloud

Anand, Gujarat Impetus

Posted 5 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.

Google Cloud

Malappuram, Kerala Impetus

Posted 5 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.

Google Cloud

Vellore, Tamil Nadu Impetus

Posted 5 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.

Google Cloud

Erode, Tamil Nadu Impetus

Posted 5 days ago

Job Description

GCP/Big Data

Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.

Must have: working knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Should be familiar with columnar storage formats, e.g. Parquet, ORC, etc.

Visualizes and evangelizes next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).

Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.

Develops and implements an overall organizational data strategy that is in line with business processes; the strategy includes data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.

Expert-level proficiency in at least 4-5 GCP services.

Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.

Strong understanding of and experience with distributed computing frameworks, particularly

Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.
