Cloud Computing Architect Position

Coimbatore, Tamil Nadu | beBeeAWS

Posted today


Job Description

Job Title: AWS Platform Architect


About the Job:

We are seeking an experienced AWS Platform Architect to lead the design, development, and implementation of scalable and secure solutions on the AWS platform. The ideal candidate will bring deep expertise in Amazon Redshift architecture, data modelling, performance tuning, and analytics to support business intelligence and advanced analytics needs.

This role requires a strong understanding of cloud architecture, data warehousing, and big data technologies. The successful candidate will have experience with AWS services such as S3, Glue, Lambda, CloudWatch, and Athena.

The primary responsibility of this role is to design end-to-end data warehouse solutions on AWS Redshift. This includes establishing best practices for schema design, workload management, and query optimization.
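The schema-design choices this responsibility implies (distribution and sort keys chosen around join and filter patterns) can be sketched with a small helper. This is a sketch only; the table, columns, and key choices below are hypothetical examples, not part of the role's actual schema.

```python
# Sketch only: illustrative Redshift DDL with DISTKEY/SORTKEY choices,
# rendered by a small helper. Table and column names are hypothetical.

def build_fact_table_ddl(table, columns, distkey, sortkeys):
    """Render a CREATE TABLE statement with Redshift distribution and sort keys."""
    col_lines = []
    for name, ctype in columns:
        suffix = " DISTKEY" if name == distkey else ""
        col_lines.append(f"    {name} {ctype}{suffix}")
    body = ",\n".join(col_lines)
    sort_clause = f"\nCOMPOUND SORTKEY ({', '.join(sortkeys)})"
    return f"CREATE TABLE {table} (\n{body}\n){sort_clause};"

ddl = build_fact_table_ddl(
    "sales_fact",
    [("sale_id", "BIGINT"), ("customer_id", "BIGINT"),
     ("sale_date", "DATE"), ("amount", "DECIMAL(12,2)")],
    distkey="customer_id",   # co-locate rows joined on customer_id
    sortkeys=["sale_date"],  # speed up range-restricted scans on date predicates
)
print(ddl)
```

The design choice illustrated: distributing on the most common join key reduces data shuffling across nodes, while sorting on the date column lets range filters skip blocks.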

In addition to designing and implementing scalable, reliable, and secure cloud architecture on AWS, the ideal candidate will also define cloud architecture standards, policies, and best practices.

This role involves developing and deploying AWS-based solutions, leveraging services such as Redshift, S3, RDS, and more. It also requires working with ETL tools and frameworks to automate and optimize data integration processes.

Key Performance Indicators (KPIs) include cost optimization, performance improvement, and security enhancement. The successful candidate will be responsible for monitoring and tuning Redshift clusters to improve query performance and resource utilization.

Furthermore, this role involves collaboration with development, DevOps, orchestration, and operations teams to integrate AWS services into the overall IT ecosystem. Guiding and mentoring junior engineers in Redshift and AWS technologies is also an essential part of this role.

Another key aspect of this position is ensuring compliance with data governance and industry standards. This involves designing security measures for Redshift, including IAM roles, encryption, and access control.

Maintenance tasks such as cluster resizing and version upgrades are also part of this role. Additionally, setting up and managing monitoring tools for cloud resources, such as AWS CloudWatch, is crucial for this position.
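As a sketch of the CloudWatch monitoring mentioned above, the parameters for a Redshift CPU alarm can be assembled as plain data. The cluster identifier and thresholds below are hypothetical; with AWS credentials configured, the resulting dict could be passed to `boto3.client("cloudwatch").put_metric_alarm(**alarm)`.

```python
# Sketch only: parameters for a CloudWatch alarm on Redshift CPU utilization.
# Cluster name and thresholds are hypothetical examples.

def redshift_cpu_alarm(cluster_id, threshold_pct=80, periods=3):
    """Alarm when average CPU stays above threshold for `periods` 5-minute windows."""
    return {
        "AlarmName": f"{cluster_id}-high-cpu",
        "Namespace": "AWS/Redshift",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "ClusterIdentifier", "Value": cluster_id}],
        "Statistic": "Average",
        "Period": 300,                # seconds per datapoint
        "EvaluationPeriods": periods, # consecutive breaching periods required
        "Threshold": float(threshold_pct),
        "ComparisonOperator": "GreaterThanThreshold",
    }

alarm = redshift_cpu_alarm("analytics-cluster")
```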

To excel in this role, one should possess strong problem-solving and analytical skills. Excellent communication and collaboration abilities are also essential. The ability to work in an agile, fast-paced environment is critical for success in this position.

This advertiser has chosen not to accept applicants from your region.

Cloud Engineer

Coimbatore, Tamil Nadu | Deltacubes


Job Description

Hiring - MS Azure Fabric Engineer | Independent Consultant (WFH-Remote)



Greetings from Deltacubes Technology!



Skill: MS Azure Fabric

Experience: 6+ years



Thanks,
Deena




Cloud Engineer

Coimbatore, Tamil Nadu | Explore Group


Job Description

Contract Opportunity: Data Engineer (DBT, Databricks, Azure) – Remote


We’re hiring a skilled Data Engineer for a 6-month, fully remote contract to support our growing data team. You'll work on mission-critical data transformation pipelines using DBT, Databricks, and Azure.


What you’ll do:

  • Design and build scalable, maintainable data models using DBT
  • Develop efficient pipelines across cloud infrastructure
  • Integrate and transform diverse data sources for analytics
  • Collaborate with analysts, scientists, and QA teams to ensure data accuracy
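dbt models themselves are written in SQL, but the incremental upsert semantics a transformation pipeline like this typically relies on can be sketched in plain Python. This is a sketch only; the key column and rows below are hypothetical, not taken from the actual project.

```python
# Sketch only: the upsert ("merge") semantics behind an incremental model,
# simulated over in-memory rows. Key name and row contents are hypothetical.

def incremental_merge(target, new_rows, key="order_id"):
    """Upsert new_rows into target: replace rows with matching keys, append the rest."""
    merged = {row[key]: row for row in target}
    for row in new_rows:
        merged[row[key]] = row  # later loads win, as in an incremental merge
    return sorted(merged.values(), key=lambda r: r[key])

existing = [{"order_id": 1, "status": "placed"},
            {"order_id": 2, "status": "shipped"}]
incoming = [{"order_id": 2, "status": "delivered"},
            {"order_id": 3, "status": "placed"}]
result = incremental_merge(existing, incoming)
```

Run over the sample rows, order 2 is updated in place and order 3 is appended, which is exactly what an incremental load avoids recomputing for the untouched rows.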


What we’re looking for:

  • Proven experience with Databricks and Spark
  • Strong knowledge of Azure data services
  • Familiarity with Git, version control, and CI/CD for data workflows
  • Solid problem-solving skills and attention to data integrity


Location: Fully Remote (Preference for India-based candidates)


Duration: 6 months rolling


Azure Cloud Engineer

Coimbatore, Tamil Nadu | Insight Global


Job Description


A global manufacturing company is seeking an Azure Infrastructure Engineer to join its enterprise IT organization. As a cloud engineer, this individual will support ongoing and new initiatives on an existing cloud platform. The ideal candidate will have expertise in Azure engineering and should be well-versed in SAP Basis infrastructure engineering. This individual should also have strong communication skills and previous experience supporting an enterprise organization with cross-functional teams. This is an immediate need, based in India, and will be a 6-month contract to start, with possible extensions.

REQUIRED SKILLS AND EXPERIENCE

  • 7-10 years of experience in Azure cloud engineering
  • Hands-on engineering experience with SAP Basis (infrastructure)
  • PowerShell scripting skills
  • Experience working in an enterprise, global organization, supporting enterprise projects
  • Excellent communication skills



Compensation: $15/hr to $20/hr.

Exact compensation may vary based on several factors, including skills, experience, and education.

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.


Cloud Engineer-GCP

Coimbatore, Tamil Nadu | EXL


Job Description

Key Requirements


Technical Skills

  • Expert in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, and Cloud Functions. GCP Professional Data Engineer Certification is highly favourable.
  • Advanced knowledge of SQL for complex data transformation and query optimization.
  • Proven experience in Python for scalable data pipeline development and orchestration following best practices.
  • Experience implementing Terraform for Infrastructure as Code (IaC) to automate GCP resource management.
  • Knowledge of CI/CD pipelines and automated deployment practices.
  • Experience with containerization technologies (e.g., Docker, Kubernetes)
  • Experience building and optimizing batch and streaming data pipelines.
  • Understanding of data governance principles, GCP security (IAM, VPC), and compliance requirements.
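As a sketch of the streaming-pipeline skill listed above, the tumbling-window aggregation a Dataflow job reading from Pub/Sub typically performs can be simulated in plain Python. This is illustrative only; the event timestamps and window size are hypothetical.

```python
# Sketch only: tumbling-window event counting, the core aggregation of a
# streaming pipeline, simulated in memory. Timestamps are hypothetical.
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per fixed (tumbling) window, keyed by window start time."""
    counts = defaultdict(int)
    for ts in events:
        window_start = ts - (ts % window_secs)  # floor to the window boundary
        counts[window_start] += 1
    return dict(counts)

# Events at 10s, 30s, 65s, 70s, 130s fall into the 0s, 60s, and 120s windows
counts = tumbling_window_counts([10, 30, 65, 70, 130], window_secs=60)
```

The same fixed-window grouping is what a managed runner applies continuously over an unbounded stream, rather than over a finite list.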

Soft Skills

  • Demonstrates a growth mindset by actively seeking to learn from peers and stakeholders, fostering a culture of open communication and shared knowledge.
  • Works effectively across teams, including Data Science, Engineering, and Analytics, to understand their needs and deliver impactful data solutions.
  • Actively participates in design discussions, brainstorming sessions, and cross-functional projects, always striving for continuous improvement and innovation.
  • Builds strong relationships across the organization, using empathy and active listening to ensure alignment on goals and deliverables.
  • Approaches challenges with a growth mindset, viewing obstacles as opportunities to innovate and improve processes.
  • Applies a structured and analytical approach to solving complex problems, balancing immediate needs with long-term scalability and efficiency.
  • Demonstrates resilience under pressure, maintaining a positive and solution-focused attitude when faced with tight deadlines or ambiguity.
  • Actively seeks feedback and lessons learned from past projects to continuously refine problem-solving strategies and improve outcomes.
  • Shares expertise generously, guiding team members in adopting best practices and helping them overcome technical challenges.
  • Leads by example, demonstrating how to approach complex problems pragmatically while promoting curiosity and a willingness to explore new tools and technologies.

  • Encourages professional development within the team, supporting individuals in achieving their career goals and obtaining certifications, especially within the Google Cloud ecosystem.


Main duties and responsibilities


  • Design, develop, and maintain scalable data pipelines using modern data engineering tools and technologies on our GCP stack.
  • Build and optimize our lakehouse on Google Cloud Platform (GCP)
  • Implement data ingestion, transformation, and loading processes for various data sources (e.g., databases, APIs, cloud storage)
  • Ensure data quality, consistency, and security throughout the data pipeline
  • Leverage GCP services (e.g., Dataflow, Dataproc, BigQuery, Cloud Storage) to build and maintain cloud-native data solutions
  • Implement infrastructure as code (IaC) principles using Terraform to automate provisioning and configuration
  • Manage and optimize cloud resources to ensure cost-efficiency and performance
  • Design and implement efficient data models following a star schema approach to support analytical and operational workloads
  • Collaborate with data analysts to develop advanced analytics solutions.
  • Conduct data quality analysis to drive better data management on outputs in our Curated Layer.
  • Mentor junior data engineers and provide technical guidance
  • Contribute to the development of data engineering best practices and standards
  • Collaborate with cross-functional teams to deliver complex data projects
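The star-schema modelling mentioned in the duties above amounts to resolving fact-table surrogate keys against dimension tables. A minimal sketch, with entirely hypothetical table contents and keys:

```python
# Sketch only: joining a fact row to its dimensions, star-schema style.
# Dimension and fact contents are hypothetical.

dim_customer = {101: {"name": "Acme Ltd", "region": "South"}}
dim_product = {7: {"sku": "GCP-TRAIN", "category": "Services"}}

fact_sales = [
    {"customer_key": 101, "product_key": 7, "amount": 1200.0},
]

def denormalise(fact, customers, products):
    """Resolve surrogate keys against dimension tables to produce analysis-ready rows."""
    out = []
    for row in fact:
        cust = customers[row["customer_key"]]
        prod = products[row["product_key"]]
        out.append({"customer": cust["name"], "region": cust["region"],
                    "sku": prod["sku"], "amount": row["amount"]})
    return out

rows = denormalise(fact_sales, dim_customer, dim_product)
```

Keeping descriptive attributes in the dimensions and only keys and measures in the fact table is what lets analytical queries stay narrow and fast.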