31 Cloud Computing jobs in Coimbatore
Cloud Computing Architect Position
Posted today
Job Description
AWS Platform Architect Role
About the Job:
We are seeking an experienced AWS Platform Architect to lead the design, development, and implementation of scalable and secure solutions on the AWS platform. The ideal candidate will bring deep expertise in Amazon Redshift architecture, data modelling, performance tuning, and analytics to support business intelligence and advanced analytics needs.
This role requires a strong understanding of cloud architecture, data warehousing, and big data technologies. The successful candidate will have experience with AWS services such as S3, Glue, Lambda, CloudWatch, and Athena.
The primary responsibility of this role is to design end-to-end data warehouse solutions on AWS Redshift. This includes establishing best practices for schema design, workload management, and query optimization.
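The schema-design practices mentioned above (distribution and sort keys) can be sketched as follows. This is a minimal illustration only: the table, columns, and key choices are invented for the example, not this employer's actual schema.

```python
def sales_fact_ddl(dist_key: str = "customer_id", sort_key: str = "sale_date") -> str:
    """Build a Redshift CREATE TABLE statement that co-locates rows for
    frequent joins (DISTKEY) and prunes date-range scans (SORTKEY)."""
    return (
        "CREATE TABLE sales_fact (\n"
        "    sale_id      BIGINT NOT NULL,\n"
        "    customer_id  BIGINT NOT NULL,\n"
        "    sale_date    DATE   NOT NULL,\n"
        "    amount       DECIMAL(12,2)\n"
        ")\n"
        f"DISTKEY ({dist_key})\n"
        f"SORTKEY ({sort_key});"
    )

print(sales_fact_ddl())
```

Choosing the join column as DISTKEY keeps matching rows on the same node, while a date SORTKEY lets Redshift skip blocks outside the queried range.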
In addition to designing and implementing scalable, reliable, and secure cloud architecture on AWS, the ideal candidate will also define cloud architecture standards, policies, and best practices.
This role involves developing and deploying AWS-based solutions, leveraging services such as Redshift, S3, RDS, and more. It also requires working with ETL tools and frameworks to automate and optimize data integration processes.
Key Performance Indicators (KPIs) include cost optimization, performance improvement, and security enhancement. The successful candidate will be responsible for monitoring and tuning Redshift clusters to improve query performance and resource utilization.
Furthermore, this role involves collaboration with development, DevOps, orchestration, and operations teams to integrate AWS services into the overall IT ecosystem. Guiding and mentoring junior engineers in Redshift and AWS technologies is also an essential part of this role.
Another key aspect of this position is ensuring compliance with data governance and industry standards. This involves designing security measures for Redshift, including IAM roles, encryption, and access control.
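The access-control measures listed above could be expressed as a least-privilege IAM policy document. A hedged sketch: the cluster ARN is a placeholder, and the action list (Redshift Data API query/result actions) is one plausible read path, not a policy taken from this posting.

```python
import json

def redshift_query_policy(cluster_arn: str) -> str:
    """Return an IAM policy document (JSON) scoped to a single Redshift
    cluster, granting only the Data API actions needed to run queries
    and fetch their results."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                # Query-and-fetch actions only; DDL/admin actions are omitted.
                "Action": [
                    "redshift-data:ExecuteStatement",
                    "redshift-data:DescribeStatement",
                    "redshift-data:GetStatementResult",
                ],
                "Resource": cluster_arn,
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(redshift_query_policy("arn:aws:redshift:us-east-1:123456789012:cluster:example"))
```

Scoping `Resource` to one cluster ARN, rather than `"*"`, is the core of the least-privilege pattern the posting alludes to.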
Maintenance tasks such as cluster resizing and version upgrades are also part of this role. Additionally, setting up and managing monitoring tools for cloud resources, such as AWS CloudWatch, is crucial for this position.
To excel in this role, one should possess strong problem-solving and analytical skills. Excellent communication and collaboration abilities are also essential. The ability to work in an agile, fast-paced environment is critical for success in this position.
Cloud Engineer
Hiring - MS Azure Fabric Engineer | Independent Consultant (WFH-Remote)
Greetings from Deltacubes Technology!
Skill: MS Azure Fabric
Experience: 6+ years
Thanks
Deena
Cloud Engineer
Contract Opportunity: Data Engineer (DBT, Databricks, Azure) – Remote
We’re hiring a skilled Data Engineer for a 6-month, fully remote contract to support our growing data team. You'll work on mission-critical data transformation pipelines using DBT, Databricks, and Azure.
What you’ll do:
- Design and build scalable, maintainable data models using DBT
- Develop efficient pipelines across cloud infrastructure
- Integrate and transform diverse data sources for analytics
- Collaborate with analysts, scientists, and QA teams to ensure data accuracy
What we’re looking for:
- Proven experience with Databricks and Spark
- Strong knowledge of Azure data services
- Familiarity with Git, version control, and CI/CD for data workflows
- Solid problem-solving skills and attention to data integrity
Location: Fully Remote (Preference for India-based candidates)
Duration: 6 months rolling
Azure Cloud Engineer
A global manufacturing company is seeking an Azure Infrastructure Engineer to join its enterprise IT organization. As a cloud engineer, this individual will support ongoing and new initiatives on an existing cloud platform. The ideal candidate will have expertise in Azure engineering and be well-versed in SAP Basis infrastructure engineering. This individual should also have strong communication skills and previous experience supporting an enterprise organization with cross-functional teams. This is an immediate need, based in India, and will be a 6-month contract to start, with extensions possible.
REQUIRED SKILLS AND EXPERIENCE
- 7-10 years of experience in Azure Cloud Engineering
- Hands-on engineering experience with SAP Basis (infrastructure)
- PowerShell scripting skills
- Experience working in an enterprise, global organization, supporting enterprise projects
- Excellent communication skills
Compensation: $15/hr to $20/hr.
Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
Cloud Engineer-GCP
Key Requirements
Technical Skills
- Expert in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, and Cloud Functions. GCP Professional Data Engineer Certification is highly favourable.
- Advanced knowledge of SQL for complex data transformation and query optimization.
- Proven experience in Python for scalable data pipeline development and orchestration following best practices.
- Experience implementing Terraform for Infrastructure as Code (IaC) to automate GCP resource management.
- Knowledge of CI/CD pipelines and automated deployment practices.
- Experience with containerization technologies (e.g., Docker, Kubernetes)
- Experience building and optimizing batch and streaming data pipelines.
- Understanding of data governance principles, GCP security (IAM, VPC), and compliance requirements.
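The Infrastructure as Code requirement above can be illustrated with a small Terraform fragment that provisions a GCP resource. This is a sketch under assumptions: the project ID, region, and bucket name are placeholders, not values from this posting.

```hcl
# Minimal IaC sketch: a GCS staging bucket for data pipelines.
provider "google" {
  project = "example-data-project"   # placeholder project ID
  region  = "europe-west2"
}

resource "google_storage_bucket" "staging" {
  name                        = "example-pipeline-staging"  # must be globally unique
  location                    = "EUROPE-WEST2"
  uniform_bucket_level_access = true

  # Auto-delete staging objects after 30 days to control storage cost.
  lifecycle_rule {
    condition {
      age = 30
    }
    action {
      type = "Delete"
    }
  }
}
```

Declaring the bucket in Terraform rather than creating it in the console gives the versioned, reviewable resource management the posting asks for.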
Soft Skills
- Demonstrates a growth mindset by actively seeking to learn from peers and stakeholders, fostering a culture of open communication and shared knowledge.
- Works effectively across teams, including Data Science, Engineering, and Analytics, to understand their needs and deliver impactful data solutions.
- Actively participates in design discussions, brainstorming sessions, and cross-functional projects, always striving for continuous improvement and innovation.
- Builds strong relationships across the organization, using empathy and active listening to ensure alignment on goals and deliverables.
- Approaches challenges with a growth mindset, viewing obstacles as opportunities to innovate and improve processes.
- Applies a structured and analytical approach to solving complex problems, balancing immediate needs with long-term scalability and efficiency.
- Demonstrates resilience under pressure, maintaining a positive and solution-focused attitude when faced with tight deadlines or ambiguity.
- Actively seeks feedback and lessons learned from past projects to continuously refine problem-solving strategies and improve outcomes.
- Shares expertise generously, guiding team members in adopting best practices and helping them overcome technical challenges.
- Leads by example, demonstrating how to approach complex problems pragmatically while promoting curiosity and a willingness to explore new tools and technologies.
- Encourages professional development within the team, supporting individuals in achieving their career goals and obtaining certifications, especially within the Google Cloud ecosystem.
Main duties and responsibilities
- Design, develop, and maintain scalable data pipelines using modern data engineering tools and technologies on our GCP stack.
- Build and optimize our lakehouse on Google Cloud Platform (GCP)
- Implement data ingestion, transformation, and loading processes for various data sources (e.g., databases, APIs, cloud storage)
- Ensure data quality, consistency, and security throughout the data pipeline
- Leverage GCP services (e.g., Dataflow, Dataproc, BigQuery, Cloud Storage) to build and maintain cloud-native data solutions
- Implement infrastructure as code (IaC) principles using Terraform to automate provisioning and configuration
- Manage and optimize cloud resources to ensure cost-efficiency and performance
- Design and implement efficient data models following a star schema approach to support analytical and operational workloads
- Collaborate with data analysts to develop advanced analytics solutions.
- Conduct data quality analysis to drive better data management on outputs in our Curated Layer.
- Mentor junior data engineers and provide technical guidance
- Contribute to the development of data engineering best practices and standards
- Collaborate with cross-functional teams to deliver complex data projects
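The star-schema modelling duty above can be shown with a toy fact/dimension join. The tables, columns, and values here are invented purely for illustration; a real implementation would live in BigQuery SQL rather than Python.

```python
# Toy star schema: one fact table keyed to one dimension table.
dim_product = {
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "gizmo",  "category": "hardware"},
}

fact_sales = [
    {"product_id": 1, "amount": 10.0},
    {"product_id": 2, "amount": 5.5},
    {"product_id": 1, "amount": 7.5},
]

def revenue_by_category(facts, dim):
    """Join the fact table to the product dimension and aggregate revenue
    per category -- the typical query pattern a star schema optimizes."""
    totals = {}
    for row in facts:
        category = dim[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(revenue_by_category(fact_sales, dim_product))  # → {'hardware': 23.0}
```

Keeping measures in a narrow fact table and descriptive attributes in dimensions is what makes these group-by-attribute queries cheap at scale.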