AWS Cloud Infrastructure Engineer
Posted today
Job Description
• The candidate should have prior experience with AWS and Azure and have done development on serverless infrastructure.
• The State of NJ is seeking an AWS Cloud Engineer who will assist in maintaining the current AWS Cloud serverless infrastructure, as well as build, or assist in building, new serverless infrastructure (a minimal serverless sketch follows this list).
• Experience with additional cloud-based tools is considered important (see the skills section).
• Additional desired skills include experience with DevOps, AWS DevOps, and AWS CodePipeline, although none of these are required.
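For context on what such serverless work typically looks like, here is a minimal sketch of a Python AWS Lambda handler behind an API Gateway proxy integration; the greeting logic and parameter names are illustrative assumptions, not part of this posting.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler (hypothetical example).

    'event' carries the trigger payload (here, an API Gateway proxy request);
    'context' carries runtime metadata such as remaining execution time.
    """
    # Hypothetical query parameter; defaults to "world" when absent.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```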
PySpark, AWS

Posted today
Job Description
We are seeking a highly skilled Sr. Developer with 8 to 12 years of experience in Big Data and AWS technologies. The ideal candidate will work in a hybrid model, utilizing their expertise in AWS EC2, AWS EMR, Amazon S3, Apache Spark, and Python to drive innovative solutions. This role offers the opportunity to make a significant impact on our projects and contribute to the company's success.
**Responsibilities**
+ Develop and implement scalable big data solutions using AWS EC2, AWS EMR, and Amazon S3 to enhance data processing capabilities.
+ Collaborate with cross-functional teams to design and optimize data pipelines, ensuring efficient data flow and storage.
+ Utilize Apache Spark to perform complex data transformations and analytics, delivering actionable insights for business decisions (see the sketch after this list).
+ Write and maintain high-quality Python code to automate data processing tasks and improve system performance.
+ Monitor and troubleshoot data processing workflows, ensuring reliability and accuracy of data outputs.
+ Participate in code reviews and provide constructive feedback to peers, fostering a culture of continuous improvement.
+ Stay updated with the latest industry trends and technologies, applying new knowledge to enhance existing systems.
+ Ensure data security and compliance with industry standards, protecting sensitive information and maintaining trust.
+ Contribute to the development of best practices and standards for big data processing and cloud computing.
+ Provide technical guidance and support to junior developers, sharing expertise and promoting skill development.
+ Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
+ Optimize resource utilization on AWS platforms, reducing costs and improving system efficiency.
+ Document technical specifications and project progress, ensuring clear communication and alignment with project goals.
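To illustrate the Spark-on-EMR work described above, here is a hedged PySpark sketch of a simple S3-to-S3 aggregation job; the bucket, paths, and column names (order_ts, amount, customer_id) are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On EMR, the Spark session is preconfigured with S3 access via EMRFS.
spark = SparkSession.builder.appName("daily-orders-rollup").getOrCreate()

# Hypothetical input: Parquet order events landed in S3 by an upstream feed.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Aggregate revenue per customer per day, a typical pipeline transformation.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Write results back to S3, partitioned by date for efficient downstream reads.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)
```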
**Qualifications**
+ Possess a strong understanding of big data technologies and cloud computing, particularly AWS services.
+ Demonstrate proficiency in Apache Spark and Python, with a proven track record of successful project implementations.
+ Exhibit excellent problem-solving skills and the ability to work independently in a hybrid work model.
+ Show experience in designing and optimizing data pipelines for large-scale data processing.
+ Have a keen eye for detail and a commitment to delivering high-quality results.
+ Display effective communication skills, both written and verbal, to collaborate with diverse teams.
**Certifications Required**
AWS Certified Big Data – Specialty; Apache Spark Developer Certification
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
AWS Services

Posted today
Job Description
We are seeking a highly skilled Sr. Support Analyst with 4 to 11 years of experience to join our team. The ideal candidate will have expertise in AWS Lambda, AWS IAM, Apache Spark, Python, DB performance optimization, data modelling, Amazon S3, and TSQL. Experience in Property & Casualty Insurance is a plus. This is a hybrid role with rotational shifts and no travel required. The candidate should have strong experience supporting existing environments built on the above technologies.
**Responsibilities**
+ Develop and maintain scalable applications using AWS Lambda and other AWS services.
+ Implement and manage AWS IAM policies to ensure secure access control (a minimal policy sketch follows this list).
+ Provide support for the already-developed framework and ensure continuity of applications during support work.
+ Work with the customer and on/offshore team members to resolve failures and arrive at permanent fixes.
+ Oversee the development and maintenance of data pipelines and workflows.
+ Provide automation and improvement recommendations and track implementation to closure.
+ Prepare timelines for L3 development and track them to closure.
+ Utilize Apache Spark for large-scale data processing and analytics.
+ Write efficient and maintainable code in Python for various applications.
+ Optimize database performance to ensure fast and reliable data retrieval.
+ Design and implement data models to support business requirements.
+ Manage and maintain data storage solutions using Amazon S3.
+ Write complex TSQL queries for data manipulation and reporting.
+ Collaborate with cross-functional teams to gather and analyze requirements.
+ Provide technical guidance and mentorship to junior developers.
+ Ensure code quality through code reviews and automated testing.
+ Troubleshoot and resolve issues in a timely manner.
+ Stay updated with the latest industry trends and technologies.
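As a flavor of the IAM work mentioned above, the following is a minimal boto3 sketch that creates a least-privilege policy scoped to one S3 prefix; the bucket, prefix, and policy name are hypothetical.

```python
import json
import boto3

iam = boto3.client("iam")

# Hypothetical least-privilege policy: read-only access to a single S3 prefix.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-claims-bucket/reports/*",
        }
    ],
}

# create_policy registers the managed policy; the returned ARN can then be
# attached to a role or user with attach_role_policy / attach_user_policy.
response = iam.create_policy(
    PolicyName="example-reports-readonly",
    PolicyDocument=json.dumps(policy_document),
)
print(response["Policy"]["Arn"])
```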
**Qualifications**
+ Must have 8 to 10 years of experience in software development.
+ Must have strong experience with AWS Lambda and AWS IAM.
+ Must have expertise in Apache Spark and Python.
+ Must have experience in DB Performance Optimization and Data Modelling.
+ Must have experience with Amazon S3 and TSQL.
+ Nice to have experience in Property & Casualty Insurance.
+ Must be able to work in a hybrid model with rotational shifts.
+ Must have excellent problem-solving and analytical skills.
+ Must have strong communication and collaboration skills.
+ Must be able to work independently and as part of a team.
+ Must be detail-oriented and able to manage multiple tasks.
+ Must be committed to continuous learning and improvement.
+ Must have a proactive and positive attitude.
**Certifications Required**
AWS Certified Developer; Apache Spark Certification; Python Certification
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
AWS Neptune
Posted 1 day ago
Job Description
Technical/Functional Skills -
Proficiency: AWS Neptune Database, AWS Neptune Analytics, S3, Snowflake, Graph database, AWS Cloud
Roles & Responsibilities
- 6+ years of experience.
- Identify and select the data sources, tables, and relationships between different entities in Snowflake.
- Establish a secure connection between Snowflake and AWS Neptune via the AWS service stack, securing the extracted data stored in S3 for transformation before it is loaded into AWS Neptune.
- Collaborate with the current platform team to understand the data structure and implement data extraction processes from Snowflake.
- Load the extracted data into AWS S3 for graph model transformation (see the bulk-load sketch after this list).
- Analyze the data, relationships, and entity mappings to determine the necessary graph schema.
- Design nodes, edges, and properties for the graph schema using the entity definitions.
- Implement the graph schema in AWS Neptune.
- Create indexes to improve query performance.
- Review and refine the schema based on query patterns and performance, per the mutually agreed design.
- Define the transformation logic using DBT.
- Develop DBT models for data transformation.
- Schedule and automate the ELT pipeline using DBT and Snowflake.
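Here is a hedged sketch of the S3-to-Neptune load step, using Neptune's documented bulk loader HTTP endpoint; the cluster endpoint, bucket, and IAM role ARN are hypothetical.

```python
import requests

# Hypothetical Neptune cluster endpoint; /loader is Neptune's documented
# bulk-load API for data staged in S3 (Gremlin CSV node/edge files here).
NEPTUNE = "https://my-neptune.cluster-example.us-east-1.neptune.amazonaws.com:8182"

payload = {
    "source": "s3://example-bucket/graph/nodes/",  # CSV exported from Snowflake
    "format": "csv",
    "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",
    "region": "us-east-1",
    "failOnError": "FALSE",
}

resp = requests.post(f"{NEPTUNE}/loader", json=payload)
resp.raise_for_status()
# The loader returns a loadId; poll GET {NEPTUNE}/loader/{loadId} for status.
print(resp.json()["payload"]["loadId"])
```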
AWS Neptune
Posted 1 day ago
Job Description
Greetings from TCS!
TCS is hiring for AWS Neptune
Required experience range: 6 to 10 years
Job location: Hyderabad
Functional Skills: AWS Neptune Database, AWS Neptune Analytics, S3, Snowflake, Graph database, AWS Cloud
Roles & responsibilities:
- Identify and select the data sources, tables, and relationships between different entities in Snowflake.
- Establish a secure connection between Snowflake and AWS Neptune via the AWS service stack, securing the extracted data stored in S3 for transformation before it is loaded into AWS Neptune.
- Collaborate with the current platform team to understand the data structure and implement data extraction processes from Snowflake.
- Load the extracted data into AWS S3 for graph model transformation.
- Analyse the data, relationships, and entity mappings to determine the necessary graph schema.
- Design nodes, edges, and properties for the graph schema using the entity definitions.
- Implement the graph schema in AWS Neptune (see the query sketch after this list).
- Create indexes to improve query performance.
- Review and refine the schema based on query patterns and performance, per the mutually agreed design.
- Define the transformation logic using DBT.
- Develop DBT models for data transformation.
- Schedule and automate the ELT pipeline using DBT and Snowflake.
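Once the schema is in place, the graph can be queried from Python; below is a minimal sketch using the gremlinpython driver, where the endpoint, vertex labels (Customer, Account), and property names are hypothetical.

```python
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
from gremlin_python.process.anonymous_traversal import traversal

# Hypothetical Neptune Gremlin endpoint; Neptune serves Gremlin over
# WebSocket at wss://<cluster-endpoint>:8182/gremlin.
conn = DriverRemoteConnection(
    "wss://my-neptune.cluster-example.us-east-1.neptune.amazonaws.com:8182/gremlin",
    "g",
)
g = traversal().withRemote(conn)

# Hypothetical schema: Customer vertices linked to Account vertices by OWNS edges.
account_ids = (
    g.V().hasLabel("Customer")
    .out("OWNS").hasLabel("Account")
    .values("account_id")
    .limit(5)
    .toList()
)
print(account_ids)
conn.close()
```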
Thank you
Bodhisatwa Ray
AWS Engineer
Posted 3 days ago
Job Description
Role: Developer
Required Technical Skill Set: AWS & Python
Desired Experience Range: 6-10 years
Location of Requirement: Hyderabad
Desired Competencies (Technical/Behavioral Competency)
Must-Have:
· Python
· Open-source technologies and Amazon AWS
· Hosting on Amazon AWS
Skills:
· Git, GitLab, GitLab CI, PyCharm, Conda, Vagrant, VirtualBox, Ansible
· Docker, Docker Swarm, DockerHub
· AWS S3, CloudFront, EC2, KMS, EBS, EFS, RDS, SNS, SES, ELB, Route53
· MariaDB, MongoDB, InfluxDB, Elasticsearch, Elastic Cloud, MongoEngine, SQLAlchemy
· OAuth2, OpenID Connect, Okta, SSO
· RESTful web services
Good-to-Have:
· New Relic, Fluent
· Nginx, Flask, uWSGI, Swagger, Celery, pytest (a minimal Flask sketch follows below)
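As a small illustration of the Flask/REST portion of this stack, here is a minimal sketch of a RESTful endpoint; the resource model and routes are hypothetical, with an in-memory dict standing in for a real database.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for RDS/MariaDB.
ITEMS = {}

@app.route("/items/<item_id>", methods=["GET"])
def get_item(item_id):
    """Return one resource, or 404 if it does not exist."""
    item = ITEMS.get(item_id)
    if item is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(item)

@app.route("/items/<item_id>", methods=["PUT"])
def put_item(item_id):
    """Create or replace a resource from the JSON request body."""
    ITEMS[item_id] = request.get_json(force=True)
    return jsonify(ITEMS[item_id]), 201

if __name__ == "__main__":
    app.run(port=8080)
```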
Java AWS
Posted 3 days ago
Job Description
Job Title: Java + AWS
Location: Hyderabad, Noida
Experience: 5-8 years
Notice Period: Only Immediate Joiner preferred
Job Description:
- Backend developer with 5-7 years' experience with core Java, Spring Boot, REST APIs, and microservices.
- Experience in developing, packaging, configuring, deploying, operating, and maintaining microservices written in Java in a cloud-native environment (AWS Cloud: EKS, EC2, S3).
- Understanding of software engineering concepts; responsible for working on full-lifecycle engineering efforts using Agile methodologies, object-oriented design, and accepted design patterns and practices.
- Experience working with Java and RESTful microservices, application development skills, and the ability to solve complex application and platform problems. Understanding of Apache and load-balancer configurations.
- Understanding of DevSecOps: CI/CD pipelines and security tools (Veracode/Black Duck/Sonar/Xray).
- Able to write stored procedures in Oracle DB and optimize queries.
- Able to troubleshoot issues across environments. Good experience with monitoring tools and content management.