Python, AWS
Posted today
Job Description
**Skill: Python, AWS**
**Experience: 6 to 13 years**
**Location: AIA Kochi**
We are seeking a highly skilled Sr. Developer with 6 to 13 years of experience to join our team. The ideal candidate will have expertise in Python, ETL, Oracle, Snowflake Streams, Snowflake Tasks, Snowflake SQL, and Snowflake. Additionally, experience in the Property & Casualty Insurance domain is mandatory. This is a hybrid work model with day shifts and no travel requirements.
**Responsibilities**
As a Data Engineer, you'll be responsible for acquiring, curating, and publishing data for analytical or operational uses. Data should be in a ready-to-use form that creates a single version of the truth across all data consumers, including business users, data scientists, and Technology. Ready-to-use data can serve both real-time and batch data processes and may include unstructured data. Successful data engineers have the skills typically required for full-lifecycle software engineering: translating requirements into design, development, testing, deployment, and production maintenance tasks. You'll have the opportunity to work with a variety of technologies, from big data, relational, and SQL databases to unstructured data technology and programming languages.
+ Develop and maintain ETL processes to ensure data is accurately and efficiently loaded into the data warehouse.
+ Design and implement solutions using Python to automate data processing tasks.
+ Utilize Oracle databases to manage and retrieve data as needed.
+ Implement and manage Snowflake Streams to capture and process changes in real time.
+ Create and manage Snowflake Tasks to automate and schedule data processing workflows (see the sketch after this list).
+ Write and optimize Snowflake SQL queries to ensure efficient data retrieval and manipulation.
+ Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
+ Ensure data integrity and quality by implementing robust validation and error-handling mechanisms.
+ Provide technical support and troubleshooting for data-related issues.
+ Stay updated with the latest industry trends and technologies to continuously improve the data processing framework.
+ Document technical specifications and processes to ensure knowledge sharing and continuity.
+ Conduct code reviews and provide constructive feedback to team members.
+ Mentor junior developers and share best practices to foster a collaborative learning environment.
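For illustration, here is a minimal sketch of the Snowflake Streams and Tasks pattern referenced in the responsibilities above, using the snowflake-connector-python package. The connection parameters, table names (`policy_raw`, `policy_curated`), and warehouse are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: capture changes with a Snowflake Stream and apply them
# on a schedule with a Snowflake Task. All object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # placeholder credentials
    user="your_user",
    password="your_password",
    warehouse="ETL_WH",
    database="INSURANCE_DB",
    schema="STAGING",
)
cur = conn.cursor()

# The stream records inserts/updates/deletes on the raw table since last consumption.
cur.execute("""
    CREATE STREAM IF NOT EXISTS policy_raw_stream
    ON TABLE policy_raw
""")

# The task runs every 15 minutes and merges pending stream rows into the curated table.
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_policy_changes
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('policy_raw_stream')
    AS
      MERGE INTO policy_curated c
      USING policy_raw_stream s ON c.policy_id = s.policy_id
      WHEN MATCHED THEN UPDATE SET c.premium = s.premium
      WHEN NOT MATCHED THEN INSERT (policy_id, premium)
        VALUES (s.policy_id, s.premium)
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK merge_policy_changes RESUME")
conn.close()
```

Because the task fires only when the stream has pending data, the curated table stays current without full reloads.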
**Qualifications**
+ Must have strong experience in Python for data processing and automation.
+ Must have extensive experience with ETL processes and tools.
+ Must have hands-on experience with Oracle databases.
+ Must be proficient in Snowflake Streams, Snowflake Tasks, and Snowflake SQL.
+ Must have domain experience in Property & Casualty Insurance.
+ Nice to have experience with other data warehousing solutions.
+ Nice to have experience with cloud platforms and services.
+ Must have excellent problem-solving and analytical skills.
+ Must have strong communication and collaboration skills.
+ Must be able to work independently and as part of a team.
+ Must be detail-oriented and committed to delivering high-quality work.
+ Must be adaptable and open to learning new technologies and methodologies.
+ Must have a proactive approach to identifying and addressing potential issues.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
PySpark, AWS
Posted today
Job Description
We are seeking a highly skilled Sr. Developer with 8 to 12 years of experience in Big Data and AWS technologies. The ideal candidate will work in a hybrid model, utilizing their expertise in AWS EC2, AWS EMR, Amazon S3, Apache Spark, and Python to drive innovative solutions. This role offers the opportunity to make a significant impact on our projects and contribute to the company's success.
**Responsibilities**
+ Develop and implement scalable big data solutions using AWS EC2, AWS EMR, and Amazon S3 to enhance data processing capabilities.
+ Collaborate with cross-functional teams to design and optimize data pipelines, ensuring efficient data flow and storage.
+ Utilize Apache Spark to perform complex data transformations and analytics, delivering actionable insights for business decisions (see the sketch after this list).
+ Write and maintain high-quality Python code to automate data processing tasks and improve system performance.
+ Monitor and troubleshoot data processing workflows, ensuring reliability and accuracy of data outputs.
+ Participate in code reviews and provide constructive feedback to peers, fostering a culture of continuous improvement.
+ Stay updated with the latest industry trends and technologies, applying new knowledge to enhance existing systems.
+ Ensure data security and compliance with industry standards, protecting sensitive information and maintaining trust.
+ Contribute to the development of best practices and standards for big data processing and cloud computing.
+ Provide technical guidance and support to junior developers, sharing expertise and promoting skill development.
+ Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
+ Optimize resource utilization on AWS platforms, reducing costs and improving system efficiency.
+ Document technical specifications and project progress, ensuring clear communication and alignment with project goals.
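As a hedged illustration of the Spark-on-AWS work above, the sketch below reads raw JSON events from Amazon S3, aggregates them, and writes partitioned Parquet back to S3. The bucket paths and column names are hypothetical; on AWS EMR, S3 access via `s3://` URIs is available out of the box.

```python
# Minimal PySpark sketch: S3 in, transform/aggregate, partitioned Parquet out.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

# Read raw JSON events from S3 (EMR handles s3:// URIs natively).
events = spark.read.json("s3://example-raw-bucket/events/")

# Keep valid records and derive a date column for partitioning.
daily = (
    events
    .filter(F.col("amount") > 0)
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "customer_id")
    .agg(F.sum("amount").alias("daily_amount"))
)

# Write back as Parquet, partitioned by date for efficient downstream queries.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/daily_amounts/"
)
spark.stop()
```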
**Qualifications**
+ Possess a strong understanding of big data technologies and cloud computing, particularly AWS services.
+ Demonstrate proficiency in Apache Spark and Python, with a proven track record of successful project implementations.
+ Exhibit excellent problem-solving skills and the ability to work independently in a hybrid work model.
+ Show experience in designing and optimizing data pipelines for large-scale data processing.
+ Have a keen eye for detail and a commitment to delivering high-quality results.
+ Display effective communication skills, both written and verbal, to collaborate with diverse teams.
**Certifications Required**
AWS Certified Big Data - Specialty, Apache Spark Developer Certification
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
AWS Neptune
Posted 3 days ago
Job Description
Technical/Functional Skills:
Proficiency: AWS Neptune Database, AWS Neptune Analytics, S3, Snowflake, Graph database, AWS Cloud
Roles & Responsibilities
- 6+ years of experience
- Identify and select the data sources, tables, and relationships between the different entities in Snowflake.
- Establish a secure connection between Snowflake and AWS Neptune via the AWS service stack, securing the extracted data stored in S3 for transformation before loading into AWS Neptune (see the loader sketch after this list).
- Collaborate with the current platform team to understand the data structures and implement data extraction processes from Snowflake.
- Load the extracted data into AWS S3 for graph model transformation.
- Analyze the data, relationships, and entity mappings to determine the necessary graph schema.
- Design nodes, edges, and properties for the graph schema using the entity definitions.
- Implement the graph schema in AWS Neptune.
- Create indexes to improve query performance.
- Review and refine the schema based on query patterns and performance, per the mutually agreed design.
- Define the transformation logic using DBT.
- Develop DBT models for data transformation.
- Schedule and automate the ELT pipeline using DBT and Snowflake.
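A hedged sketch of the load step described above: once transformed vertex/edge CSVs are staged in S3, Neptune's bulk loader endpoint can ingest them. The cluster endpoint, bucket, and IAM role ARN below are hypothetical placeholders.

```python
# Minimal sketch: kick off a Neptune bulk load of Gremlin CSVs staged in S3.
# The endpoint, bucket, and role ARN are hypothetical placeholders.
import requests

NEPTUNE_LOADER = "https://my-neptune-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/loader"

payload = {
    "source": "s3://example-graph-bucket/transformed/",  # vertex/edge CSVs
    "format": "csv",                                     # Gremlin CSV load format
    "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",
    "region": "us-east-1",
    "failOnError": "TRUE",
}

# Neptune responds with a loadId that can be polled for status.
resp = requests.post(NEPTUNE_LOADER, json=payload, timeout=30)
resp.raise_for_status()
load_id = resp.json()["payload"]["loadId"]

# Poll the same endpoint for the progress of this load job.
status = requests.get(f"{NEPTUNE_LOADER}/{load_id}", timeout=30).json()
print(status["payload"]["overallStatus"]["status"])  # e.g. LOAD_COMPLETED
```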
AWS Neptune
Posted 3 days ago
Job Description
Greetings from TCS!
TCS is hiring for AWS Neptune
Required experience range: 6 to 10 years
Job location: Hyderabad
Functional Skills: AWS Neptune Database, AWS Neptune Analytics, S3, Snowflake, Graph database, AWS Cloud
Roles & responsibilities:
- Identify and select the data sources, tables, and relationships between the different entities in Snowflake.
- Establish a secure connection between Snowflake and AWS Neptune via the AWS service stack, securing the extracted data stored in S3 for transformation before loading into AWS Neptune.
- Collaborate with the current platform team to understand the data structures and implement data extraction processes from Snowflake.
- Load the extracted data into AWS S3 for graph model transformation.
- Analyse the data, relationships, and entity mappings to determine the necessary graph schema.
- Design nodes, edges, and properties for the graph schema using the entity definitions.
- Implement the graph schema in AWS Neptune.
- Create indexes to improve query performance.
- Review and refine the schema based on query patterns and performance, per the mutually agreed design.
- Define the transformation logic using DBT.
- Develop DBT models for data transformation.
- Schedule and automate the ELT pipeline using DBT and Snowflake (a scheduling sketch follows this list).
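For the DBT scheduling step above, one hedged option is a thin Python wrapper that invokes the dbt CLI (shown with a naive loop for illustration; in practice a Snowflake Task, cron, or an orchestrator would do the triggering). The model selector and profiles path are hypothetical.

```python
# Minimal sketch: run dbt models for the Snowflake ELT on a schedule.
# Assumes a dbt project with a "graph_staging" selector; names are hypothetical.
import subprocess
import time

def run_dbt() -> None:
    # Invoke the dbt CLI; --select limits the run to the graph transformation models.
    result = subprocess.run(
        ["dbt", "run", "--select", "graph_staging+", "--profiles-dir", "."],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    result.check_returncode()  # raise if any model failed

if __name__ == "__main__":
    while True:              # naive scheduler, for illustration only
        run_dbt()
        time.sleep(60 * 60)  # hourly
```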
Thank you
Bodhisatwa Ray
AWS Architect
Posted today
Job Description
Core responsibilities
- Design cloud architecture: Create and design resilient, high-performing, and cost-optimized cloud solutions that meet a company's business and technical requirements.
- Develop and deploy solutions: Oversee the implementation and deployment of new cloud infrastructure and applications using AWS services.
- Migrate systems: Strategize and execute the migration of existing on-premises systems and applications to the AWS cloud.
- Advise on best practices: Provide guidance on the use of AWS services, ensuring deployments follow the AWS Well-Architected Framework and security standards.
- Collaborate across teams: Work with development, operations (DevOps), and security teams to ensure smooth integration and deployment of cloud solutions.
- Communicate with stakeholders: Translate complex technical concepts into business terms for C-level executives, product owners, and other non-technical stakeholders.
- Optimize cloud infrastructure: Monitor resource usage and implement strategies for cost savings, performance tuning, and operational efficiency.
- Stay current on AWS: Continuously learn about new AWS services, features, and evolving best practices to improve cloud infrastructure.
Essential qualifications
- Education: A bachelor's degree in computer science, information technology, or a related field is common, but not always required.
- Technical expertise: Hands-on experience with server-based and serverless AWS services (e.g., EC2, S3, Lambda, RDS, DynamoDB), networking, databases, and scripting languages like Python.
- Certifications: Professional certifications, such as the AWS Certified Solutions Architect - Associate and Professional, are highly preferred.
- Analytical skills: Ability to analyze business requirements and break down complex problems to design workable, creative solutions.
- Communication skills: Excellent written and verbal communication to effectively collaborate with various technical and non-technical stakeholders.
- Problem-solving aptitude: Strong critical thinking skills to troubleshoot issues and devise effective solutions for clients.
Career path for an AWS Architect
The role of an AWS Architect is not an entry-level position and typically requires years of hands-on experience in IT.
Senior-level and leadership (12+ years of experience)
- Senior/Principal Solutions Architect: Lead strategic initiatives for major projects, mentor junior architects, and influence high-level architectural decisions.
- Enterprise Cloud Architect: Work at a higher level to align the company's overall business strategy with its cloud technology direction.
- Management: Move into a leadership role, such as Cloud Architect Manager or Director of Cloud Solutions, overseeing a team of architects.
If interested, kindly share your resume at the email ID mentioned below.
AWS Engineer
Posted 6 days ago
Job Description
Role: Developer
Required Technical Skill Set: AWS & Python
Desired Experience Range: 6-10 years
Location of Requirement: Hyderabad
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Python
· Open-source technologies and Amazon AWS
· Hosting on Amazon AWS
· Skills: Git, GitLab, GitLab CI, PyCharm, Conda, Vagrant, VirtualBox, Ansible, Docker, Docker Swarm, DockerHub; AWS S3, CloudFront, EC2, KMS, EBS, EFS, RDS, SNS, SES, ELB, Route53; MariaDB, MongoDB, InfluxDB, Elasticsearch, Elastic Cloud, MongoEngine, SQLAlchemy; OAuth2, OpenID Connect, Okta, SSO; RESTful web services
Good-to-Have
· New Relic, Fluent
· Nginx, Flask, uWSGI, Swagger, Celery, pytest
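As a loose illustration of this stack, here is a minimal RESTful Flask service of the kind implied above. The endpoints and payload shapes are hypothetical, and in production such a service would typically run under uWSGI behind Nginx.

```python
# Minimal sketch of a RESTful Flask service from the stack above.
# Endpoints and payload shapes are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for MariaDB/MongoDB in this sketch.
ITEMS: dict[int, dict] = {}

@app.route("/items", methods=["POST"])
def create_item():
    item = request.get_json()
    item_id = len(ITEMS) + 1
    ITEMS[item_id] = item
    return jsonify({"id": item_id, **item}), 201

@app.route("/items/<int:item_id>", methods=["GET"])
def get_item(item_id: int):
    if item_id not in ITEMS:
        return jsonify({"error": "not found"}), 404
    return jsonify(ITEMS[item_id])

if __name__ == "__main__":
    # Dev server only; production would use uWSGI behind Nginx.
    app.run(debug=True)
```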
Java AWS
Posted 6 days ago
Job Description
Job Title: Java + AWS
Location: Hyderabad, Noida
Experience: 5-8 years
Notice Period: Only immediate joiners preferred
Job Description:
- Backend developer with 5-7 years' experience with core Java, Spring Boot, REST APIs & microservices.
- Experience in developing, packaging, configuring, deploying, operating, and maintaining microservices written in Java in a cloud-native environment (AWS Cloud: EKS, EC2, S3).
- Understanding of software engineering concepts; responsible for working on full-lifecycle engineering efforts using Agile methodologies, object-oriented design, and accepted design patterns and practices.
- Experience working with Java and RESTful microservices, strong application development skills, and the ability to solve complex application and platform problems. Understanding of Apache and load balancer configurations.
- Understanding of DevSecOps: CI/CD pipelines and security tools (Veracode/Black Duck/Sonar/Xray).
- Able to write stored procedures in Oracle DB and optimize queries.
- Able to troubleshoot issues across environments. Good experience with monitoring tools and content management.
AWS Data Engineer
Posted today
Job Description
**Skill:** AWS Data Engineer
**Experience**: 6 to 13 years
**Location** : AIA Kochi
We are seeking a highly skilled Sr. Support Analyst with 6 to 13 years of experience to join our team. The ideal candidate will have expertise in AWS Lambda, AWS IAM, Apache Spark, Python, DB Performance Optimization, Data Modelling, Amazon S3, and TSQL. Experience in Property & Casualty Insurance is a plus. This is a hybrid role with rotational shifts and no travel required. The candidate should have strong experience supporting existing environments built on the above technologies.
**Responsibilities**
+ Develop and maintain scalable applications using AWS Lambda and other AWS services (see the sketch after this list).
+ Implement and manage AWS IAM policies to ensure secure access control.
+ Provide support for the already developed framework and ensure continuity of applications during support work.
+ Work with the customer and onshore/offshore team members to resolve failures and arrive at permanent fixes.
+ Oversee the development and maintenance of data pipelines and workflows.
+ Provide automation and improvement recommendations, and track implementation to closure.
+ Prepare timelines for L3 development and track them to closure.
+ Utilize Apache Spark for large-scale data processing and analytics.
+ Write efficient and maintainable code in Python for various applications.
+ Optimize database performance to ensure fast and reliable data retrieval.
+ Design and implement data models to support business requirements.
+ Manage and maintain data storage solutions using Amazon S3.
+ Write complex TSQL queries for data manipulation and reporting.
+ Collaborate with cross-functional teams to gather and analyze requirements.
+ Provide technical guidance and mentorship to junior developers.
+ Ensure code quality through code reviews and automated testing.
+ Troubleshoot and resolve issues in a timely manner.
+ Stay updated with the latest industry trends and technologies.
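To illustrate the Lambda and S3 work listed above, here is a minimal, hedged sketch of a handler triggered by S3 object-created events. The event wiring, bucket, and field names are assumptions for illustration, not details from this posting.

```python
# Minimal AWS Lambda sketch: triggered by S3 object-created events,
# inspects each new object and logs basic metadata. Names are hypothetical.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # S3 put-event notifications arrive as a list of records.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Check the object's metadata before any downstream processing.
        head = s3.head_object(Bucket=bucket, Key=key)
        print(json.dumps({
            "bucket": bucket,
            "key": key,
            "size_bytes": head["ContentLength"],
        }))
    return {"status": "ok"}
```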
**Qualifications**
+ Must have 8 to 10 years of experience in software development.
+ Must have strong experience with AWS Lambda and AWS IAM.
+ Must have expertise in Apache Spark and Python.
+ Must have experience in DB Performance Optimization and Data Modelling.
+ Must have experience with Amazon S3 and TSQL.
+ Nice to have experience in Property & Casualty Insurance.
+ Must be able to work in a hybrid model with rotational shifts.
+ Must have excellent problem-solving and analytical skills.
+ Must have strong communication and collaboration skills.
+ Must be able to work independently and as part of a team.
+ Must be detail-oriented and able to manage multiple tasks.
+ Must be committed to continuous learning and improvement.
+ Must have a proactive and positive attitude.
**Certifications Required**
AWS Certified Developer, Apache Spark Certification, Python Certification
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
AWS Cloud Engineer
Posted today
Job Description
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission-to serve patients living with serious illnesses-drives all that we do.
Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives.
Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
AWS Cloud Engineer
**What you will do**
The **AWS Cloud Engineer** will be responsible for maintaining scalable, secure, and reliable AWS cloud infrastructure. This is a **hands-on** engineering role requiring deep expertise in **Infrastructure as Code (IaC), automation, cloud networking, and security**. The ideal candidate should have **strong AWS knowledge** and be capable of writing and maintaining **Terraform, CloudFormation, and CI/CD pipelines** to streamline cloud deployments.
**AWS Infrastructure Design & Implementation**
+ Implement and manage **highly available AWS cloud environments**.
+ Maintain **VPCs, Subnets, Security Groups, and IAM policies** to enforce security best practices.
+ Optimize AWS costs using **reserved instances, savings plans, and auto-scaling**.
**Infrastructure as Code (IaC) & Automation**
+ Maintain and enhance Terraform & CloudFormation templates for cloud provisioning.
+ Automate deployment, scaling, and monitoring using AWS-native tools & scripting (a boto3 sketch follows this section).
+ Implement and manage CI/CD pipelines for infrastructure and application deployments.
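As a hedged sketch of the AWS-native automation referenced above, the snippet below deploys a CloudFormation stack with boto3 and waits for completion. The stack name and template file are hypothetical; in this role, Terraform would cover similar ground.

```python
# Minimal sketch: provision infrastructure by deploying a CloudFormation
# stack via boto3 and waiting for it to finish. Names are hypothetical.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("network-stack.yaml") as f:       # hypothetical template file
    template_body = f.read()

cfn.create_stack(
    StackName="example-network-stack",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # needed when the template creates IAM resources
)

# Block until the stack reaches CREATE_COMPLETE (raises on failure/rollback).
waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName="example-network-stack")
print("Stack created")
```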
**Cloud Security & Compliance**
+ Enforce best practices in IAM, encryption, and network security.
+ Ensure compliance with SOC2, ISO27001, and NIST standards.
+ Implement AWS Security Hub, GuardDuty, and WAF for threat detection and response.
**Monitoring & Performance Optimization**
+ Set up AWS CloudWatch, Prometheus, Grafana, and logging solutions for proactive monitoring (see the alarm sketch after this list).
+ Implement autoscaling, load balancing, and caching strategies for performance optimization.
+ Troubleshoot cloud infrastructure issues and conduct root cause analysis.
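A minimal, hedged example of the proactive monitoring described above: creating a CloudWatch CPU alarm with boto3. The instance ID and SNS topic ARN are hypothetical placeholders.

```python
# Minimal sketch: CPU alarm on a hypothetical EC2 instance via boto3.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="example-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                # 5-minute evaluation windows
    EvaluationPeriods=2,       # alarm after two consecutive breaches
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # hypothetical SNS topic
)
```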
**Collaboration & DevOps Practices**
+ Work closely with software engineers, SREs, and DevOps teams to support deployments.
+ Maintain GitOps standard processes for cloud infrastructure versioning.
+ Support on-call rotation for high-priority cloud incidents.
**What we expect of you**
We are all different, yet we all use our unique contributions to serve patients.
**Basic Qualifications:**
+ Master's degree and 1 to 3 years of computer science, IT, or related field experience OR
+ Bachelor's degree and 3 to 5 years of computer science, IT, or related field experience OR
+ Diploma and 7 to 9 years of computer science, IT, or related field experience
+ **Hands-on experience with AWS (EC2, S3, RDS, Lambda, VPC, IAM, ECS/EKS, API Gateway, etc.)**.
+ **Expertise in Terraform & CloudFormation** for AWS infrastructure automation.
+ Strong knowledge of **AWS networking (VPC, Direct Connect, Transit Gateway, VPN, Route 53)**.
+ Experience with **Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, CodePipeline, etc.)**.
+ Troubleshooting and debugging skills in **cloud networking, storage, and security**.
**Preferred Qualifications:**
+ Experience with **Kubernetes (EKS) and service mesh architectures**.
+ Knowledge of **AWS Lambda and event-driven architectures**.
+ Familiarity with **AWS CDK, Ansible, or Packer** for cloud automation.
+ Exposure to **multi-cloud environments (Azure, GCP)**.
+ Familiarity with **HPC, DGX Cloud**.
**Professional Certifications (preferred):**
+ AWS Certified Solutions Architect - Associate or Professional
+ AWS Certified DevOps Engineer - Professional
**Soft Skills:**
+ Strong analytical and problem-solving skills.
+ Ability to work effectively with global, virtual teams.
+ Effective communication and collaboration with multi-functional teams.
+ Ability to work in a fast-paced, cloud-first environment.
**Shift Information:** This position is required to be onsite and to participate in a 24/5 and weekend on-call rotation, which may require working a later shift. Candidates must be willing and able to work off hours as required, based on business requirements.
**What you can expect of us**
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
**Apply now**
**for a career that defies imagination**
Objects in your future are closer than they appear. Join us.
**careers.amgen.com**
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
AWS DevOps Lead
Posted 3 days ago
Job Description
About Cognida.ai
Our Purpose is to boost your competitive advantage using AI and Analytics.
We Deliver tangible business impact with data-driven insights powered by AI. Drive revenue growth, increase profitability and improve operational efficiencies.
We Are technologists with keen business acumen - Forever curious, always on the front lines of technological advancements. Applying our latest learnings, and tools to solve your everyday business challenges.
We Believe the power of AI should not be the exclusive preserve of the few. Every business, regardless of its size or sector deserves the opportunity to harness the power of AI to make better decisions and drive business value.
We See a world where our AI and Analytics solutions democratise decision intelligence for all businesses. With Cognida.ai, our motto is ‘No enterprise left behind’.
Qualifications
Bachelor's degree (BE/B.Tech)
Minimum 15 years of experience
Position: AWS DevOps Lead/Architect
Location: Hyderabad/Hybrid
Job Description:
- 15+ years of overall IT experience with a strong background in DevOps/SRE roles.
- Proven expertise with Terraform for Infrastructure as Code (modular, reusable, and production-grade deployments).
- Strong hands-on experience with AWS services (networking, compute, storage, databases, and security).
- Experience with multi-cloud or hybrid cloud environments.
- Solid knowledge of CI/CD pipelines with GitHub Actions (GHA).
- Proficiency in scripting (Python, Bash, Shell, or Go) for automation.
- Experience with containerization and orchestration (Docker, Kubernetes, ECS/EKS).
- Familiarity with observability tools (ELK, Datadog, CloudWatch).
- Strong understanding of networking (DNS, Load Balancers, VPNs, VPCs, security groups).
- Experience implementing DevSecOps practices and integrating security checks into CI/CD.
- Exposure to configuration management tools (e.g., Ansible, Chef, Puppet).
- Knowledge of cost optimization strategies on AWS (an example check follows this list).
- Certifications (AWS Solutions Architect Professional, Terraform Associate, CKA/CKAD).
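As a small, hedged example of the scripting and cost-optimization skills listed above: a boto3 check for unattached EBS volumes, a common source of avoidable AWS spend. The region is an assumption for illustration.

```python
# Minimal sketch: find unattached EBS volumes (a common AWS cost-optimization
# check) using boto3. The region is a hypothetical choice.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# "available" status means the volume is not attached to any instance.
paginator = ec2.get_paginator("describe_volumes")
for page in paginator.paginate(
    Filters=[{"Name": "status", "Values": ["available"]}]
):
    for vol in page["Volumes"]:
        print(f"{vol['VolumeId']}: {vol['Size']} GiB, created {vol['CreateTime']}")
```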