6,296 Python Professionals jobs in India
Python with Data Science
Posted 3 days ago
Job Description
Role: Data Science Engineer
Required Technical Skill Set: Data Science & AI lifecycle management tools (NumPy, pandas, TensorFlow, MinIO, Pachyderm, Kubeflow, Jupyter); Python, Java (Spring Boot), C
Desired Experience Range: 6+ years
Location of Requirement: Pan India
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· 4+ years of experience in Data Science, ML & DL.
· Well versed in building algorithms to predict network performance degradation.
· Expertise in Acumos AI and MLflow, platforms for managing and packaging the ML lifecycle.
· Microservices, Cloud native technologies, Serverless Computing
· Cloud computing, SDN, NFV
· Docker, Kubernetes, Redhat Openshift, Kubeflow
· Strong development skills including Linux, Shell, Python, Java, Scala
Good-to-Have
· Exposure to GCP, AWS and Azure services.
· Exposure to cloud-native tools: Docker, Kubernetes, Jaeger, Fluentd, Elasticsearch, Kibana, Istio, Harbor, Prometheus, Grafana, Ansible, Git, Jenkins, Gerrit, Knative
Role descriptions / Expectations from the Role
1. Data Engineer with hands-on software development: strong hands-on experience in designing and developing mission-critical, highly complex applications.
· Development of Multi-Processor Performance Benchmarking (MPPB), a solution to optimize the performance of various workloads (AI, NFV, database, and web servers) by tuning processor configurations.
· Development of the Microservices Development and Deployment Platform (MDDP), a platform that aids in the development and deployment of microservices with Kubernetes and OpenShift as underlying orchestrators.
· Development of Ascent, a framework for upgrading microservice-based applications in targeted regions using Istio and Knative.
· Develop machine learning models to predict network element (switch or router) failures in the near future from syslogs and network performance metrics.
· Develop a machine learning model, using TensorFlow in Python, to predict downlink throughput for a telecom client.
· Develop an algorithm for radio network optimization that suggests base-station configuration changes to the radio network engineer, based on patterns learned from several base stations in the network.
· Experience with Acumos AI and MLflow, platforms for managing and packaging the ML lifecycle.
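As an illustration of the syslog-based failure-prediction task described above, here is a minimal, hypothetical sketch (element names, keywords, and the threshold are invented) that extracts simple error-rate features from syslog lines and flags at-risk network elements:

```python
import re
from collections import Counter

# Hypothetical severity keywords; real syslogs carry RFC 5424 severity levels.
ERROR_PATTERN = re.compile(r"\b(ERR|ERROR|CRIT|ALERT)\b")

def error_rate_by_element(syslog_lines):
    """Return the fraction of error-level messages per network element."""
    totals, errors = Counter(), Counter()
    for line in syslog_lines:
        # Assume each line starts with the element hostname, e.g. "switch-01 ...".
        element = line.split()[0]
        totals[element] += 1
        if ERROR_PATTERN.search(line):
            errors[element] += 1
    return {e: errors[e] / totals[e] for e in totals}

def at_risk_elements(syslog_lines, threshold=0.3):
    """Flag elements whose error rate exceeds an (arbitrary) threshold."""
    rates = error_rate_by_element(syslog_lines)
    return sorted(e for e, r in rates.items() if r > threshold)
```

In practice, features like these would be combined with performance metrics and fed to a trained classifier rather than a fixed threshold.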
Python (Programming Language)
Posted 6 days ago
Job Description
Company Overview
Hour4u is a dynamic marketplace designed exclusively for gig workers to discover and engage in nearby job opportunities. By empowering local businesses with the ability to hire on-demand temporary staff with a single click, Hour4u creates a win-win scenario. Both local businesses and gig workers benefit, as businesses can hire according to their needs and gig workers can work on their terms. Based in Pune, Maharashtra, and part of the Human Resources Services industry, Hour4u focuses on innovation in staffing solutions.
Job Overview
We are seeking an experienced Python Programming professional for a full-time, mid-level position located in Delhi. The ideal candidate will possess 4 to 6 years of work experience in Python programming and will be proficient in essential programming concepts. This role is integral to developing and maintaining software systems in collaboration with a dynamic team, following agile methodologies.
Qualifications and Skills
- Demonstrated expertise in Python with at least 4 years of hands-on experience in a professional setting.
- Strong understanding and application of object-oriented programming principles and paradigms.
- Thorough knowledge of data structures and algorithms and their effective implementation.
- Proficiency in agile methodologies, including understanding and participating actively in all phases of the agile process.
- Skilled in writing and maintaining unit tests to ensure software quality and performance.
- Ability to collaborate effectively in a team, communicating complex technical concepts clearly and precisely.
- Experience with software version control systems like Git for seamless team collaboration.
- Proven problem-solving skills with a logical approach to solving software-related issues.
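The unit-testing requirement above can be made concrete with a small, self-contained example using Python's built-in unittest module (the function under test is invented for illustration):

```python
import unittest

def normalize_skill(name):
    """Trim whitespace and lowercase a skill name for de-duplication."""
    return " ".join(name.split()).lower()

class TestNormalizeSkill(unittest.TestCase):
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_skill("  Python "), "python")

    def test_collapses_internal_whitespace(self):
        self.assertEqual(normalize_skill("Data   Science"), "data science")

# Run with: python -m unittest <module_name>
```

Tests like these run in CI on every commit, which is what keeps refactoring safe on a collaborative team.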
Roles and Responsibilities
- Design, develop, and maintain scalable and efficient software applications using Python.
- Translate user requirements into functional specifications and code implementations.
- Collaborate closely with cross-functional teams to define, design, and ship new features.
- Ensure code quality and maintainability through comprehensive unit testing and peer reviews.
- Participate actively in agile development cycles and contribute to sprint planning and retrospectives.
- Identify and fix software bugs promptly and enhance software performance.
- Stay updated with the latest industry trends and technologies to ensure competitive edge in solutions.
- Mentor and guide junior developers, fostering a collaborative and innovative team environment.
Python (Programming Language)
Posted 21 days ago
Job Description
Greetings from ALIQAN Technologies!
We are hiring a Python Developer for one of our clients.
Job Title: Python Developer
Experience: 5+ Years
Job Type: 6-month contract with possible extension
Location: Bangalore (Hybrid)
Notice Period: Immediate Joiner Only
Job Description:
- Python development, backend experience.
- Strong knowledge of AWS services (Glue, Lambda, DynamoDB, S3, PySpark).
- Excellent debugging skills to resolve production issues.
- Experience with MySQL, NoSQL databases.
Optional Skills:
- Experience with Django and CRON jobs.
- Familiarity with data lakes, big data tools, and CI/CD.
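As a sketch of the AWS Lambda side of this stack, here is a minimal, hypothetical handler that parses an S3 put event (the bucket and key names are invented; a real function would then fetch the object with boto3 and write derived records to DynamoDB):

```python
import json
import urllib.parse

def lambda_handler(event, context):
    """Extract bucket and key from an S3 put event and return a summary.

    Only the event parsing is shown here; object reads and DynamoDB
    writes would be added via boto3 in a real deployment.
    """
    records = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        # S3 keys in event payloads are URL-encoded (spaces become '+').
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        records.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(records)}
```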
Remote Senior Python Developer - Data Science
Posted 10 days ago
Job Description
Key Responsibilities:
- Design, develop, and deploy scalable data pipelines and ETL processes using Python.
- Build, train, and optimize machine learning models for various applications.
- Develop and maintain APIs for data access and model serving.
- Collaborate with data scientists to implement and productionize statistical models and algorithms.
- Optimize data infrastructure and code for performance, scalability, and reliability.
- Implement data warehousing solutions and database management strategies.
- Ensure data quality, integrity, and security across all systems.
- Write clean, well-documented, and maintainable code.
- Participate in code reviews and contribute to architectural design discussions.
- Stay current with advancements in Python, data science, machine learning, and cloud technologies.
- Mentor junior engineers and share best practices in software development and data engineering.
Qualifications:
- Master's or Ph.D. in Computer Science, Data Science, Statistics, or a related quantitative field, or equivalent practical experience.
- 5+ years of professional experience in Python development, with a strong focus on data-intensive applications.
- Extensive experience with data science libraries such as Pandas, NumPy, Scikit-learn, TensorFlow, or PyTorch.
- Proficiency in SQL and experience with various database systems (e.g., PostgreSQL, MySQL, NoSQL).
- Experience with cloud platforms (AWS, Azure, GCP) and related data services.
- Knowledge of big data technologies (e.g., Spark, Hadoop) is a plus.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Excellent problem-solving skills and the ability to work effectively in a remote, agile environment.
- Strong communication and collaboration abilities.
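The pipeline responsibilities above can be sketched at toy scale as an extract-transform-load run using only the standard library (the table, column names, and sample data are invented; a real pipeline would target a warehouse such as PostgreSQL rather than in-memory sqlite3):

```python
import sqlite3

RAW_ROWS = [  # extract: stand-in for reading from an API or object store
    {"user": "a", "ms": "120"},
    {"user": "b", "ms": "bad"},   # malformed record, to be dropped
    {"user": "a", "ms": "80"},
]

def transform(rows):
    """Keep well-formed rows and cast latency to int."""
    clean = []
    for r in rows:
        try:
            clean.append((r["user"], int(r["ms"])))
        except (KeyError, ValueError):
            continue  # in production: route to a dead-letter queue instead
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS latency (user TEXT, ms INTEGER)")
    conn.executemany("INSERT INTO latency VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(RAW_ROWS), conn)
avg = conn.execute("SELECT AVG(ms) FROM latency").fetchone()[0]
```

The same extract/transform/load separation scales up directly: each stage becomes an independent, testable unit that an orchestrator can retry on failure.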
Senior Analyst - Data Science (Python Developer) -Chennai / Hyderabad
Posted 644 days ago
Python
Posted 1 day ago
Job Description
Join our dynamic team as a Sr. Developer with 6 to 8.5 years of experience, where you will leverage your expertise in SQL scripting, Python, Snowflake SQL, and AWS to drive impactful projects. With a hybrid work model, you will collaborate with cross-functional teams to design and implement innovative solutions. Your contributions will enhance our data-driven decision-making processes and support our mission to deliver exceptional services.
**Responsibilities**
+ Develop and maintain robust SQL scripts to optimize database performance and ensure data integrity.
+ Collaborate with data engineers to design and implement scalable Snowflake SQL solutions that meet business requirements.
+ Utilize Python to automate data processing tasks and streamline workflows for increased efficiency.
+ Work closely with AWS services to deploy and manage cloud-based applications, ensuring high availability and security.
+ Analyze complex datasets to provide actionable insights that drive strategic decision-making.
+ Participate in code reviews to maintain high coding standards and improve team collaboration.
+ Troubleshoot and resolve technical issues to minimize downtime and enhance system reliability.
+ Document technical specifications and create user guides to facilitate knowledge sharing across teams.
+ Engage with stakeholders to gather requirements and translate them into technical solutions.
+ Stay updated with the latest industry trends and technologies to continuously improve development practices.
+ Contribute to the design and architecture of data solutions that align with organizational goals.
+ Mentor junior developers by sharing knowledge and best practices to foster a culture of continuous learning.
+ Ensure compliance with data governance and security policies to protect sensitive information.
**Qualifications**
+ Possess strong proficiency in SQL Scripting and Snowflake SQL to manage and manipulate large datasets effectively.
+ Demonstrate expertise in Python for developing efficient and scalable data processing applications.
+ Have hands-on experience with AWS services to deploy and manage cloud-based solutions.
+ Exhibit excellent problem-solving skills and the ability to work collaboratively in a hybrid work environment.
+ Show a track record of delivering high-quality software solutions within specified timelines.
+ Display strong communication skills to interact with technical and non-technical stakeholders.
+ Be adaptable to changing project requirements and capable of managing multiple tasks simultaneously.
**Certifications Required**
AWS Certified Solutions Architect; Snowflake SnowPro Core Certification
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
Python
Posted 1 day ago
Job Description
We are seeking a highly skilled Sr. Developer with 4 to 11 years of experience to join our dynamic team. The ideal candidate will have expertise in Spark in Scala, Databricks Workspace administration, Python, Databricks SQL, Databricks Workflows, and PySpark. This role involves working in a hybrid model with day shifts and no travel requirements. The Sr. Developer will play a crucial role in developing and maintaining our data infrastructure, ensuring efficient data processing and analysis.
**Responsibilities**
+ Develop and maintain scalable data pipelines using Spark in Scala and PySpark.
+ Oversee the administration of Databricks Workspace to ensure optimal performance and security.
+ Provide expertise in Python programming to develop and automate data processing tasks.
+ Utilize Databricks SQL to perform complex queries and data analysis.
+ Implement and manage Databricks Workflows to streamline data processing and integration.
+ Collaborate with data engineers and analysts to understand data requirements and deliver solutions.
+ Ensure data quality and integrity by implementing best practices and performing regular audits.
+ Optimize data processing performance by fine-tuning Spark and Databricks configurations.
+ Troubleshoot and resolve issues related to data pipelines and Databricks environment.
+ Document data processing workflows and maintain comprehensive technical documentation.
+ Stay updated with the latest advancements in data engineering and Databricks technologies.
+ Contribute to the continuous improvement of data infrastructure and processes.
+ Support the team in achieving project milestones and deliverables on time.
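The data-quality audit duty above can be sketched as a small framework-free check (the record fields are invented; on Databricks the same logic would typically run over a Spark DataFrame with `isNull` and `dropDuplicates`):

```python
def audit(rows, required=("id", "amount")):
    """Count rows with missing required fields and duplicate ids in a batch."""
    report = {"rows": len(rows), "missing": 0, "duplicate_ids": 0}
    seen = set()
    for row in rows:
        # A required field that is absent or None counts as missing.
        if any(row.get(col) is None for col in required):
            report["missing"] += 1
        rid = row.get("id")
        if rid in seen:
            report["duplicate_ids"] += 1
        seen.add(rid)
    return report
```

Scheduling a check like this after each pipeline run, and alerting when the counts exceed a baseline, is the essence of a recurring data-quality audit.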
**Qualifications**
+ Possess strong expertise in Spark in Scala and PySpark for developing data pipelines.
+ Have extensive experience in administering Databricks Workspace for optimal performance.
+ Demonstrate proficiency in Python programming for data processing and automation.
+ Show advanced skills in Databricks SQL for complex data queries and analysis.
+ Have experience in implementing and managing Databricks Workflows.
+ Exhibit strong problem-solving skills and the ability to troubleshoot data pipeline issues.
+ Possess excellent communication skills to collaborate effectively with team members.
+ Have a keen eye for detail and a commitment to ensuring data quality and integrity.
+ Stay proactive in learning and adapting to new data engineering technologies.
+ Demonstrate the ability to work independently and manage multiple tasks efficiently.
+ Show a strong understanding of data processing best practices and optimization techniques.
+ Have a proven track record of delivering high-quality data solutions in a timely manner.
+ Exhibit a passion for leveraging data to drive business insights and impact.
**Certifications Required**
Databricks Certified Data Engineer Associate; Apache Spark Developer Certification
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
Hiring || Python Programming Professionals - Bangalore
Posted 10 days ago
Trainee Intern Data Science
Posted 15 days ago
Job Description
Company Overview – WhatJobs Ltd
WhatJobs is a global job search engine and career platform operating in over 50 countries. We leverage advanced technology and AI-driven tools to connect millions of job seekers with opportunities, helping businesses and individuals achieve their goals.
Position: Data Science Trainee/Intern
Location: Commercial Street
Duration: 3 Months
Type: Internship/Traineeship (with potential for full-time opportunities)
Role Overview
We are looking for enthusiastic Data Science trainees/interns eager to explore the world of data analytics, machine learning, and business insights. You will work on real-world datasets, apply statistical and computational techniques, and contribute to data-driven decision-making at WhatJobs.
Key Responsibilities
- Collect, clean, and analyze datasets to derive meaningful insights.
- Assist in building and evaluating machine learning models.
- Work with visualization tools to present analytical results.
- Support the team in developing data pipelines and automation scripts.
- Research new tools, techniques, and best practices in data science.
Requirements
- Basic knowledge of Python and data science libraries (Pandas, NumPy, Matplotlib, Scikit-learn).
- Understanding of statistics, probability, and data analysis techniques.
- Familiarity with machine learning concepts.
- Knowledge of Google Data Studio and BigQuery for reporting and data management.
- Strong analytical skills and eagerness to learn.
- Good communication and teamwork abilities.
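To make the data-cleaning requirement concrete, here is a tiny standard-library sketch (the cutoff is an arbitrary choice) of z-score outlier filtering, the kind of first step an intern might take before moving to Pandas/NumPy:

```python
import statistics

def drop_outliers(values, z_max=2.0):
    """Remove values more than z_max sample standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return list(values)  # all values identical: nothing to drop
    return [v for v in values if abs(v - mean) / stdev <= z_max]
```

With Pandas the same filter is a one-liner over a Series, but the statistical idea is identical.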
What We Offer
- Hands-on experience with real-world data science projects.
- Guidance and mentorship from experienced data professionals.
- Opportunity to work with a global technology platform.
- Certificate of completion and potential for full-time role.
Data Science
Posted 3 days ago
Job Description
Experience: 6-8 years
Looking for Senior Data Science Developers (immediate joiners).
Key Responsibilities:
- Analyze large, complex datasets to identify trends, patterns, and opportunities.
- Develop and deploy machine learning models to solve business challenges.
- Communicate findings through data visualizations and reports.
- Collaborate with data engineers, analysts, and product teams to turn insights into action.
- Design and implement A/B tests to evaluate feature performance or business hypotheses.
- Translate business problems into data science problems.
- Work with stakeholders to understand data needs and deliver custom solutions.
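The A/B testing responsibility above can be illustrated with a standard two-proportion z-test written from scratch (the conversion numbers below are invented):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: the two conversion rates are equal (pooled variance)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

For a two-sided test, |z| > 1.96 corresponds to significance at the 5% level; e.g. 200/1000 conversions in control vs. 260/1000 in treatment gives z of roughly 3.19.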
Must have Skills:
- Proficiency in Python for data analysis and modeling.
- Strong knowledge of SQL and working with relational databases.
- Experience with machine learning algorithms and libraries (e.g., scikit-learn, TensorFlow, XGBoost).
- Experience in forecasting is an added advantage.
- Solid understanding of statistics, probability, and hypothesis testing.
- Knowledge of deep learning frameworks (e.g., PyTorch, TensorFlow).
- Working knowledge of cloud platforms such as AWS and Databricks (Databricks preferred).
- Experience in MLOps and CI/CD workflows.
- Experience developing end-to-end pipelines on a cloud platform (AWS or Databricks).