810 Data Scientists jobs in Mumbai

Data Scientists

Navi Mumbai, Maharashtra TaskUs

Posted today


Job Description


Key Responsibilities

AI/ML Development & Research

• Design, develop, and deploy advanced machine learning and deep learning models for complex business problems

• Implement and optimize Large Language Models (LLMs) and Generative AI solutions

• Build agentic AI systems with autonomous decision-making capabilities

• Conduct research on emerging AI technologies and their practical applications

• Perform model evaluation, validation, and continuous improvement
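
For context on what the evaluation and validation work above typically involves, here is a minimal, generic sketch using scikit-learn; the toy dataset, model choice, and ROC-AUC metric are assumptions for illustration, not a description of TaskUs's stack.

    # Minimal model-evaluation sketch: cross-validate, then check a hold-out split.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_score, train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = GradientBoostingClassifier(random_state=42)

    # Cross-validation on the training split guards against a lucky single split.
    cv_scores = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")
    print(f"CV ROC-AUC: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")

    # Final hold-out check before any continuous-improvement cycle.
    model.fit(X_train, y_train)
    print(f"Hold-out ROC-AUC: {roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]):.3f}")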

Cloud Infrastructure & Full-Stack Development

• Architect and implement scalable cloud-native ML/AI solutions on AWS, Azure, or GCP

• Develop full-stack applications integrating AI models with modern web technologies

• Build and maintain ML pipelines using cloud services (SageMaker, ML Engine, etc.)

• Implement CI/CD pipelines for ML model deployment and monitoring

• Design and optimize cloud infrastructure for high-performance computing workloads

Data Engineering & Database Management

• Design and implement data pipelines for large-scale data processing

• Work with both SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.)

• Optimize database performance for ML workloads and real-time applications

• Implement data governance and quality assurance frameworks

• Handle streaming data processing and real-time analytics

Leadership & Collaboration

• Mentor junior data scientists and guide technical decision-making

• Collaborate with cross-functional teams including product, engineering, and business stakeholders

• Present findings and recommendations to technical and non-technical audiences

• Lead proof-of-concept projects and innovation initiatives

Required Qualifications

Education & Experience

• Master's or PhD in Computer Science, Data Science, Statistics, Mathematics, or related field

• 5+ years of hands-on experience in data science and machine learning

• 3+ years of experience with deep learning frameworks and neural networks

• 2+ years of experience with cloud platforms and full-stack development

Technical Skills - Core AI/ML

• Machine Learning: Scikit-learn, XGBoost, LightGBM, advanced ML algorithms

• Deep Learning: TensorFlow, PyTorch, Keras, CNN, RNN, LSTM, Transformers

• Large Language Models: GPT, BERT, T5, fine-tuning, prompt engineering

• Generative AI: Stable Diffusion, DALL-E, text-to-image, text generation

• Agentic AI: Multi-agent systems, reinforcement learning, autonomous agents
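
The agentic AI item is broad, so as a purely illustrative toy (not a description of any production system), the core of an autonomous decision loop is simply observe, decide, act, repeat; real agentic systems wrap an LLM or planner inside this loop.

    # Toy agent loop: observe state, decide on an action, act, repeat until the goal is met.
    def act(state: int, action: str) -> int:
        return state + 1 if action == "increment" else state - 1

    def policy(state: int, goal: int) -> str:
        # Trivial "decision-making": always move toward the goal.
        return "increment" if state < goal else "decrement"

    def run_agent(start: int, goal: int, max_steps: int = 100) -> int:
        state = start
        for step in range(max_steps):
            if state == goal:
                print(f"Goal reached in {step} steps")
                break
            action = policy(state, goal)  # decide
            state = act(state, action)    # act
        return state

    run_agent(start=0, goal=7)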

Technical Skills - Development & Infrastructure

• Programming: Python (expert), R, Java/Scala, JavaScript/TypeScript

• Cloud Platforms: AWS (SageMaker, EC2, S3, Lambda), Azure ML, or Google Cloud AI

• Databases: SQL (PostgreSQL, MySQL), NoSQL (MongoDB, Cassandra, DynamoDB)

• Full-Stack Development: React/Vue.js, Node.js, FastAPI, Flask, Docker, Kubernetes

• MLOps: MLflow, Kubeflow, Model versioning, A/B testing frameworks

• Big Data: Spark, Hadoop, Kafka, streaming data processing
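
As a rough illustration of the streaming item above, a minimal Spark Structured Streaming read from Kafka might look like the following; the broker address and topic name are placeholders, and the spark-sql-kafka connector must be available on the classpath.

    # Minimal Spark Structured Streaming job: consume a Kafka topic and count events per key.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
              .option("subscribe", "events")                        # placeholder topic
              .load())

    # Kafka delivers key/value as binary; cast to strings before aggregating.
    counts = (events
              .select(col("key").cast("string"), col("value").cast("string"))
              .groupBy("key")
              .count())

    query = (counts.writeStream
             .outputMode("complete")
             .format("console")
             .start())
    query.awaitTermination()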

Preferred Qualifications

• Experience with vector databases and embeddings (Pinecone, Weaviate, Chroma)

• Knowledge of LangChain, LlamaIndex, or similar LLM frameworks

• Experience with model compression and edge deployment

• Familiarity with distributed computing and parallel processing

• Experience with computer vision and NLP applications

• Knowledge of federated learning and privacy-preserving ML

• Experience with quantum machine learning

• Expertise in MLOps and production ML system design
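
To make the MLOps expectation concrete, a minimal experiment-tracking and model-versioning sketch with MLflow could look like this; the experiment name, model, and metric are illustrative assumptions.

    # Minimal MLflow experiment-tracking sketch; names and metrics are illustrative.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    mlflow.set_experiment("demo-experiment")
    with mlflow.start_run():
        model = LogisticRegression(max_iter=200)
        model.fit(X_train, y_train)

        mlflow.log_param("max_iter", 200)
        mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
        # The logged artifact can later be registered and promoted as a model version.
        mlflow.sklearn.log_model(model, "model")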

Key Competencies

Technical Excellence

• Strong mathematical foundation in statistics, linear algebra, and optimization

• Ability to implement algorithms from research papers

• Experience with model interpretability and explainable AI

• Knowledge of ethical AI and bias detection/mitigation

Problem-Solving & Innovation

• Strong analytical and critical thinking skills

• Ability to translate business requirements into technical solutions

• Creative approach to solving complex, ambiguous problems

• Experience with rapid prototyping and experimentation

Communication & Leadership

• Excellent written and verbal communication skills

• Ability to explain complex technical concepts to diverse audiences

• Strong project management and organizational skills

• Experience mentoring and leading technical teams

How We Partner To Protect You: TaskUs will neither solicit money from you during your application process nor require any form of payment in order to proceed with your application. Kindly ensure that you are always in communication with only authorized recruiters of TaskUs.
DEI: At TaskUs, we believe that innovation and higher performance are brought by people from all walks of life. We welcome applicants of different backgrounds, demographics, and circumstances. Inclusive and equitable practices are our responsibility as a business. TaskUs is committed to providing equal access to opportunities. If you need reasonable accommodations in any part of the hiring process, please let us know. We invite you to explore all TaskUs career opportunities and apply through the provided URL.

Data Scientists-Associate 2

Mumbai, Maharashtra ₹800000 - ₹1200000 Y PwC Acceleration Center India

Posted 1 day ago


Job Description

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

PwC US - Acceleration Center is seeking a highly skilled professional with a strong analytical background to work in our Analytics Consulting practice.

Associates will work as an integral part of business analytics teams in India alongside clients and consultants in the U.S., leading teams for high-end analytics consulting engagements and providing business recommendations to project teams.

Years of Experience:
Candidates with 2+ years of hands-on experience

Must Have

  • Experience in building ML models in cloud environments (at least one of Azure ML, AWS SageMaker, or Databricks)
  • Knowledge of predictive/prescriptive analytics, especially the use of log-log, log-linear, and Bayesian regression techniques, as well as machine learning algorithms (supervised and unsupervised), deep learning algorithms, and artificial neural networks (a short illustration follows this list)
  • Good knowledge of statistics, e.g., statistical tests and distributions
  • Experience in data analysis, e.g., data cleansing, standardization, and data preparation for machine learning use cases
  • Experience with machine learning frameworks and tools (e.g., scikit-learn, mlr, caret, H2O, TensorFlow, PyTorch, MLlib)
  • Advanced-level programming in SQL or Python/PySpark
  • Expertise with visualization tools, e.g., Tableau, Power BI, AWS QuickSight
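
As a short illustration of the regression techniques listed above, a log-log specification can be fit in a few lines; the synthetic data and statsmodels usage below are assumptions for demonstration, not PwC methodology.

    # Log-log regression on synthetic data: the slope of log(y) ~ log(x) is an elasticity.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    x = rng.uniform(1.0, 100.0, size=500)
    y = 2.5 * x ** 0.8 * np.exp(rng.normal(0.0, 0.1, size=500))  # true elasticity = 0.8

    X = sm.add_constant(np.log(x))
    model = sm.OLS(np.log(y), X).fit()

    print(model.params)    # intercept close to log(2.5), slope close to 0.8
    print(model.rsquared)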

Nice To Have

  • Working knowledge of containerization (e.g., AWS EKS, Kubernetes), Docker, and data pipeline orchestration (e.g., Airflow)
  • Good communication and presentation skills

Roles And Responsibilities

  • Develop and execute project and analysis plans under the guidance of the Project Manager
  • Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved
  • Drive and conduct analysis using advanced analytics tools and coach junior team members
  • Implement the necessary quality-control measures to ensure deliverable integrity
  • Validate analysis outcomes and recommendations with all stakeholders, including the client team
  • Build storylines and make presentations to the client team and/or PwC project leadership team
  • Contribute to knowledge-sharing and firm-building activities

Professional And Educational Background

  • Any graduate / BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's degree / MBA

Big Data Engineer

Mumbai, Maharashtra ₹800000 - ₹1500000 Y Strategic HR Solutions

Posted 1 day ago


Job Description

Spark Scala Developer
Location: Bengaluru, Mumbai

Employment Type: Full-time

What We're Looking For
We're hiring a
Spark Scala Developer
who has real-world experience working in Big Data environments, on-prem and/or in the cloud. You should know how to write production-grade Spark applications, fine-tune performance, and work fluently with Scala's functional style. Experience with cloud platforms and modern data tools like Snowflake or Databricks is a strong plus.

Your Responsibilities

  • Design and develop scalable data pipelines using Apache Spark and Scala
  • Optimize and troubleshoot Spark jobs for performance (e.g. memory management, shuffles, skew)
  • Work with massive datasets in on-prem Hadoop clusters or cloud platforms like AWS/GCP/Azure
  • Write clean, modular Scala code using functional programming principles
  • Collaborate with data teams to integrate with platforms like Snowflake, Databricks, or data lakes
  • Ensure code quality, documentation, and CI/CD practices are followed

Must-Have Skills

  • 3+ years of experience with Apache Spark in Scala
  • Deep understanding of Spark internals: DAG, stages, tasks, caching, joins, partitioning (a short skew-mitigation sketch follows this list)
  • Hands-on experience with performance tuning in production Spark jobs
  • Proficiency in Scala functional programming (e.g. immutability, higher-order functions, Option/Either)
  • Proficiency in SQL
  • Experience with any major cloud platform: AWS, Azure, or GCP
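
For the skew and performance-tuning items above, one common mitigation is key salting. The sketch below shows the idea in PySpark for consistency with the other examples on this page; the Scala DataFrame API is analogous, and the column names and salt count are assumptions.

    # Skew mitigation by salting: spread a hot key across partitions, aggregate in two stages.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("skew-salting-sketch").getOrCreate()

    orders = spark.createDataFrame(
        [("cust_1", 10.0)] * 1000 + [("cust_2", 5.0)] * 10,  # cust_1 is the skewed key
        ["customer_id", "amount"],
    )

    NUM_SALTS = 8
    salted = orders.withColumn("salt", (F.rand() * NUM_SALTS).cast("int"))

    # Stage 1: partial sums per (key, salt) spread the heavy key over several tasks.
    partial = salted.groupBy("customer_id", "salt").agg(F.sum("amount").alias("partial_sum"))
    # Stage 2: cheap final roll-up per key.
    totals = partial.groupBy("customer_id").agg(F.sum("partial_sum").alias("total_amount"))

    totals.show()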

Nice-to-Have

  • Worked with Databricks, Snowflake, or Delta Lake
  • Exposure to data pipeline tools like Airflow, Kafka, Glue, or BigQuery
  • Familiarity with CI/CD pipelines and Git-based workflows
  • Comfortable with SQL optimization and schema design in distributed environments


Big Data Developer

Navi Mumbai, Maharashtra ₹800000 - ₹2400000 Y Smartavya Analytica

Posted today


Job Description

Job Title: Big Data Developer

Location: Navi Mumbai, India

Exp: 5+ Years

Department: Big Data and Cloud

Job Summary: Smartavya Analytica Private Limited is seeking a skilled Hadoop Developer to join our team and contribute to the development and maintenance of large-scale Big Data solutions. The ideal candidate will have extensive experience in Hadoop ecosystem technologies and a solid understanding of distributed computing, data processing, and data management.

Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience handling large datasets of up to 20 PB in a single implementation, delivering many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud, and Analytics projects, with deep specialization in very large data platforms.

Empowering Your Digital Transformation with Data Modernization and AI

Requirements:

A minimum of 3 years of experience in developing, testing, and implementing Big Data projects using Hadoop, Spark, and Hive.

Hands-on experience playing a lead role in Big Data projects: responsible for implementing one or more tracks within projects, identifying and assigning tasks within the team, and providing technical guidance to team members.

Experience in setting up Hadoop services, implementing ETL/ELT pipelines, and working with terabytes of data ingestion and processing from varied systems.

Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing HDD and LDD documents.

Skills:

Must have: PySpark and the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, and Hue; Java, Python, SQL, Bash (shell scripting)

Apache Kafka, Storm, distributed systems; a good understanding of networking, security (platform and data) concepts, and Kerberos

Understanding of Data Governance concepts and experience implementing metadata capture, lineage capture, business glossary

Experience implementing CI/CD pipelines and working experience with SCM tools such as Git, Bitbucket, etc.

Ability to assign and manage tasks for team members, provide technical guidance, work with architects on HDD, LDD, POCs.

Hands-on experience in writing data ingestion and data processing pipelines using Spark and SQL; experience implementing SCD Type 1 and 2, auditing, and exception-handling mechanisms
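
As a rough sketch of the SCD Type 2 item above (column names, sample data, and the expire-and-append strategy are illustrative assumptions, not Smartavya's implementation):

    # Rough SCD Type 2 sketch: expire changed rows, append new versions, keep the rest.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

    current = spark.createDataFrame(
        [(1, "Mumbai", "2023-01-01", None, True), (2, "Pune", "2023-01-01", None, True)],
        "customer_id INT, city STRING, start_date STRING, end_date STRING, is_current BOOLEAN",
    )
    incoming = spark.createDataFrame([(1, "Navi Mumbai")], "customer_id INT, city STRING")

    today = F.current_date().cast("string")
    changed = (current.alias("c")
               .join(incoming.alias("i"), "customer_id")
               .where(F.col("c.city") != F.col("i.city")))

    # 1) Expire the old versions of keys whose attributes changed.
    expired = (current.join(changed.select("customer_id"), "customer_id")
               .withColumn("end_date", today)
               .withColumn("is_current", F.lit(False)))

    # 2) Append the new versions with an open-ended validity window.
    new_rows = (changed.select("customer_id", F.col("i.city").alias("city"))
                .withColumn("start_date", today)
                .withColumn("end_date", F.lit(None).cast("string"))
                .withColumn("is_current", F.lit(True)))

    # 3) Carry unchanged rows forward and rebuild the dimension.
    unchanged = current.join(changed.select("customer_id"), "customer_id", "left_anti")
    dimension = unchanged.unionByName(expired).unionByName(new_rows)
    dimension.show()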

Data warehousing project implementation with a Java- or Scala-based Hadoop programming background.

Proficient with various development methodologies like waterfall, agile/scrum.

Exceptional communication, organization, and time management skills

Collaborative approach to decision-making and strong analytical skills

Good to have: certifications in any of GCP, AWS, Azure, or Cloudera. Ability to work on multiple projects simultaneously, prioritizing appropriately.


Big Data Engineer

Mumbai, Maharashtra ₹90000 - ₹120000 Y Infogain

Posted today


Job Description

Roles & Responsibilities

  • 3 to 5 years of experience in Data Engineering.
  • Hands-on experience with Azure data tools: ADF, Data Lake, Synapse, Databricks.
  • Strong programming skills in SQL and Python
  • Good understanding of Big Data frameworks
  • Knowledge of data modeling, warehousing, and performance tuning.
  • Familiarity with CI/CD, version control (Git), and Agile/Scrum methodologies.
  • Design, develop, and maintain ETL/ELT pipelines for large-scale data processing.
  • Work with Azure Data Services including Azure Data Factory (ADF), Azure Synapse Analytics, Data Lake, and Databricks.
  • Process and manage large datasets using Big Data tools and frameworks
  • Implement data integration, transformation, and ingestion workflows from various sources.
  • Ensure data quality, performance optimization, and pipeline reliability.
  • Collaborate with analysts, data scientists, and other engineers to deliver end-to-end data solutions.

Experience

  • 3-4.5 Years

Skills

  • Primary Skill: Data Engineering
  • Sub Skill(s): Data Engineering
  • Additional Skill(s): Data Warehouse, Big Data, Azure Datalake

About The Company
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP).

Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.


Big Data Developer

Mumbai, Maharashtra ₹1200000 - ₹2400000 Y Progriz Coe

Posted today


Job Description

Job Title: Training & Mentorship Specialist - Big Data & Cloud Technologies

Location: On-site, Lower Parel, Mumbai

Employment Type: Full-time

Role Overview:

We are seeking a highly skilled Big Data & Cloud Technologies Training & Mentorship Specialist to design, deliver, and mentor teams on cutting-edge data engineering, analytics, and cloud practices. The ideal candidate will have deep technical expertise in Azure, AWS, CI/CD pipelines, ETL processes, and modern data platforms like Databricks, dbt, Snowflake, and Terraform, as well as a passion for knowledge transfer, upskilling teams, and building high-performing data talent.

Key Responsibilities:

Training Design & Delivery
  • Develop structured training programs, hands-on labs, and real-world project simulations for Big Data technologies and cloud platforms.
  • Deliver instructor-led sessions (both in-person and virtual) tailored for developers, data engineers, analysts, and DevOps teams.

Mentorship & Skill Development
  • Provide one-on-one and group mentorship to guide team members in mastering tools, frameworks, and workflows.
  • Conduct code reviews, architecture guidance, and best practices workshops.

Technical Enablement
  • Create learning roadmaps for Azure Data Services, AWS Data Solutions, CI/CD, and Infrastructure as Code (IaC) using Terraform.
  • Mentor teams on designing scalable ETL pipelines and optimizing workloads in Databricks, dbt, and Snowflake.

Process & Best Practices

Establish coding standards, governance models, and documentation templates for data engineering and cloud workflows.

Integrate training content with CI/CD best practices and agile delivery methods.

Required Technical Skills:

Big Data & Cloud Platforms:

Azure Data Factory, Azure Databricks, AWS Glue, AWS Lambda, EMR, Redshift.

Data Engineering & ETL:

Building and optimizing ETL/ELT workflows using Databricks, dbt, and Snowflake.

DevOps & CI/CD:

Designing and implementing automated CI/CD pipelines using tools like Azure DevOps, GitHub Actions, GitLab CI/CD, Jenkins.

Infrastructure as Code (IaC):

Proficient in Terraform for deploying and managing cloud infrastructure.

Programming & Scripting:

Python, SQL, PySpark, Shell scripting.

Version Control & Collaboration:

Git-based workflows, branching strategies, and peer review practices.

Qualifications & Experience:

Bachelor's/Master's degree in Computer Science, Data Engineering, or a related field (or equivalent practical experience).

7+ years of hands-on experience in Big Data Engineering and Cloud Platforms.

Proven track record in designing and delivering technical training programs.

Experience mentoring technical teams in enterprise or consulting environments.

Strong understanding of data governance, security, and compliance in cloud environments.

Soft Skills:

Excellent communication and presentation abilities.

Strong leadership and motivational skills to inspire learning and growth.

Ability to adapt training styles for different technical proficiency levels.

Problem-solving mindset with a focus on practical, real-world applications.

Why Join Us?

Opportunity to shape the next generation of Big Data & Cloud experts.

Work with cutting-edge tools and enterprise-scale data environments.

Collaborate with a dynamic and innovative technology-driven team.


Big Data Engineer

Mumbai, Maharashtra ₹90000 - ₹120000 Y PwC India

Posted today


Job Description

  1. Location - Mumbai
  2. Preferred Experience Range - 7 to 10 years
  3. Must-Have Technical Skills - Spark, Python, SQL, Kafka, Airflow, AWS cloud data management services
  4. Must-Have Other Skills - Minimum of 3 data management project experiences; hands-on experience in the big data space; experience ingesting data from various source systems; working experience with storage file formats such as ORC, Iceberg, and Parquet, and with storage layers such as object stores, HDFS, NoSQL, and RDBMS (see the short sketch after this list)
  5. Nice to have but not mandatory - Databricks, Snowflake exposure
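
For context on the file-format item (4) above, here is a tiny, generic sketch of writing and reading a partitioned Parquet dataset in Python; the paths and column names are placeholders rather than project specifics.

    # Write a partitioned Parquet dataset and read back only the needed columns.
    import pandas as pd

    df = pd.DataFrame({
        "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "source_system": ["crm", "billing", "crm"],
        "amount": [120.0, 75.5, 42.0],
    })

    # Partitioning on a low-cardinality column keeps date-bounded scans cheap.
    df.to_parquet("events_parquet", engine="pyarrow", partition_cols=["event_date"])

    # Parquet is columnar, so unread columns are skipped entirely.
    readback = pd.read_parquet("events_parquet", columns=["source_system", "amount"])
    print(readback)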

Big Data Engineer

Mumbai, Maharashtra ₹1200000 - ₹3600000 Y RiskInsight Consulting Pvt Ltd

Posted 1 day ago


Job Description

Responsibilities
  • Design, develop, and implement robust Big Data solutions using technologies such as Hadoop, Spark, and NoSQL databases.
  • Build and maintain scalable data pipelines for effective data ingestion, transformation, and analysis.
  • Collaborate with data scientists, analysts, and cross-functional teams to understand business requirements and translate them into technical solutions.
  • Ensure data quality and integrity through effective validation, monitoring, and troubleshooting techniques.
  • Optimize data processing workflows for maximum performance and efficiency.
  • Stay up-to-date with evolving Big Data technologies and methodologies to enhance existing systems.
  • Implement best practices for data governance, security, and compliance.
  • Document technical designs, processes, and procedures to support knowledge sharing across teams.
Requirements
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 4+ years of experience as a Big Data Engineer or in a similar role.
  • Strong proficiency in Big Data technologies (Hadoop, Spark, Hive, Pig) and frameworks.
  • Extensive experience with programming languages such as Python, Scala, or Java.
  • Knowledge of data modeling and data warehousing concepts.
  • Familiarity with NoSQL databases like Cassandra or MongoDB.
  • Proficient in SQL for data querying and analysis.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Ability to work independently and effectively in a fast-paced environment.
Benefits

Competitive salary and benefits package.

Opportunity to work on cutting-edge technologies and solve complex challenges.

Dynamic and collaborative work environment with opportunities for growth and career advancement.

Regular training and professional development opportunities.


Architect – Big Data

Mumbai, Maharashtra Anicalls (Pty) Ltd

Posted today


Job Description

• Experience using Python
• Robust Business Intelligence development experience
• Experience with AWS BI Services (QuickSight, EMR, Glue, etc.)
• Aurora, Spark, and MySQL Experience is a Plus