Big Data Engineer

Mumbai, Maharashtra ₹800000 - ₹1500000 Y Strategic HR Solutions

Posted today

Job Description

Spark Scala Developer
Location : Bengaluru, Mumbai

Employment Type : Full-time

What We're Looking For
We're hiring a Spark Scala Developer who has real-world experience working in Big Data environments, on-prem and/or in the cloud. You should know how to write production-grade Spark applications, fine-tune performance, and work fluently with Scala's functional style. Experience with cloud platforms and modern data tools like Snowflake or Databricks is a strong plus.

Your Responsibilities

  • Design and develop scalable data pipelines using Apache Spark and Scala
  • Optimize and troubleshoot Spark jobs for performance (e.g. memory management, shuffles, skew); a brief illustrative sketch follows this list
  • Work with massive datasets in on-prem Hadoop clusters or cloud platforms like AWS/GCP/Azure
  • Write clean, modular Scala code using functional programming principles
  • Collaborate with data teams to integrate with platforms like Snowflake, Databricks, or data lakes
  • Ensure code quality, documentation, and CI/CD practices are followed
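
As a minimal illustration of the tuning work described above (memory management, shuffles, skew), the sketch below broadcasts a small dimension table so the skewed fact table is not shuffled on the join key, caches the reused result, and repartitions before a wide aggregation. The dataset paths and the user_id/event_date columns are hypothetical placeholders, not taken from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("tuning-sketch").getOrCreate()
    import spark.implicits._

    // Hypothetical inputs: a large, skewed fact table and a small dimension table.
    val events = spark.read.parquet("/data/events")
    val users  = spark.read.parquet("/data/users")

    // Broadcasting the small side avoids shuffling the skewed fact table on the join key.
    val joined = events.join(broadcast(users), Seq("user_id"))

    // Cache only if the joined result feeds more than one downstream action.
    joined.cache()

    // Repartition on the grouping column before a wide aggregation so task sizes are more even.
    val daily = joined
      .repartition($"event_date")
      .groupBy($"event_date")
      .count()

    daily.write.mode("overwrite").parquet("/data/daily_event_counts")
    spark.stop()
  }
}
```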

Must-Have Skills

  • 3+ years of experience with Apache Spark in Scala
  • Deep understanding of Spark internals: DAG, stages, tasks, caching, joins, partitioning
  • Hands-on experience with performance tuning in production Spark jobs
  • Proficiency in Scala functional programming (e.g. immutability, higher-order functions, Option/Either); see the sketch after this list
  • Proficiency in SQL
  • Experience with any major cloud platform: AWS, Azure, or GCP
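
As a small, self-contained illustration of the functional style mentioned above (immutability, higher-order functions, Option/Either), the sketch below parses raw text records into a typed case class without throwing exceptions. The Trade type and the sample lines are invented for the example, and partitionMap assumes Scala 2.13+.

```scala
// Plain Scala sketch: parse raw lines into a typed record, collecting errors instead of throwing.
final case class Trade(id: String, amount: BigDecimal)

object ParsingSketch {
  // Either carries a validation error on the Left and a parsed value on the Right.
  def parseAmount(raw: String): Either[String, BigDecimal] =
    scala.util.Try(BigDecimal(raw)).toOption.toRight(s"not a number: $raw")

  def parseTrade(line: String): Either[String, Trade] =
    line.split(",", -1).toList match {
      case id :: amount :: Nil if id.nonEmpty => parseAmount(amount).map(a => Trade(id, a))
      case _                                  => Left(s"malformed line: $line")
    }

  def main(args: Array[String]): Unit = {
    val lines = List("t1,100.50", "t2,oops", ",42")
    // Higher-order functions keep the pipeline immutable and the error handling explicit.
    val (errors, trades) = lines.partitionMap(parseTrade)
    println(s"parsed=${trades.size} rejected=${errors.size}")
  }
}
```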

Nice-to-Have

  • Worked with Databricks, Snowflake, or Delta Lake
  • Exposure to data pipeline tools like Airflow, Kafka, Glue, or BigQuery
  • Familiarity with CI/CD pipelines and Git-based workflows
  • Comfortable with SQL optimization and schema design in distributed environments


Big Data Engineer

Mumbai, Maharashtra ₹90000 - ₹120000 Y Infogain

Posted today

Job Description

Roles & Responsibilities

  • 3 to 5 years of experience in Data Engineering.
  • Hands-on experience with Azure data tools: ADF, Data Lake, Synapse, Databricks.
  • Strong programming skills in SQL and Python
  • Good understanding of Big Data frameworks
  • Knowledge of data modeling, warehousing, and performance tuning.
  • Familiarity with CI/CD, version control (Git), and Agile/Scrum methodologies.
  • Design, develop, and maintain ETL/ELT pipelines for large-scale data processing (a minimal sketch follows this list).
  • Work with Azure Data Services including Azure Data Factory (ADF), Azure Synapse Analytics, Data Lake, and Databricks.
  • Process and manage large datasets using Big Data tools and frameworks
  • Implement data integration, transformation, and ingestion workflows from various sources.
  • Ensure data quality, performance optimization, and pipeline reliability.
  • Collaborate with analysts, data scientists, and other engineers to deliver end-to-end data solutions.
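
For illustration, a minimal version of such an ELT step is sketched below in Spark with Scala (an equivalent PySpark job would follow the same structure): read raw CSV files from an ADLS Gen2 landing zone and append them to a curated Delta table, as one might run on a Databricks cluster. The storage account, containers, and columns are hypothetical, and Delta Lake is assumed to be available on the cluster.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

object IngestOrdersSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ingest-orders-sketch").getOrCreate()

    // Hypothetical ADLS Gen2 landing path; authentication is assumed to be configured on the cluster.
    val raw = spark.read
      .option("header", "true")
      .csv("abfss://landing@examplestore.dfs.core.windows.net/orders/")

    // Light transformation: type the date column and drop rows without a key.
    val cleaned = raw
      .withColumn("order_date", to_date(col("order_date")))
      .filter(col("order_id").isNotNull)

    // Append to a curated Delta table, partitioned by date (requires Delta Lake on the classpath).
    cleaned.write
      .format("delta")
      .mode("append")
      .partitionBy("order_date")
      .save("abfss://curated@examplestore.dfs.core.windows.net/orders/")

    spark.stop()
  }
}
```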

Experience

  • 3-4.5 Years

Skills

  • Primary Skill: Data Engineering
  • Sub Skill(s): Data Engineering
  • Additional Skill(s): Data Warehouse, Big Data, Azure Datalake

About The Company
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP).

Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.

Big Data Engineer

Mumbai, Maharashtra ₹90000 - ₹120000 Y PwC India

Posted today

Job Description

  1. Location - Mumbai
  2. Preferred Experience Range - 7 to 10 years
  3. Must-Have Technical Skills - Spark, Python, SQL, Kafka, Airflow, AWS cloud data management services
  4. Must-Have Other Skills - Minimum of 3 data management project experiences; hands-on experience in the big data space; experience ingesting data from various source systems; working experience with storage file formats like ORC, Iceberg, and Parquet, and with storages such as object stores, HDFS, NoSQL, and RDBMS (a minimal ingestion sketch follows this list)
  5. Nice to have but not mandatory - Databricks, Snowflake exposure
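
As a hedged sketch of the ingestion work listed in point 4, the Spark (Scala) job below reads a raw JSON source from an object store and lands it as partitioned Parquet. The bucket names and the partition column are placeholders; writing Iceberg tables instead would additionally require a configured Iceberg catalog.

```scala
import org.apache.spark.sql.SparkSession

object LandToParquetSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("land-to-parquet-sketch").getOrCreate()

    // Hypothetical raw source; the same pattern applies to CSV, ORC, or JDBC sources.
    val source = spark.read
      .option("multiLine", "true")
      .json("s3a://raw-bucket/customers/")

    // Land as columnar, partitioned Parquet in the curated zone.
    source.write
      .mode("overwrite")
      .partitionBy("country")
      .parquet("s3a://curated-bucket/customers/")

    spark.stop()
  }
}
```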

Big Data Engineer

Mumbai, Maharashtra ₹1200000 - ₹3600000 Y RiskInsight Consulting Pvt Ltd

Posted 1 day ago

Job Description

Responsibilities
  • Design, develop, and implement robust Big Data solutions using technologies such as Hadoop, Spark, and NoSQL databases.
  • Build and maintain scalable data pipelines for effective data ingestion, transformation, and analysis.
  • Collaborate with data scientists, analysts, and cross-functional teams to understand business requirements and translate them into technical solutions.
  • Ensure data quality and integrity through effective validation, monitoring, and troubleshooting techniques (a minimal validation sketch follows this list).
  • Optimize data processing workflows for maximum performance and efficiency.
  • Stay up-to-date with evolving Big Data technologies and methodologies to enhance existing systems.
  • Implement best practices for data governance, security, and compliance.
  • Document technical designs, processes, and procedures to support knowledge sharing across teams.
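
As a minimal illustration of the validation point above, the Spark (Scala) sketch below splits incoming rows into valid and quarantined sets and fails the run if the rejection rate crosses a threshold. The paths, columns, and the 5% threshold are invented for the example.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.col

object QualityGateSketch {
  // Split a batch into rows that pass the checks and rows that do not.
  def validate(df: DataFrame): (DataFrame, DataFrame) = {
    val isValid = col("id").isNotNull && col("amount") >= 0
    (df.filter(isValid), df.filter(!isValid))
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("quality-gate-sketch").getOrCreate()
    val input = spark.read.parquet("/data/incoming/payments")

    val (good, bad) = validate(input)
    val badCount    = bad.count()
    val totalCount  = input.count()

    // Fail fast if more than 5% of the batch is rejected, rather than silently dropping data.
    require(totalCount == 0 || badCount.toDouble / totalCount < 0.05,
      s"rejection rate too high: $badCount of $totalCount rows")

    good.write.mode("append").parquet("/data/validated/payments")
    bad.write.mode("append").parquet("/data/quarantine/payments")
    spark.stop()
  }
}
```
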
Requirements
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 4+ years of experience as a Big Data Engineer or in a similar role.
  • Strong proficiency in Big Data technologies (Hadoop, Spark, Hive, Pig) and frameworks.
  • Extensive experience with programming languages such as Python, Scala, or Java.
  • Knowledge of data modeling and data warehousing concepts.
  • Familiarity with NoSQL databases like Cassandra or MongoDB.
  • Proficient in SQL for data querying and analysis.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Ability to work independently and effectively in a fast-paced environment.
Benefits

Competitive salary and benefits package.

Opportunity to work on cutting-edge technologies and solve complex challenges.

Dynamic and collaborative work environment with opportunities for growth and career advancement.

Regular training and professional development opportunities.


Big Data Engineer

Mumbai, Maharashtra Hexaware Technologies

Posted today

Job Description

Description:
  • 4 or more years of experience working directly with enterprise data solutions
  • Hands-on experience working in a public cloud environment and on-prem infrastructure
  • Specialty in columnar databases like Redshift Spectrum and the AWS cloud infrastructure services (Redshift, S3, Lambda)
  • Excellent SQL skills and Python coding are a must
  • Experience with a wide variety of modern data processing technologies, including:
  • Expert in commonly used AWS services (S3, Lambda, Redshift, Glue, Athena)
  • Expert in columnar databases, primarily Redshift
  • Expertise in Python is a must-have
  • Big Data stack (Spark, Spectrum)
  • Data streaming (Kafka); see the streaming sketch after this list
  • Knowledge of time-series data stores like Apache Pinot or Druid is a plus
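
To illustrate the Kafka streaming item above, here is a minimal Spark Structured Streaming sketch in Scala that consumes a topic and lands it as Parquet on S3. The broker address, topic, and paths are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.streaming.Trigger

object KafkaToS3Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-to-s3-sketch").getOrCreate()

    // Consume a hypothetical clickstream topic; Kafka payloads arrive as binary key/value columns.
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "clickstream")
      .option("startingOffsets", "latest")
      .load()
      .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))

    // Land micro-batches as Parquet; the checkpoint location makes the query restartable.
    val query = stream.writeStream
      .format("parquet")
      .option("path", "s3a://data-lake/clickstream/")
      .option("checkpointLocation", "s3a://data-lake/_checkpoints/clickstream/")
      .trigger(Trigger.ProcessingTime("1 minute"))
      .start()

    query.awaitTermination()
  }
}
```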

    Senior Big Data Engineer

    Mumbai, Maharashtra The Nielsen Company

    Posted today

    Job Description

At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future.

Software Engineer

We are looking for both Senior Software Engineers and Software Engineers to join our Gracenote Tech team. The ideal candidates would have a passion for Clean Code, scalable architectures, Test Driven Development and DevOps.

    Job Purpose:

  • Develop and enhance our flagship Video, Audio, Automotive and Sports metadata software solutions.
  • Design applications with a Platform-first mentality where scale, consistency and reliability are at the core of every decision.

Job Description:

  • Evaluate, contribute to, and leverage open source technologies including Kubernetes, Trino, Spark, Airflow, Prometheus + friends, ELK, Jupyter, and more in order to create and enhance our internal SaaS/PaaS platforms.
  • Design software with a Product-owner mentality: your software is the product and as such you own the product as much as any business person does.
  • Work within small, highly configurable teams to deliver software in rapid increments using an iterative approach.
  • Develop applications to catalog and power the world’s largest entertainment metadata repository across the Video, Audio, Automotive and Sports verticals.
  • Interact with Product, Editorial and Client Experience teams to constantly refine the Gracenote offering.

Role Requirements:

  • Passionate about software development and DevOps
  • Competency in SQL and scripting in bash or some other scripting language
  • 5+ years of work experience with a programming language such as C++, Java, Python, Scala, Go, Pyspark, Hadoop
  • A solid grasp of computer science fundamentals: data structures, algorithms, memory management and distributed systems
  • Experience with Unix/Linux based platforms
  • A degree in computer science, engineering, math or related fields

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @ address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

    Senior big data engineer

    Thane, Maharashtra Veltris

    Posted 1 day ago

    Job Description

Employment Type: Permanent
Veltris is a Digital Product Engineering Services partner committed to driving technology-enabled transformation across enterprises, businesses, and industries. We specialize in delivering next-generation solutions for sectors including healthcare, technology, communications, manufacturing, and finance. With a focus on innovation and acceleration, Veltris empowers clients to build, modernize, and scale intelligent products that deliver connected, AI-powered experiences. Our experience-centric approach, agile methodologies, and exceptional talent enable us to streamline product development, maximize platform ROI, and drive meaningful business outcomes across both digital and physical ecosystems. In a strategic move to strengthen our healthcare offerings and expand industry capabilities, Veltris has acquired BPK Technologies. This acquisition enhances our domain expertise, broadens our go-to-market strategy, and positions us to deliver even greater value to enterprise and mid-market clients in healthcare and beyond.

Position: Senior Big Data Engineer

  • Must have Big Data analytics platform experience.
  • Key stacks: Spark, Druid, Drill, ClickHouse.
  • 8+ years of experience in Python/Java, CI/CD, infrastructure & cloud, and Terraform, plus depth in:
  • Big Data pipelines: Spark, Kafka, Glue, EMR, Hudi, Schema Registry, Data Lineage.
  • Graph DBs: Neo4j, Neptune, JanusGraph, Dgraph.

Preferred Qualifications:

  • Master's degree (M.Tech/MS) or Ph.D. in Computer Science, Information Technology, Data Science, Artificial Intelligence, Machine Learning, Software Engineering, or a related technical field.
  • Candidates with an equivalent combination of education and relevant industry experience will also be considered.

Disclaimer: The information provided herein is for general informational purposes only and reflects the current strategic direction and service offerings of Veltris. While we strive for accuracy, Veltris makes no representations or warranties regarding the completeness, reliability, or suitability of the information for any specific purpose. Any statements related to business growth, acquisitions, or future plans, including the acquisition of BPK Technologies, are subject to change without notice and do not constitute a binding commitment. Veltris reserves the right to modify its strategies, services, or business relationships at its sole discretion. For the most up-to-date and detailed information, please contact Veltris directly.

    GCP Big Data Engineer

    Mumbai, Maharashtra Talentmatics

    Posted 4 days ago

    Job Description

    We are seeking an experienced GCP Big Data Engineer with 8–10 years of expertise in designing, developing, and optimizing large-scale data processing solutions. The ideal candidate will bring strong leadership capabilities, technical depth, and a proven track record of delivering end-to-end big data solutions in cloud environments.

    Key Responsibilities:-

    • Lead and mentor teams in designing scalable and efficient ETL pipelines on Google Cloud Platform (GCP).
    • Drive best practices for data modeling, data integration, and data quality management.
    • Collaborate with stakeholders to define data engineering strategies aligned with business goals.
    • Ensure high performance, scalability, and reliability in data systems using SQL and PySpark.

    Must-Have Skills:-

    • GCP expertise in data engineering services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage).
    • Strong programming in SQL & PySpark.
    • Hands-on experience in ETL pipeline design, development, and optimization (a minimal sketch follows this list).
    • Strong problem-solving and leadership skills with experience guiding data engineering teams.
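
As a rough sketch of the kind of pipeline step implied above (written in Scala for consistency with the other examples on this page; a PySpark version would be analogous), the job below reads Parquet from a GCS bucket, aggregates, and writes the result back to GCS, as it might run on Dataproc. The bucket names and columns are hypothetical, and loading into BigQuery would typically go through the Spark BigQuery connector instead.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, sum}

object GcsAggregateSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("gcs-aggregate-sketch").getOrCreate()

    // Dataproc clusters ship with the GCS connector, so gs:// paths can be read directly.
    val sales = spark.read.parquet("gs://example-raw/sales/")

    // Simple aggregation step; real pipelines would add partitioning and data-quality checks.
    val byRegion = sales
      .groupBy(col("region"))
      .agg(sum(col("amount")).as("total_amount"))

    byRegion.write.mode("overwrite").parquet("gs://example-curated/sales_by_region/")
    spark.stop()
  }
}
```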

    Qualification:-

    • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
    • Relevant certifications in GCP Data Engineering preferred.
    This advertiser has chosen not to accept applicants from your region.

    GCP Big Data Engineer

    Thane, Maharashtra Talentmatics

    Posted 4 days ago

    Job Description

    We are seeking an experienced GCP Big Data Engineer with 8–10 years of expertise in designing, developing, and optimizing large-scale data processing solutions. The ideal candidate will bring strong leadership capabilities, technical depth, and a proven track record of delivering end-to-end big data solutions in cloud environments.

    Key Responsibilities:-

    • Lead and mentor teams in designing scalable and efficient ETL pipelines on Google Cloud Platform (GCP).
    • Drive best practices for data modeling, data integration, and data quality management.
    • Collaborate with stakeholders to define data engineering strategies aligned with business goals.
    • Ensure high performance, scalability, and reliability in data systems using SQL and PySpark.

    Must-Have Skills:-

    • GCP expertise in data engineering services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage).
    • Strong programming in SQL & PySpark.
    • Hands-on experience in ETL pipeline design, development, and optimization.
    • Strong problem-solving and leadership skills with experience guiding data engineering teams.

    Qualification:-

    • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
    • Relevant certifications in GCP Data Engineering preferred.