557 Data Scientist jobs in Noida

Big Data Engineer

Noida, Uttar Pradesh Training Basket

Posted today


Job Description

We are looking for passionate B.Tech freshers with strong programming skills in Java who are eager to start their careers in Big Data technologies. The role offers exciting opportunities to work on real-time big data projects, data pipelines, and cloud-based data solutions.


Responsibilities
  • Assist in designing, developing, and maintaining big data solutions.

  • Write efficient code in Java and integrate with big data frameworks.

  • Support the building of data ingestion, transformation, and processing pipelines.

  • Work with distributed systems and learn technologies such as Hadoop, Spark, Kafka, Hive, and HBase.

  • Collaborate with senior engineers on data-related problem-solving and performance optimization.

  • Participate in debugging, testing, and documentation of big data workflows.
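The ingestion, transformation, and processing stages named above can be sketched in miniature. This is a toy illustration only (shown in Python for brevity; the record schema and per-user latency metric are invented, not the advertiser's actual stack):

```python
import json

# Toy JSON lines standing in for an ingested batch (hypothetical schema).
RAW_BATCH = [
    '{"user": "a", "event": "click", "ms": 120}',
    '{"user": "b", "event": "click", "ms": 340}',
    'not-json',  # malformed input the ingestion stage must tolerate
    '{"user": "a", "event": "view", "ms": 80}',
]

def ingest(lines):
    """Parse raw JSON lines, dropping records that fail to parse."""
    parsed = []
    for line in lines:
        try:
            parsed.append(json.loads(line))
        except json.JSONDecodeError:
            continue
    return parsed

def transform(records):
    """Aggregate total latency (ms) per user -- the processing stage."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0) + rec["ms"]
    return totals

print(transform(ingest(RAW_BATCH)))  # {'a': 200, 'b': 340}
```

Frameworks such as Spark express the same parse-then-aggregate shape over distributed datasets rather than in-memory lists.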

Required Skills:
  • Strong knowledge of Core Java and OOP concepts.

  • Good understanding of SQL and database concepts.

  • Familiarity with data structures and algorithms.

  • Basic knowledge of Big Data frameworks (Hadoop/Spark/Kafka) is an added advantage.

  • Problem-solving skills and eagerness to learn new technologies.

Eligibility Criteria:
  • Education: B.Tech (CSE/IT or related fields).

  • Batch: (specific, e.g., 2024/2025 pass-outs).

  • Experience: Fresher (0–1 year).



Benefits
  • Training and mentoring in cutting-edge Big Data tools and technologies.

  • Exposure to live projects from day one.

  • A fast-paced, learning-oriented work culture.





Big Data Administrator

Noida, Uttar Pradesh Anicalls (Pty) Ltd

Posted today


Job Description

  • Experience in administration of Hadoop Big Data tools.
  • Experience working on batch processing and tools in the Hadoop technical stack (e.g., MapReduce, YARN, Hive, HDFS, Oozie).
  • Must have experience in Ambari setup and management.
  • 1 to 2 years of MapR cluster management/administration.
  • 2+ years of administration experience working with tools in the stream-processing technical stack (e.g., Kudu, Spark, Kafka, Avro).
  • Hadoop administration experience with NoSQL stores (especially HBase).
  • Hands-on experience monitoring, reporting on, and troubleshooting Hadoop resource utilization.
  • Hands-on experience supporting code deployments (Spark, Hive, Ab Initio, etc.) into the Hadoop cluster.
  • 3+ years as a systems integrator with Linux (SUSE, Ubuntu) systems and shell scripting.
  • 2+ years of DevOps tool administration (Docker, Ansible, Kubernetes, Mesos).
  • Certifications in MapR and Linux administration highly preferred; Cloudera certification preferred.
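To give a flavour of the monitoring-and-reporting duty above, here is a toy utilization check. The report format is invented for illustration (a real cluster would expose this data through the resource manager's tooling or APIs):

```python
# Hypothetical fixed-width node report -- format is illustrative only.
REPORT = """\
node01 RUNNING 48 64
node02 RUNNING 60 64
node03 LOST    0  64
"""

def over_utilized(report, threshold=0.85):
    """Return running nodes whose used/total memory ratio exceeds threshold."""
    hot = []
    for line in report.strip().splitlines():
        name, state, used_gb, total_gb = line.split()
        if state == "RUNNING" and int(used_gb) / int(total_gb) > threshold:
            hot.append(name)
    return hot

print(over_utilized(REPORT))  # ['node02']
```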

Big Data Developer

New Delhi, Delhi Ravant Media

Posted 8 days ago


Job Description

We are seeking a highly skilled Big Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines that handle high-volume financial data, including stocks, cryptocurrencies, and third-party data sources. You will play a critical role in ensuring data integrity, scalability, and real-time availability across our platforms.


Key Responsibilities:

  • Design, develop, and manage end-to-end data pipelines for stocks, crypto, and other financial datasets.
  • Integrate third-party APIs and data feeds into internal systems.
  • Build and optimize data ingestion, storage, and transformation workflows (batch and real-time).
  • Ensure data quality, consistency, and reliability across all pipelines.
  • Collaborate with data scientists, analysts, and backend engineers to provide clean, structured, and scalable datasets.
  • Monitor, troubleshoot, and optimize pipeline performance.
  • Implement ETL/ELT best practices, data governance, and security protocols.
  • Contribute to the scalability and automation of our data infrastructure.
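A core part of integrating third-party feeds, as the responsibilities above describe, is normalizing records that report the same instrument in different shapes. A minimal sketch (both feed payloads below are hypothetical, not real vendor formats):

```python
from datetime import datetime, timezone

# Two hypothetical third-party feeds reporting the same instrument
# with different field names, types, and timestamp conventions.
FEED_A = {"sym": "BTC-USD", "px": "67123.5", "ts": 1700000000}
FEED_B = {"ticker": "BTCUSD", "last_price": 67125.0,
          "time": "2023-11-14T22:13:20Z"}

def normalize_a(rec):
    """Map feed A's epoch-seconds/string-price format to a common schema."""
    return {
        "symbol": rec["sym"].replace("-", ""),
        "price": float(rec["px"]),
        "ts": datetime.fromtimestamp(rec["ts"], tz=timezone.utc),
    }

def normalize_b(rec):
    """Map feed B's ISO-8601 format to the same common schema."""
    return {
        "symbol": rec["ticker"],
        "price": float(rec["last_price"]),
        "ts": datetime.fromisoformat(rec["time"].replace("Z", "+00:00")),
    }

rows = [normalize_a(FEED_A), normalize_b(FEED_B)]
assert rows[0]["symbol"] == rows[1]["symbol"] == "BTCUSD"
```

In a production pipeline these adapters would feed a streaming or batch framework; the point is that every source converges on one schema before downstream consumers see the data.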


Requirements:

  • Proven experience as a Big Data Engineer / Data Engineer (preferably in financial or crypto domains).
  • Strong expertise in Python, SQL, and distributed data systems.
  • Hands-on experience with data pipeline tools (e.g., Apache Spark, Kafka, Airflow, Flink, Prefect).
  • Experience with cloud platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift, etc.).
  • Knowledge of API integrations and handling real-time streaming data.
  • Familiarity with databases (relational and NoSQL) and data modeling.
  • Solid understanding of stocks, cryptocurrencies, and financial data structures (preferred).
  • Strong problem-solving skills with the ability to handle large-scale data challenges.

Big Data Developer

Faridabad, Haryana Ravant Media

Posted 8 days ago


Big Data Developer

Delhi, Delhi Ravant Media

Posted 8 days ago


Big Data Developer

Noida, Uttar Pradesh Ravant Media

Posted 8 days ago


Big Data Developer

Ghaziabad, Uttar Pradesh Ravant Media

Posted 8 days ago


Big Data Specialist

New Delhi, Delhi Brillio

Posted 11 days ago


Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.
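The "reusable data frameworks and utilities in Python" responsibility usually starts with small, generic helpers. A toy example of one such utility (the function name and schema-check behaviour are illustrative assumptions, not this team's actual library):

```python
def require_columns(rows, required):
    """Split rows into (valid, rejected) based on required keys --
    the kind of small, reusable check a pipeline utility library collects.
    Extra columns are allowed; only missing required keys reject a row."""
    valid, rejected = [], []
    for row in rows:
        (valid if required <= row.keys() else rejected).append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2},                                        # missing 'amount'
    {"id": 3, "amount": 5.5, "note": "extras ok"},
]
valid, rejected = require_columns(rows, {"id", "amount"})
print(len(valid), len(rejected))  # 2 1
```

Keeping such checks in one shared module, rather than re-implemented per pipeline, is what makes them a "framework" in practice.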


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).

Big Data Specialist

Faridabad, Haryana Brillio

Posted 11 days ago

