Data Scientists

Gurgaon, Haryana | Think Future Technologies

Posted today

Job Description

**Experience**: 3+ years

**Location**: Gurgaon/Remote

**Skill-sets**: Python | Deep Learning | ML | TensorFlow | PyTorch | Image Processing | Model Architecture
- 3+ years of experience in the field of Data Science
- Excellent knowledge of SQL
- Good knowledge of Python
- Working experience across the data science project life cycle, from use-case framing and data collection to data exploration, model building, and deployment
- Hands-on experience with deep learning libraries such as TensorFlow and PyTorch (a minimal sketch follows this list)
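
Below is a minimal, hypothetical PyTorch sketch of the kind of image-classification model architecture the skill list above points to. The layer widths, class count, and dummy input are illustrative assumptions, not details taken from the posting.

```python
# Minimal PyTorch sketch: a small CNN for image classification.
# Layer widths, class count, and the dummy batch are illustrative assumptions.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):  # class count is assumed
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = SmallCNN()
    dummy = torch.randn(4, 3, 64, 64)   # batch of 4 RGB images, 64x64 pixels
    logits = model(dummy)
    loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 10, (4,)))
    loss.backward()                      # one illustrative backward pass
    print(logits.shape, float(loss))
```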

Big Data Administrator

Noida, Uttar Pradesh | Anicalls (Pty) Ltd

Posted today

Job Description

• Experience in administration of Hadoop big data tools
• Experience working on batch processing and tools in the Hadoop technical stack (e.g., MapReduce, YARN, Hive, HDFS, Oozie)
• Experience in Ambari setup and management is required
• 1 to 2 years of MapR cluster management/administration
• 2+ years of administration experience with tools in the stream-processing technical stack (e.g., Kudu, Spark, Kafka, Avro)
• Hadoop administration experience with NoSQL stores (especially HBase)
• Hands-on experience monitoring and reporting on Hadoop resource utilization and troubleshooting issues (see the sketch after this list)
• Hands-on experience supporting code deployments (Spark, Hive, Ab Initio, etc.) into the Hadoop cluster
• 3+ years as a systems integrator with Linux (SUSE, Ubuntu) systems and shell scripting
• 2+ years of DevOps tool administration (Docker, Ansible, Kubernetes, Mesos)
• Certification in MapR and Linux administration highly preferred; Cloudera certification also preferred
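
As one possible illustration of the resource-utilization monitoring bullet above, the sketch below polls the YARN ResourceManager cluster-metrics REST endpoint and prints a one-line summary. The hostname, port, and the choice of this endpoint are assumptions for illustration, not the employer's actual tooling.

```python
# Minimal monitoring sketch (one possible approach, not the employer's tooling):
# poll the YARN ResourceManager cluster-metrics REST endpoint and report
# basic memory and application utilization.
import json
import urllib.request

RM_URL = "http://resourcemanager.example.com:8088"  # placeholder host/port

def fetch_cluster_metrics(rm_url: str = RM_URL) -> dict:
    """Return the clusterMetrics block from the ResourceManager REST API."""
    with urllib.request.urlopen(f"{rm_url}/ws/v1/cluster/metrics", timeout=10) as resp:
        return json.load(resp)["clusterMetrics"]

def report(metrics: dict) -> str:
    used_mb = metrics.get("allocatedMB", 0)
    total_mb = metrics.get("totalMB", 0)
    pct = 100.0 * used_mb / total_mb if total_mb else 0.0
    return (f"apps running={metrics.get('appsRunning')} "
            f"pending={metrics.get('appsPending')} "
            f"memory={used_mb}/{total_mb} MB ({pct:.1f}% used) "
            f"active nodes={metrics.get('activeNodes')}")

if __name__ == "__main__":
    print(report(fetch_cluster_metrics()))
```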

Big Data Specialist

Delhi, Delhi | Brillio

Posted 4 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines (a minimal sketch follows this list).
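
The snippet below is a minimal, hypothetical PySpark Structured Streaming sketch combining several of the must-have skills (Python, Spark streaming, Kafka). The broker address, topic name, and windowing logic are assumptions for illustration; they are not details of the role.

```python
# Minimal PySpark sketch of a Kafka -> Spark Structured Streaming job.
# Broker address, topic name, and output sink are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("kafka_stream_example")   # assumed app name
    .getOrCreate()
)

# Read a stream of events from Kafka; requires the spark-sql-kafka connector
# on the classpath (e.g., via --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<version>).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")  # placeholder
    .option("subscribe", "clickstream")                            # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast to string and count events per minute.
counts = (
    events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

# Write the running aggregation to the console; a real job would target
# Hive/HDFS/object storage with checkpointing enabled.
query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```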


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow; a minimal DAG sketch follows this list).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).
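
For the orchestration bullet above, here is a minimal, hypothetical Airflow DAG sketch (assuming Airflow 2.4+, where `DAG` accepts a `schedule` argument). The DAG id, schedule, and task bodies are placeholders, not details from the posting.

```python
# Minimal, hypothetical Airflow DAG sketch (assumes Airflow 2.4+).
# DAG id, schedule, and task bodies are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    # Placeholder extract step; a real pipeline might pull from Kafka or Hive.
    print("extracting raw batch")

def transform() -> None:
    # Placeholder transform step; a real pipeline might launch a Spark job here.
    print("transforming batch")

with DAG(
    dag_id="example_daily_pipeline",   # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task     # run transform after extract
```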

Big Data Specialist

Noida, Uttar Pradesh | Brillio

Posted 4 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).

Big Data Specialist

Ghaziabad, Uttar Pradesh | Brillio

Posted 4 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).

Big Data Specialist

Gurgaon, Haryana | Brillio

Posted 4 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).

Big Data Specialist

New Delhi, Delhi | Brillio

Posted 4 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).

Big Data Specialist

Faridabad, Haryana | Brillio

Posted 4 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).