239 Data Scientist jobs in Kochi

Big Data Developer

Kochi, Kerala | Ravant Media

Posted today

Job Description

We are seeking a highly skilled Big Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines that handle high-volume financial data, including stocks, cryptocurrencies, and third-party data sources. You will play a critical role in ensuring data integrity, scalability, and real-time availability across our platforms.


Key Responsibilities:

  • Design, develop, and manage end-to-end data pipelines for stocks, crypto, and other financial datasets.
  • Integrate third-party APIs and data feeds into internal systems.
  • Build and optimize data ingestion, storage, and transformation workflows (batch and real-time).
  • Ensure data quality, consistency, and reliability across all pipelines.
  • Collaborate with data scientists, analysts, and backend engineers to provide clean, structured, and scalable datasets.
  • Monitor, troubleshoot, and optimize pipeline performance.
  • Implement ETL/ELT best practices, data governance, and security protocols.
  • Contribute to the scalability and automation of our data infrastructure.


Requirements:

  • Proven experience as a Big Data Engineer / Data Engineer (preferably in financial or crypto domains).
  • Strong expertise in Python, SQL, and distributed data systems.
  • Hands-on experience with data pipeline tools (e.g., Apache Spark, Kafka, Airflow, Flink, Prefect).
  • Experience with cloud platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift, etc.).
  • Knowledge of API integrations and handling real-time streaming data.
  • Familiarity with databases (relational and NoSQL) and data modeling.
  • Solid understanding of stocks, cryptocurrencies, and financial data structures (preferred).
  • Strong problem-solving skills with the ability to handle large-scale data challenges.
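
For a concrete flavour of the real-time ingestion work described in the responsibilities above, the sketch below shows a minimal Spark Structured Streaming job that reads crypto trade ticks from a Kafka topic and lands them as date-partitioned Parquet. This is an illustrative sketch only: the broker address, topic name, tick schema, and storage paths are assumptions, not details from the posting.

```python
# Minimal sketch: ingest crypto trade ticks from Kafka into partitioned Parquet.
# Broker, topic, schema, and paths below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("crypto-tick-ingest").getOrCreate()

# Assumed shape of each JSON tick message.
tick_schema = StructType([
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("volume", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
       .option("subscribe", "crypto.trades")               # assumed topic
       .load())

ticks = (raw.select(F.from_json(F.col("value").cast("string"), tick_schema).alias("t"))
         .select("t.*")
         .withColumn("trade_date", F.to_date("event_time")))

# Date-partitioned Parquet sink; the checkpoint makes the query restartable.
query = (ticks.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/crypto/ticks/")
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/ticks/")
         .partitionBy("trade_date")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```

Partitioning by trade date keeps downstream batch queries cheap, while the one-minute trigger bounds latency without producing a flood of tiny files.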

Big Data Developer

Ernakulam, Kerala | Ravant Media

Posted 2 days ago

Job Description

We are seeking a highly skilled Big Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines that handle high-volume financial data, including stocks, cryptocurrencies, and third-party data sources. You will play a critical role in ensuring data integrity, scalability, and real-time availability across our platforms.


Key Responsibilities:

  • Design, develop, and manage end-to-end data pipelines for stocks, crypto, and other financial datasets.
  • Integrate third-party APIs and data feeds into internal systems.
  • Build and optimize data ingestion, storage, and transformation workflows (batch and real-time).
  • Ensure data quality, consistency, and reliability across all pipelines.
  • Collaborate with data scientists, analysts, and backend engineers to provide clean, structured, and scalable datasets.
  • Monitor, troubleshoot, and optimize pipeline performance.
  • Implement ETL/ELT best practices, data governance, and security protocols.
  • Contribute to the scalability and automation of our data infrastructure.


Requirements:

  • Proven experience as a Big Data Engineer / Data Engineer (preferably in financial or crypto domains).
  • Strong expertise in Python, SQL, and distributed data systems.
  • Hands-on experience with data pipeline tools (e.g., Apache Spark, Kafka, Airflow, Flink, Prefect).
  • Experience with cloud platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift, etc.).
  • Knowledge of API integrations and handling real-time streaming data.
  • Familiarity with databases (relational and NoSQL) and data modeling.
  • Solid understanding of stocks, cryptocurrencies, and financial data structures (preferred).
  • Strong problem-solving skills with the ability to handle large-scale data challenges.

Big Data Developer

Kochi, Kerala | Ravant Media

Posted 2 days ago

Job Description

We are seeking a highly skilled Big Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines that handle high-volume financial data, including stocks, cryptocurrencies, and third-party data sources. You will play a critical role in ensuring data integrity, scalability, and real-time availability across our platforms.


Key Responsibilities:

  • Design, develop, and manage end-to-end data pipelines for stocks, crypto, and other financial datasets.
  • Integrate third-party APIs and data feeds into internal systems.
  • Build and optimize data ingestion, storage, and transformation workflows (batch and real-time).
  • Ensure data quality, consistency, and reliability across all pipelines.
  • Collaborate with data scientists, analysts, and backend engineers to provide clean, structured, and scalable datasets.
  • Monitor, troubleshoot, and optimize pipeline performance.
  • Implement ETL/ELT best practices, data governance, and security protocols.
  • Contribute to the scalability and automation of our data infrastructure.


Requirements:

  • Proven experience as a Big Data Engineer / Data Engineer (preferably in financial or crypto domains).
  • Strong expertise in Python, SQL, and distributed data systems.
  • Hands-on experience with data pipeline tools (e.g., Apache Spark, Kafka, Airflow, Flink, Prefect).
  • Experience with cloud platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift, etc.).
  • Knowledge of API integrations and handling real-time streaming data.
  • Familiarity with databases (relational and NoSQL) and data modeling.
  • Solid understanding of stocks, cryptocurrencies, and financial data structures (preferred).
  • Strong problem-solving skills with the ability to handle large-scale data challenges.

Big Data Specialist

Ernakulam, Kerala | Brillio

Posted 5 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).
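
As a rough illustration of the Hive + Spark batch work called out in the must-have skills, here is a small PySpark job that aggregates one day of an events table and writes the result back to the warehouse. The database, table, and column names are assumptions made up for the example.

```python
# Minimal sketch: daily batch aggregation over a Hive table with Spark.
# Database/table/column names (analytics.events, user_id, amount, dt) are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("daily-user-aggregates")
         .enableHiveSupport()          # read/write tables registered in the Hive metastore
         .getOrCreate())

daily = (spark.table("analytics.events")
         .where(F.col("dt") == "2024-01-01")          # one partition of the source table
         .groupBy("user_id")
         .agg(F.count("*").alias("event_count"),
              F.sum("amount").alias("total_amount"))
         .withColumn("dt", F.lit("2024-01-01")))

# Persist the result as a Hive table that analysts can query directly.
daily.write.mode("overwrite").saveAsTable("analytics.daily_user_aggregates")
```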

Big Data Specialist

Kochi, Kerala | Brillio

Posted 5 days ago

Job Description

Role Overview

We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
  • Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
  • Write efficient, high-performance SQL queries to extract, transform, and load data.
  • Develop reusable data frameworks and utilities in Python.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
  • Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.


Must-Have Skills

  • Strong hands-on experience with Hive and SQL for querying and data transformation.
  • Proficiency in Python for data manipulation and automation.
  • Expertise in Apache Spark (batch and streaming).
  • Experience working with Kafka for streaming data pipelines.


Good-to-Have Skills

  • Experience with workflow orchestration tools (e.g., Airflow).
  • Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
  • Familiarity with CI/CD pipelines and version control (Git).

Big Data Engineer - Scala

Kochi, Kerala | Idyllic Services

Posted today

Job Description

Job Title: Big Data Engineer – Scala

Location: Bangalore, Chennai, Gurgaon, Pune, Mumbai.

Experience: 7–10 Years (Minimum 3+ years in Scala)

Notice Period: Immediate to 30 Days

Mode of Work: Hybrid


Click the link below to learn more about the role and take the AI Interview to begin your application journey:


Role Overview

We are looking for a highly skilled Big Data Engineer (Scala) with strong expertise in Scala, Spark, Python, NiFi, and Apache Kafka to join our data engineering team. The ideal candidate will have a proven track record in building, scaling, and optimizing big data pipelines, and hands-on experience in distributed data systems and cloud-based solutions.


Key Responsibilities

- Design, develop, and optimize large-scale data pipelines and distributed data processing systems.

- Work extensively with Scala, Spark (PySpark), and Python for data processing and transformation.

- Develop and integrate streaming solutions using Apache Kafka and orchestration tools like NiFi / Airflow.

- Write efficient queries and perform data analysis using Jupyter Notebooks and SQL.

- Collaborate with cross-functional teams to design scalable cloud-based data architectures.

- Ensure delivery of high-quality code through code reviews, performance tuning, and best practices.

- Build monitoring and alerting systems leveraging Splunk or equivalent tools.

- Participate in CI/CD workflows using Git, Jenkins, and other DevOps tools.

- Contribute to product development with a focus on scalability, maintainability, and performance.


Mandatory Skills

- Scala – Minimum 3+ years of hands-on experience.

- Strong expertise in Spark (PySpark) and Python.

- Hands-on experience with Apache Kafka.

- Knowledge of NiFi / Airflow for orchestration.

- Strong experience in Distributed Data Systems (5+ years).

- Proficiency in SQL and query optimization.

- Good understanding of Cloud Architecture.


Preferred Skills

- Exposure to messaging technologies like Apache Kafka or equivalent.

- Experience in designing intuitive, responsive UIs for data analytics visualization.

- Familiarity with Splunk or other monitoring/alerting solutions.

- Hands-on experience with CI/CD tools (Git, Jenkins).

- Strong grasp of software engineering concepts, data modeling, and optimization techniques.
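
Because NiFi / Airflow orchestration is listed as a mandatory skill, the sketch below shows a bare-bones Airflow DAG (DAGs are authored in Python even when the jobs themselves are Scala/Spark) that submits a compiled Spark application once a day. The DAG id, schedule, jar path, and main class are assumptions for illustration only.

```python
# Minimal sketch: a daily Airflow DAG that submits a compiled Scala/Spark job.
# DAG id, schedule, jar path, and main class are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="market_data_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:

    # spark-submit the assembled jar for the logical date ({{ ds }} is templated by Airflow).
    transform_market_data = BashOperator(
        task_id="transform_market_data",
        bash_command=(
            "spark-submit --class com.example.MarketDataJob "
            "/opt/jobs/market-data-assembly.jar {{ ds }}"
        ),
    )
```

In practice the bash task could be swapped for the SparkSubmitOperator from the Spark provider package; the BashOperator simply keeps the sketch dependency-free.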


Big Data Engineer - Scala

Kochi, Kerala | Idyllic Services

Posted 3 days ago

Job Description

Job Title: Big Data Engineer – Scala

Location: Bangalore, Chennai, Gurgaon, Pune, Mumbai.

Experience: 7–10 Years (Minimum 3+ years in Scala)

Notice Period: Immediate to 30 Days

Mode of Work: Hybrid


Click the link below to learn more about the role and take the AI Interview to begin your application journey:


Role Overview

We are looking for a highly skilled Big Data Engineer (Scala) with strong expertise in Scala, Spark, Python, NiFi, and Apache Kafka to join our data engineering team. The ideal candidate will have a proven track record in building, scaling, and optimizing big data pipelines, and hands-on experience in distributed data systems and cloud-based solutions.


Key Responsibilities

- Design, develop, and optimize large-scale data pipelines and distributed data processing systems.

- Work extensively with Scala, Spark (PySpark), and Python for data processing and transformation.

- Develop and integrate streaming solutions using Apache Kafka and orchestration tools like NiFi / Airflow.

- Write efficient queries and perform data analysis using Jupyter Notebooks and SQL.

- Collaborate with cross-functional teams to design scalable cloud-based data architectures.

- Ensure delivery of high-quality code through code reviews, performance tuning, and best practices.

- Build monitoring and alerting systems leveraging Splunk or equivalent tools.

- Participate in CI/CD workflows using Git, Jenkins, and other DevOps tools.

- Contribute to product development with a focus on scalability, maintainability, and performance.


Mandatory Skills

- Scala – Minimum 3+ years of hands-on experience.

- Strong expertise in Spark (PySpark) and Python.

- Hands-on experience with Apache Kafka.

- Knowledge of NiFi / Airflow for orchestration.

- Strong experience in Distributed Data Systems (5+ years).

- Proficiency in SQL and query optimization.

- Good understanding of Cloud Architecture.


Preferred Skills

- Exposure to messaging technologies like Apache Kafka or equivalent.

- Experience in designing intuitive, responsive UIs for data analytics visualization.

- Familiarity with Splunk or other monitoring/alerting solutions.

- Hands-on experience with CI/CD tools (Git, Jenkins).

- Strong grasp of software engineering concepts, data modeling, and optimization techniques.


Big Data Engineer - Scala

Ernakulam, Kerala | Idyllic Services

Posted 3 days ago

Job Description

Job Title: Big Data Engineer – Scala

Location: Bangalore, Chennai, Gurgaon, Pune, Mumbai.

Experience: 7–10 Years (Minimum 3+ years in Scala)

Notice Period: Immediate to 30 Days

Mode of Work: Hybrid


Click the link below to learn more about the role and take the AI Interview to begin your application journey:


Role Overview

We are looking for a highly skilled Big Data Engineer (Scala) with strong expertise in Scala, Spark, Python, NiFi, and Apache Kafka to join our data engineering team. The ideal candidate will have a proven track record in building, scaling, and optimizing big data pipelines, and hands-on experience in distributed data systems and cloud-based solutions.


Key Responsibilities

- Design, develop, and optimize large-scale data pipelines and distributed data processing systems.

- Work extensively with Scala, Spark (PySpark), and Python for data processing and transformation.

- Develop and integrate streaming solutions using Apache Kafka and orchestration tools like NiFi / Airflow.

- Write efficient queries and perform data analysis using Jupyter Notebooks and SQL.

- Collaborate with cross-functional teams to design scalable cloud-based data architectures.

- Ensure delivery of high-quality code through code reviews, performance tuning, and best practices.

- Build monitoring and alerting systems leveraging Splunk or equivalent tools.

- Participate in CI/CD workflows using Git, Jenkins, and other DevOps tools.

- Contribute to product development with a focus on scalability, maintainability, and performance.


Mandatory Skills

- Scala – Minimum 3+ years of hands-on experience.

- Strong expertise in Spark (PySpark) and Python.

- Hands-on experience with Apache Kafka.

- Knowledge of NiFi / Airflow for orchestration.

- Strong experience in Distributed Data Systems (5+ years).

- Proficiency in SQL and query optimization.

- Good understanding of Cloud Architecture.


Preferred Skills

- Exposure to messaging technologies like Apache Kafka or equivalent.

- Experience in designing intuitive, responsive UIs for data analytics visualization.

- Familiarity with Splunk or other monitoring/alerting solutions.

- Hands-on experience with CI/CD tools (Git, Jenkins).

- Strong grasp of software engineering concepts, data modeling, and optimization techniques.


Scala Big Data Lead Engineer - 7 YoE - Immediate Joiner - Any UST Location

Ernakulam, Kerala | UST

Posted 16 days ago

Job Description

If you are highly interested and available immediately, please submit your resume along with your total experience, current CTC, notice period, and current location details to


Key Responsibilities:

  • Design, develop, and optimize data pipelines and ETL workflows.
  • Work with Apache Hadoop, Airflow, Kubernetes, and Containers to streamline data processing.
  • Implement data analytics and mining techniques to drive business insights.
  • Manage cloud-based big data solutions on GCP and Azure.
  • Troubleshoot Hadoop log files and work with multiple data processing engines for scalable data solutions.

Required Skills & Qualifications:

  • Proficiency in Scala, Spark, PySpark, Python, and SQL.
  • Strong hands-on experience with the Hadoop ecosystem, Hive, Pig, and MapReduce.
  • Experience in ETL, Data Warehouse Design, and Data Cleansing.
  • Familiarity with data pipeline orchestration tools like Apache Airflow.
  • Knowledge of Kubernetes, Containers, and cloud platforms such as GCP and Azure.

If you are a seasoned big data engineer with a passion for Scala and cloud technologies, we invite you to apply for this exciting opportunity!
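
For a concrete flavour of the ETL and data-cleansing responsibilities above, here is a small PySpark sketch that standardises a raw customer extract, drops rows missing the business key, and keeps only the latest record per customer before landing it for the warehouse load. Paths and column names are assumptions made for the example.

```python
# Minimal sketch: cleanse a raw extract (normalise, filter, dedupe) before loading.
# Input/output paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("customer-cleansing").getOrCreate()

raw = spark.read.option("header", True).csv("gs://example-raw/customers/2024-01-01/")

# Latest record first within each customer, by update timestamp.
latest_first = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())

cleaned = (raw
           .withColumn("email", F.lower(F.trim(F.col("email"))))       # normalise emails
           .withColumn("country", F.upper(F.trim(F.col("country"))))   # normalise country codes
           .where(F.col("customer_id").isNotNull())                    # drop rows missing the key
           .withColumn("rn", F.row_number().over(latest_first))
           .where(F.col("rn") == 1)                                    # one row per customer
           .drop("rn"))

# Land the cleansed snapshot as Parquet for the downstream warehouse load.
cleaned.write.mode("overwrite").parquet("gs://example-staging/customers/2024-01-01/")
```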


Scala Big Data Lead Engineer - 7 YoE - Immediate Joiner - Any UST Location

Kochi, Kerala | UST

Posted 16 days ago

Job Description

If you are highly interested and available immediately, please submit your resume along with your total experience, current CTC, notice period, and current location details to


Key Responsibilities:

  • Design, develop, and optimize data pipelines and ETL workflows.
  • Work with Apache Hadoop, Airflow, Kubernetes, and Containers to streamline data processing.
  • Implement data analytics and mining techniques to drive business insights.
  • Manage cloud-based big data solutions on GCP and Azure.
  • Troubleshoot Hadoop log files and work with multiple data processing engines for scalable data solutions.

Required Skills & Qualifications:

  • Proficiency in Scala, Spark, PySpark, Python, and SQL.
  • Strong hands-on experience with the Hadoop ecosystem, Hive, Pig, and MapReduce.
  • Experience in ETL, Data Warehouse Design, and Data Cleansing.
  • Familiarity with data pipeline orchestration tools like Apache Airflow.
  • Knowledge of Kubernetes, Containers, and cloud platforms such as GCP and Azure.

If you are a seasoned big data engineer with a passion for Scala and cloud technologies, we invite you to apply for this exciting opportunity!

 
