4,309 Research Data jobs in India

Research Data Engineer

Bengaluru, Karnataka Tookitaki

Posted 2 days ago


Job Description

Position Overview

Job Title: Software Development Engineer 2

Department: Technology

Location: Bangalore, India

Reporting To: Senior Research Manager - Data



Position Purpose

The Research Engineer – Data will play a pivotal role in advancing Tookitaki’s AI-driven compliance and financial crime prevention platforms through applied research, experimentation, and data innovation. This role is ideal for professionals who thrive at the intersection of research and engineering, turning cutting-edge data science concepts into production-ready capabilities that enhance Tookitaki’s competitive edge in fraud prevention, AML compliance, and data intelligence.

The role exists to bridge research and engineering by:

  • Designing and executing experiments on large, complex datasets.
  • Prototyping new data-driven algorithms for financial crime detection and compliance automation.
  • Collaborating across product, data science, and engineering teams to transition research outcomes into scalable, real-world solutions.
  • Ensuring the robustness, fairness, and explainability of AI models within Tookitaki’s compliance platform.


Key Responsibilities

Applied Research & Prototyping

  • Conduct literature reviews and competitive analysis to identify innovative approaches for data processing, analytics, and model development.
  • Build experimental frameworks to test hypotheses using real-world financial datasets.
  • Prototype algorithms in areas such as anomaly detection, graph-based analytics, and natural language processing for compliance workflows.

Data Engineering for Research

  • Develop data ingestion, transformation, and exploration pipelines to support experimentation.
  • Work with structured, semi-structured, and unstructured datasets at scale.
  • Ensure reproducibility and traceability of experiments.

Algorithm Evaluation & Optimization

  • Evaluate research prototypes using statistical, ML, and domain-specific metrics.
  • Optimize algorithms for accuracy, latency, and scalability.
  • Conduct robustness, fairness, and bias evaluations on models.

Collaboration & Integration

  • Partner with data scientists to transition validated research outcomes into production-ready code.
  • Work closely with product managers to align research priorities with business goals.
  • Collaborate with cloud engineering teams to deploy research pipelines in hybrid environments.

Documentation & Knowledge Sharing

  • Document experimental designs, results, and lessons learned.
  • Share best practices across engineering and data science teams to accelerate innovation.


Qualifications and Skills

Education

  • Required: Bachelor’s degree in Computer Science, Data Science, Applied Mathematics, or a related field.
  • Preferred: Master’s or PhD in Machine Learning, Data Engineering, or a related research-intensive field.

Experience

  • Minimum 4–7 years in data-centric engineering or applied research roles.
  • Proven track record of developing and validating algorithms for large-scale data processing or machine learning applications.
  • Experience in financial services, compliance, or fraud detection is a strong plus.


Technical Expertise

  • Programming: Proficiency in Scala, Java, or Python.
  • Data Processing: Experience with Spark, Hadoop, and Flink.
  • ML/Research Frameworks: Hands-on with TensorFlow, PyTorch, or Scikit-learn.
  • Databases: Experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Elasticsearch).
  • Cloud Platforms: Experience with AWS (preferred) or GCP for research and data pipelines.
  • Tools: Familiarity with experiment tracking tools such as MLflow or Weights & Biases.
  • Application Deployment: Strong experience with CI/CD practices and containerized deployments using Kubernetes and Docker.
  • Streaming Frameworks: Strong experience building highly performant, scalable real-time streaming applications with Kafka at the core.
  • Data Lakehouse: Experience with a modern data lakehouse platform/format such as Apache Hudi, Iceberg, or Paimon is a strong plus.


Soft Skills

  • Strong analytical and problem-solving abilities.
  • Clear, concise communication skills for cross-functional collaboration.
  • Adaptability in fast-paced, evolving environments.
  • Curiosity-driven with a bias towards experimentation and iteration.


Key Competencies

  • Innovation Mindset: Ability to explore and test novel approaches that push boundaries in data analytics.
  • Collaboration: Works effectively with researchers, engineers, and business stakeholders.
  • Technical Depth: Strong grasp of advanced algorithms and data engineering principles.
  • Problem Solving: Dives deep into logs, metrics, and code to identify opportunities for performance tuning and optimization.
  • Ownership: Drives research projects from concept to prototype to production.
  • Adaptability: Thrives in ambiguity and rapidly changing priorities.
  • Preferred: Certifications in AWS Big Data, Apache Spark, or similar technologies; experience in compliance or financial services domains.

Success Metrics

  • Research-to-Production Conversion: Percentage of validated research projects integrated into Tookitaki’s platform.
  • Model Performance Gains: Documented improvements in accuracy, speed, or robustness from research initiatives.
  • Efficiency of Research Pipelines: Reduced time from ideation to prototype completion.
  • Collaboration Impact: Positive feedback from cross-functional teams on research integration.

Benefits

  • Competitive Salary: Aligned with industry standards and experience.
  • Professional Development: Access to training in big data, cloud computing, and data integration tools.
  • Comprehensive Benefits: Health insurance and flexible working options.
  • Growth Opportunities: Career progression within Tookitaki’s rapidly expanding Services Delivery team.



Introducing Tookitaki: The Trust Layer for Financial Services

Tookitaki is transforming financial services by building a robust trust layer that focuses on two crucial pillars: preventing fraud to build consumer trust and combating money laundering to secure institutional trust. Our trust layer leverages collaborative intelligence and a federated AI approach, delivering powerful, AI-driven solutions for real-time fraud detection and AML (Anti-Money Laundering) compliance.

How We Build Trust: Our Unique Value Propositions

  1. AFC Ecosystem – Community-Driven Financial Crime Protection: The Anti-Financial Crime (AFC) Ecosystem is a community-driven platform that continuously updates financial crime patterns with real-time intelligence from industry experts. This enables our clients to stay ahead of the latest money laundering and fraud tactics. Leading digital banks and payment platforms rely on Tookitaki to protect them against evolving financial crime threats. By joining this ecosystem, institutions benefit from the collective intelligence of top industry players, ensuring robust protection.
  2. FinCense – End-to-End Compliance Platform: Our FinCense platform is a comprehensive compliance solution that covers all aspects of AML and fraud prevention, from name screening and customer due diligence (CDD) to transaction monitoring and fraud detection. This ensures financial institutions not only meet regulatory requirements but also mitigate risks of non-compliance, providing the peace of mind they need as they scale.


Industry Recognition and Global Impact

Tookitaki’s innovative approach has been recognized by some of the leading financial entities in Asia. We have also earned accolades from key industry bodies such as FATF and received prestigious awards, including World Economic Forum Technology Pioneer, Forbes Asia 100 to Watch, and Chartis RiskTech100.

Serving some of the world’s most prominent banks and fintech companies, Tookitaki is continuously redefining the standards of financial crime detection and prevention, creating a safer and more trustworthy financial ecosystem for everyone.



Research Data Manager

Kolkata, West Bengal Confidential

Posted today


Job Description

Mission

Drive the development of intuitive, secure, and scalable front-end systems that bridge complex industrial data streams with actionable insights. Collaborate with IT and data teams to create user-friendly digital interfaces, and architect cloud-based solutions that connect asset condition data (e.g., equipment scans with wear measurements before/after) with 3D scan analytics for predictive maintenance and measurable value to industrial customers.

This newly created role is key to internalizing capabilities currently outsourced to suppliers.

Calderys Group

Calderys is a leading global solution provider for industries operating in high-temperature conditions. The Group specializes in thermal protection for industrial equipment, offering a wide range of refractory products and advanced solutions to enhance steel casting, metallurgical fluxes, and molding processes.

As an international business with a presence in more than 30 countries and a strong footprint in the Americas through the brand HWI, a member of Calderys, we offer our employees a world of opportunity.

With a legacy of over 150 years, and an unwavering commitment to excellence, we continue to shape our future through teamwork, customer-centricity and a proactive mindset. We are the vital partner of all high temperature industries and our purpose places sustainability and innovation at the heart of our business. It reflects our reason for existing: to support our customers building a better world through sustainable solutions.

Our values are a driving force in this purpose: We are tenacious, accountable, multicultural and authentic.

In our company, performance is recognized and learning is promoted. Our services and solutions depend upon the expertise and commitment of our employees. So we ensure that they have the scope and opportunities to develop their potential within a diverse, inclusive and collaborative setting. It is an environment for people to grow, where every day is a new day and more exciting than the last.

Calderys - Forged in legacy. Fueled by excellence.

For more information, please visit Calderys.com


Skills Required
Predictive Maintenance

Research Data Engineer

Bengaluru, Karnataka Tookitaki

Posted today

Job Viewed

Tap Again To Close

Job Description

Position Overview

Job Title: Software Development Engineer 2

Department: Technology

Location: Bangalore, India

Reporting To: Senior Research Manager - Data



Position Purpos e

The Research Engineer – Data will play a pivotal role in advancing TookiTaki’s AI-driven compliance and financial crime prevention platforms through applied research, experimentation, and data innovation. This role is ideal for professionals who thrive at the intersection of research and engineering, turning cutting-edge data science concepts into production-ready capabilities that enhance TookiTaki’s competitive edge in fraud prevention, AML compliance, and data intelligence.

The role exists to bridge research and engineering by

  • Designing and executing experiments on large, complex datasets
  • Prototyping new data-driven algorithms for financial crime detection and compliance automation.
  • Collaborating across product, data science, and engineering teams to transition research outcomes into scalable, real-world solutions.
  • Ensuring the robustness, fairness, and explainability of AI models within TookiTaki’s compliance platform.


Key Responsibilities.

Applied Research & Prototyping.

  • Conduct literature reviews and competitive analysis to identify innovative approaches for data processing, analytics, and model developments.
  • Build experimental frameworks to test hypotheses using real-world financial datase
  • Prototype algorithms in areas such as anomaly detection, graph-based analytics, and natural language processing for compliance workflows.

Data Engineering for Research

  • Develop data ingestion, transformation, and exploration pipelines to support experimentation.
  • Work with structured, semi-structured, and unstructured datasets at scale.
  • Ensure reproducibility and traceability of experiments

Algorithm Evaluation & Optimization.

  • Evaluate research prototypes using statistical, ML, and domain-specific metrics.
  • Optimize algorithms for accuracy, latency, and scalability.
  • Conduct robustness, fairness, and bias evaluations on mode.

Collaboration & Integration

  • Partner with data scientists to transition validated research outcomes into production-ready to code.
  • Work closely with product managers to align research priorities with business goals
  • Collaborate with cloud engineering teams to deploy research pipelines in hybrid environments.

Documentation & Knowledge Sharing

  • Document experimental designs, results, and lessons learned
  • Share best practices across engineering and data science teams to accelerate innovation


Qualifications and Skills

EducationRequired:

  • Bachelor’s degree in Computer Science, Data Science, Applied Mathematics, or related field
  • Preferred: Master’s or PhD in Machine Learning, Data Engineering, or a related research intensive field

Experience

  • Minimum 4–7 years in data-centric engineering or applied research roles.
  • Proven track record of developing and validating algorithms for large-scale data processing or machine learning applications.
  • Experience in financial services, compliance, or fraud detection is a strong plus.


Technical Expertise.

  • Progra mming: Proficiency in Scala, Java, or Python
  • Data Proce ssing: Experience with Spark, Hadoop, and Flink
  • ML/Research Frame works: Hands-on with TensorFlow, PyTorch, or Scikit-learn
  • Data bases: Experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, ElasticSearch).
  • Cloud Plat forms: Experience with AWS (preferred) or GCP for research and data pipelines.
  • Tools: Familiarity with experiment tracking tools like MLflow or Weights & Biases.
  • Application Deploy ment: Strong experience with CI/CD practices, Containerized Deployments through Kubernetes, Docker Etc.
  • Streaming framew orks: Strong experience in creating highly performant and scalable real time streaming applications with Kafka at the core
  • Data Lakeh ouse: Experience with one of the modern data lakehouse platforms/formats such as Apache Hudi, Iceberg, Paimon is a very strong Plus.


Soft Skils

  • SkillsStrong analytical and problem-solving abilities.
  • Clear concise communication skills for cross-functional collaboration.
  • Adaptability in fast-paced, evolving environments.
  • Curiosity-driven with a bias towards experimentation and iteration.


Key Competencies

  • Innovation Mindset: Explores and tests novel approaches that push boundaries in data analytics.
  • Collaboration: Works effectively with researchers, engineers, and business stakeholders.
  • Technical Depth: Strong grasp of advanced algorithms and data engineering principles.
  • Problem Solving: Dives deep into logs, metrics, and code to identify opportunities for performance tuning and optimization.
  • Ownership: Drives research projects from concept to prototype to production.
  • Adaptability: Thrives in ambiguity and rapidly changing priorities.
  • Preferred: Certifications in AWS Big Data, Apache Spark, or similar technologies.
  • Preferred: Experience in compliance or financial services domains.

Success Metrics

  • Research-to-Production Conversion: Percentage of validated research projects integrated into Tookitaki’s platform.
  • Model Performance Gains: Documented improvements in accuracy, speed, or robustness from research initiatives.
  • Efficiency of Research Pipelines: Reduced time from ideation to prototype completion.
  • Collaboration Impact: Positive feedback from cross-functional teams on research integration.

Benefits

  • Competitive Salary: Aligned with industry standards and experience.
  • Professional Development: Access to training in big data, cloud computing, and data integration tools.
  • Comprehensive Benefits: Health insurance and flexible working options.
  • Growth Opportunities: Career progression within Tookitaki’s rapidly expanding Services Delivery team.



Introducing Tookitaki: The Trust Layer for Financial Services

Tookitaki is transforming financial services by building a robust trust layer that focuses on two crucial pillars: preventing fraud to build consumer trust and combating money laundering to secure institutional trust. Our trust layer leverages collaborative intelligence and a federated AI approach, delivering powerful, AI-driven solutions for real-time fraud detection and AML (Anti-Money Laundering) compliance.

How We Build Trust: Our Unique Value Propositions

  1. AFC Ecosystem – Community-Driven Financial Crime Protection
     The Anti-Financial Crime (AFC) Ecosystem is a community-driven platform that continuously updates financial crime patterns with real-time intelligence from industry experts. This enables our clients to stay ahead of the latest money laundering and fraud tactics. Leading digital banks and payment platforms rely on Tookitaki to protect them against evolving financial crime threats. By joining this ecosystem, institutions benefit from the collective intelligence of top industry players, ensuring robust protection.
  2. FinCense – End-to-End Compliance Platform
     Our FinCense platform is a comprehensive compliance solution that covers all aspects of AML and fraud prevention—from name screening and customer due diligence (CDD) to transaction monitoring and fraud detection. This ensures financial institutions not only meet regulatory requirements but also mitigate risks of non-compliance, providing the peace of mind they need as they scale.


Industry Recognition and Global Impact

Tookitaki’s innovative approach has been recognized by some of the leading financial entities in Asia. We have also earned accolades from key industry bodies such as FATF and received prestigious awards, including the World Economic Forum Technology Pioneer, Forbes Asia 100 to Watch, and Chartis RiskTech100.

Serving some of the world’s most prominent banks and fintech companies, Tookitaki is continuously redefining the standards of financial crime detection and prevention, creating a safer and more trustworthy financial ecosystem for everyone.



Senior Team Leader, Research & Data Analysis

Secunderabad, Telangana Confidential

Posted today


Job Description

Key Accountabilities and main responsibilities

Strategic Focus

  • Lead transformation efforts, audits, and business continuity projects.
  • Drive innovation, resilience, and adaptability within the team’s operations.
  • Partner with global stakeholders to align delivery with business goals and strategic direction.
  • Drive standardization and consistency in data collection methodologies across asset classes.
  • Support quality improvement through automation, workflow enhancements, and issue resolution frameworks.
  • Encourage cross-training to minimize risk and reduce dependency, building a more agile and capable team.

Operational Management

  • Review market announcements, trading volumes, and intelligence for in-depth analysis.
  • Analyse large datasets to identify trends, patterns, and anomalies in shareholder transactions and behaviours.
  • Oversee daily operations, including shift planning, floor-level support, and resource allocation.
  • Monitor productivity and quality for both the team and individual contributions.
  • Handle task prioritization, leave planning, and operational escalations.
  • Track and report daily, weekly, and monthly operational metrics and compliance updates.
  • Participate in calls with global teams and facilitate information flow between process owners and analysts.

People Leadership

  • Manage a team of 10+ members, ensuring timely delivery, process adherence, and consistent upskilling.
  • Conduct performance discussions, assign goals, and support career development paths.
  • Provide coaching and feedback based on individual needs and business priorities.
  • Encourage a high-morale, collaborative team culture through motivation and fair communication.
  • Organize knowledge-sharing sessions and support group/individual training needs.

Governance & Risk

  • Ensure adherence to internal controls, risk frameworks, and documentation protocols.
  • Maintain audit-ready trails of all data validation and review processes.
  • Identify operational risks and work with stakeholders to proactively mitigate them.
  • Promote data confidentiality and compliance with external regulatory requirements.

The above list of key accountabilities is not an exhaustive list and may change from time-to-time based on business needs.

Experience & Personal Attributes

  • Bachelor’s/Master’s degree in Finance, Business, Economics, or a related discipline.
  • 7+ years of research or data analysis experience, including 3+ years of team management.
  • Excellent analytical and problem-solving skills, with the ability to interpret complex data and provide actionable insights.
  • Hands-on experience in validating data from public domain sources such as filings, reports, and databases.
  • Strong knowledge of operational workflow tools, documentation practices, and audit readiness.
  • Excellent interpersonal, stakeholder management, and communication skills.
  • Proficient in Excel; knowledge of VBA or system workflows is an added advantage.
  • Ability to interpret data using tools such as Excel, Power BI, or Tableau.
  • CFA (any level) is a plus.
  • Detail-oriented, structured, and capable of working within dynamic and time-sensitive delivery environments.
  • Team player with the ability to lead by example and adapt to evolving business needs.

Skills Required
Excel, Power BI, Tableau, VBA, Audits, Strategic Direction

Senior Team Leader, Research & Data Analysis

Link Group

Posted today


Job Description

Overview

The Investor Relations team delivers high-quality reports to listed companies across multiple jurisdictions. The Team Leader plays a key role in ensuring the accuracy, quality, and timely delivery of shareholder analysis reports for a portfolio of clients. This role requires strong leadership to oversee and coordinate workloads within the team, ensuring that key deliverables meet stakeholder expectations. The Team Leader will collaborate closely with stakeholders across the Investor Relations, Client Relations, Business Development, Custodian, and IT functions to maintain seamless communication and alignment.

Key Accountabilities and main responsibilities

Strategic Focus
  • Lead transformation efforts, audits, and business continuity projects. 
  • Drive innovation, resilience, and adaptability within the team’s operations. 
  • Partner with global stakeholders to align delivery with business goals and strategic direction. 
  • Drive standardization and consistency in data collection methodologies across asset classes. 
  • Support quality improvement through automation, workflow enhancements, and issue resolution frameworks.
  • Encourage cross-training to minimize risk and reduce dependency, building a more agile and capable team.

Operational Management

  • Review market announcements, trading volumes, and intelligence for in-depth analysis.
  • Analyse large datasets to identify trends, patterns, and anomalies in shareholder transactions and behaviours.
  • Oversee daily operations, including shift planning, floor-level support, and resource allocation.
  • Monitor productivity and quality for both the team and individual contributions.
  • Handle task prioritization, leave planning, and operational escalations.
  • Track and report daily, weekly, and monthly operational metrics and compliance updates.
  • Participate in calls with global teams and facilitate information flow between process owners and analysts.

People Leadership

  • Manage a team of 10+ members, ensuring timely delivery, process adherence, and consistent upskilling.
  • Conduct performance discussions, assign goals, and support career development paths.
  • Provide coaching and feedback based on individual needs and business priorities.
  • Encourage a high-morale, collaborative team culture through motivation and fair communication.
  • Organize knowledge-sharing sessions and support group/individual training needs.

Governance & Risk

  • Ensure adherence to internal controls, risk frameworks, and documentation protocols.
  • Maintain audit-ready trails of all data validation and review processes.
  • Identify operational risks and work with stakeholders to proactively mitigate them.
  • Promote data confidentiality and compliance with external regulatory requirements.

The above list of key accountabilities is not an exhaustive list and may change from time-to-time based on business needs.

Experience & Personal Attributes

  • Bachelor’s/Master’s degree in Finance, Business, Economics, or a related discipline.
  • 7+ years of research or data analysis experience, including 3+ years of team management.
  • Excellent analytical and problem-solving skills, with the ability to interpret complex data and provide actionable insights.
  • Hands-on experience in validating data from public domain sources such as filings, reports, and databases.
  • Strong knowledge of operational workflow tools, documentation practices, and audit readiness.
  • Excellent interpersonal, stakeholder management, and communication skills.
  • Proficient in Excel; knowledge of VBA or system workflows is an added advantage.
  • Ability to interpret data using tools such as Excel, Power BI, or Tableau.
  • CFA (any level) is a plus.
  • Detail-oriented, structured, and capable of working within dynamic and time-sensitive delivery environments.
  • Team player with the ability to lead by example and adapt to evolving business needs.

Work Schedule & Environment

The role supports APAC or EMEA shifts on a rotational basis:

  • APAC Shift: Starts at 5:30 AM IST
  • EMEA Shift: Starts at 12:30 PM IST

Flexibility is required to meet tight deadlines and fluctuating business needs.

    Healthcare Research & Data Analyst

    Bengaluru, Karnataka Clarivate

    Posted today


    Job Description

    About Clarivate

Clarivate provides innovative data and analytical solutions to the largest biopharmaceutical and medical technology companies in the world. Clarivate’s Medtech Data team harnesses real-world healthcare data and identifies meaningful insights from large data and metadata sources to help medical device companies make some of their most important business decisions.

Who are you?

  • You are passionate about data and have at least 2 years of hands-on experience wrangling large datasets using SQL/Python.
  • You are also an effective communicator who can explain complex ideas using clear and concise language, including through written communication.
  • You are comfortable collaborating with a diverse group of internal colleagues, including subject matter experts, product managers, sales leaders, technical experts, and other client-facing analysts.
  • You are solution-oriented and understand the importance of timely execution while juggling multiple priorities. 
What will you do?

  • Understand the worldview and pain points of Clarivate customers, working closely with stakeholders as a problem-solver.
  • Effectively build, mine, and manage multiple datasets, which will be required for market modelling and analysis. 
  • Maintain existing Medtech product catalogs.
  • Research and understand novel device markets, including major competitors, uses, and product segmentation.
  • Evaluate data outputs for market trends and draw insights; correct any potential errors and issues.
  • Identify opportunities and issues for data analysis and experiments, with bias towards driving customer delight.
  • Work with clients to understand business requirements and provide data-driven insights.
  • Contribute your vision; influence the evolution of our products, data models, and data usage strategy.
What do you know?

  • You have strong quantitative foundations, as evidenced by your educational and professional background. 
  • You are more than proficient with Excel and SQL, and want to continue to improve your skills.
  • You have a background and/or interest in Life Sciences and are keen to learn a lot more about medical devices and supplies in the Latin American region.
  • You know how to deliver an effective presentation.
Requirements:

  • Expertise in understanding data variables and connecting the dots in various datasets.
  • Expertise in handling and manipulating large datasets.
  • Strong written and oral communication skills; must be able to communicate complex quantitative ideas to internal and external stakeholders.
  • Should have handled short- and long-term projects end to end.
Skills:

  • Expertise in MS SQL.
  • Strong MS Office and Excel skills are mandatory.
  • Analytical skills.
  • Ability to do secondary research and synthesize the findings.
  • Problem-solving ability.
Education:

  • Any graduate/post-graduate; B.E. or M.Sc. in the disciplines of Biotechnology, Medical Electronics, or Pharmacy preferred.
Preferred (Good to have) skills:

  • Understanding of Medtech data and healthcare in the Latin American countries.
  • Exposure to Medtech or claims/pharmacy data.
  • Experience working with data from data vendors.
  • Expertise in any one (or more) of the data analysis tools/languages such as R or Python is a plus.

Work Mode: Hybrid, Monday to Friday, 12:00 pm to 8:00 pm

    At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.


    Human Research Data Analyst

Gurugram, Haryana Keywords Studios

    Posted today


    Job Description

Keywords Studios, established in Dublin in 1998, now has 70+ studios across Europe, North and South America, and Asia, with 11,000 employees located across 5 continents and 23 countries. The company provides a complete outsourced game art, engineering, testing, audio, and localization service for all console, PC, handheld, and mobile content to many of the biggest names in games and interactive entertainment, working on thousands of titles, including many of the best-selling titles of the past few years.

    Keywords Studios is comprised of many individual brands, all with something unique to offer our clients. The studios are integrated into the Group by Service Line and use the operating systems and tools deployed by those services lines to ensure people and projects can operate across studios and across geographies.

    For more info please refer to

    Requirements

In this role, you will use your expertise in AI to contribute to the development of optimized AI solutions.

    Responsibilities:

Represent various clients’ software to provide enhanced AI solutions that boost their business productivity.

Create use-case scenarios derived from client solutions, utilizing tailored AI solutions.

Share your thoughts on opportunities for improvement.

Maintain and improve processes that support the creation of use cases.

    Attend meetings as appropriate.

    Independently identify operational inefficiencies and work to mitigate them.

    Assist with other duties as needed.

    Requirements:

Master’s degree in Cognitive Science, Computer Science, or another AI-related field

  • Technical aptitude or experience working with AI
  • Experience with Python, SQL, TypeScript (preferred)
  • Data modeling (preferred)
  • Experience working with and creating data visualizations
  • Strong attention to detail
  • Strong organization skills
  • Critical thinking and problem-solving skills
  • Strong analytical skills
  • Process improvement experience
  • Strong aptitude for working with Google Sheets, Zoom, and Slack
  • A proactive attitude, including a high level of accountability, transparency, and a teamwork-first mindset
  • Ability to learn on the job

    Role Information: IN

    Location: Asia Pacific

    Studio: Keywords India

    Area of Work: QA Testing Services

    Service: Globalize

    Employment Type: Full Time

    Working Pattern: Work from Office

    Benefits

    • Cab Facility within Hiring Zones
    • Medical Insurance, Term Insurance and Accidental Insurance
    • Lunch / Dinner provided at subsidized rates
