4,309 Research Data jobs in India
Research Data Engineer
Posted 2 days ago
Job Description
Position Overview
Job Title: Software Development Engineer 2
Department: Technology
Location: Bangalore, India
Reporting To: Senior Research Manager - Data
Position Purpose
The Research Engineer – Data will play a pivotal role in advancing TookiTaki’s AI-driven compliance and financial crime prevention platforms through applied research, experimentation, and data innovation. This role is ideal for professionals who thrive at the intersection of research and engineering, turning cutting-edge data science concepts into production-ready capabilities that enhance TookiTaki’s competitive edge in fraud prevention, AML compliance, and data intelligence.
The role exists to bridge research and engineering by:
- Designing and executing experiments on large, complex datasets.
- Prototyping new data-driven algorithms for financial crime detection and compliance automation.
- Collaborating across product, data science, and engineering teams to transition research outcomes into scalable, real-world solutions.
- Ensuring the robustness, fairness, and explainability of AI models within TookiTaki’s compliance platform.
Key Responsibilities
Applied Research & Prototyping
- Conduct literature reviews and competitive analysis to identify innovative approaches for data processing, analytics, and model development.
- Build experimental frameworks to test hypotheses using real-world financial datasets.
- Prototype algorithms in areas such as anomaly detection, graph-based analytics, and natural language processing for compliance workflows.
Data Engineering for Research
- Develop data ingestion, transformation, and exploration pipelines to support experimentation.
- Work with structured, semi-structured, and unstructured datasets at scale.
- Ensure reproducibility and traceability of experiments (a minimal pipeline sketch follows this list).
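To make the pipeline bullets above concrete, here is a minimal sketch of a reproducible ingestion and transformation job. It assumes PySpark is available; the bucket paths, column names, and run-manifest convention are hypothetical illustrations, not a description of TookiTaki's actual stack.

```python
# Minimal sketch of a reproducible research ingestion/transformation pipeline.
# Paths, schema, and the run-tagging convention are hypothetical placeholders.
import json
import time

from pyspark.sql import SparkSession, functions as F

RUN_ID = f"exp-{int(time.time())}"                         # simple run identifier for traceability
RAW_PATH = "s3://research-bucket/raw/transactions/"        # hypothetical input location
OUT_PATH = f"s3://research-bucket/experiments/{RUN_ID}/"   # per-run output prefix

spark = SparkSession.builder.appName("research-data-pipeline").getOrCreate()

# Ingest: read raw semi-structured data (JSON here; could be Parquet/CSV).
raw = spark.read.json(RAW_PATH)

# Transform: basic cleaning and feature derivation for experimentation.
features = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount_log", F.log1p(F.col("amount")))
       .withColumn("hour_of_day", F.hour(F.to_timestamp("event_time")))
)

# Persist the derived dataset plus a small manifest so the experiment can be
# traced back to its inputs later.
features.write.mode("overwrite").parquet(OUT_PATH + "features/")
manifest = {"run_id": RUN_ID, "input": RAW_PATH, "rows": features.count()}
with open(f"/tmp/{RUN_ID}-manifest.json", "w") as f:
    json.dump(manifest, f)
```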
Algorithm Evaluation & Optimization
- Evaluate research prototypes using statistical, ML, and domain-specific metrics.
- Optimize algorithms for accuracy, latency, and scalability.
- Conduct robustness, fairness, and bias evaluations on models (see the evaluation sketch below this list).
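A minimal sketch of what such an evaluation could look like, assuming scikit-learn and pandas; the segment column, labels, and chosen metrics are illustrative placeholders rather than TookiTaki's actual fairness methodology.

```python
# Group-wise evaluation of a prototype classifier: overall and per-segment metrics.
# Column names ("segment", "label", "pred", "score") are hypothetical.
import numpy as np
import pandas as pd
from sklearn.metrics import precision_score, recall_score, roc_auc_score

def evaluate_by_segment(df: pd.DataFrame) -> pd.DataFrame:
    """Compute precision/recall/AUC overall and for each customer segment."""
    rows = []
    for name, part in [("overall", df)] + list(df.groupby("segment")):
        rows.append({
            "segment": name,
            "precision": precision_score(part["label"], part["pred"]),
            "recall": recall_score(part["label"], part["pred"]),
            "auc": roc_auc_score(part["label"], part["score"]),
            "n": len(part),
        })
    return pd.DataFrame(rows)

# Synthetic data, just to show the expected frame layout.
rng = np.random.default_rng(0)
demo = pd.DataFrame({
    "segment": rng.choice(["retail", "corporate"], size=1000),
    "label": rng.integers(0, 2, size=1000),
    "score": rng.random(1000),
})
demo["pred"] = (demo["score"] > 0.5).astype(int)
print(evaluate_by_segment(demo))
```

Large gaps between the overall row and any segment row would be a signal to investigate robustness or bias further.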
Collaboration & Integration
- Partner with data scientists to transition validated research outcomes into production-ready code.
- Work closely with product managers to align research priorities with business goals.
- Collaborate with cloud engineering teams to deploy research pipelines in hybrid environments.
Documentation & Knowledge Sharing
- Document experimental designs, results, and lessons learned.
- Share best practices across engineering and data science teams to accelerate innovation.
Qualifications and Skills
Education
- Required: Bachelor’s degree in Computer Science, Data Science, Applied Mathematics, or a related field.
- Preferred: Master’s or PhD in Machine Learning, Data Engineering, or a related research-intensive field.
Experience
- Minimum 4–7 years in data-centric engineering or applied research roles.
- Proven track record of developing and validating algorithms for large-scale data processing or machine learning applications.
- Experience in financial services, compliance, or fraud detection is a strong plus.
Technical Expertise
- Programming: Proficiency in Scala, Java, or Python.
- Data Processing: Experience with Spark, Hadoop, and Flink.
- ML/Research Frameworks: Hands-on with TensorFlow, PyTorch, or Scikit-learn.
- Databases: Experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Elasticsearch).
- Cloud Platforms: Experience with AWS (preferred) or GCP for research and data pipelines.
- Tools: Familiarity with experiment tracking tools like MLflow or Weights & Biases (see the tracking sketch after this list).
- Application Deployment: Strong experience with CI/CD practices and containerized deployments using Kubernetes, Docker, etc.
- Streaming Frameworks: Strong experience creating highly performant, scalable real-time streaming applications with Kafka at the core.
- Data Lakehouse: Experience with a modern data lakehouse platform/format such as Apache Hudi, Iceberg, or Paimon is a very strong plus.
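As a deliberately simplified example of how the tooling above fits together, the sketch below tracks an anomaly-detection prototype with MLflow and scikit-learn. The experiment name, synthetic features, and thresholds are hypothetical; a real run would log the actual datasets, parameters, and evaluation metrics used.

```python
# Track an anomaly-detection prototype run with MLflow.
# Experiment name, features, and threshold are hypothetical placeholders.
import mlflow
import mlflow.sklearn
import numpy as np
from sklearn.ensemble import IsolationForest

mlflow.set_experiment("aml-anomaly-prototypes")          # hypothetical experiment name

X = np.random.default_rng(42).normal(size=(5000, 4))     # stand-in for real features

with mlflow.start_run(run_name="isolation-forest-baseline"):
    params = {"n_estimators": 200, "contamination": 0.01, "random_state": 42}
    mlflow.log_params(params)

    model = IsolationForest(**params).fit(X)
    scores = -model.score_samples(X)                     # higher score = more anomalous
    flagged = (scores > np.quantile(scores, 0.99)).mean()

    mlflow.log_metric("flagged_fraction", float(flagged))
    mlflow.sklearn.log_model(model, "model")
```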
Soft Skills
- Strong analytical and problem-solving abilities.
- Clear, concise communication skills for cross-functional collaboration.
- Adaptability in fast-paced, evolving environments.
- Curiosity-driven with a bias towards experimentation and iteration.
Key Competencies
- Innovation Mindset: Ability to explore and test novel approaches that push boundaries in data analytics.
- Collaboration: Works effectively with researchers, engineers, and business stakeholders.
- Technical Depth: Strong grasp of advanced algorithms and data engineering principles.
- Problem Solving: Dives deep into logs, metrics, and code to identify opportunities for performance tuning and optimization.
- Ownership: Drives research projects from concept to prototype to production.
- Adaptability: Thrives in ambiguity and rapidly changing priorities.
- Preferred: Certifications in AWS Big Data, Apache Spark, or similar technologies.
- Experience in compliance or financial services domains.
Success Metrics
- Research to Production Conversion: % of validated research projects integrated into TookiTaki’s platform.
- Model Performance Gains: Documented improvements in accuracy, speed, or robustness from research initiatives.
- Efficiency of Research Pipelines: Reduced time from ideation to prototype completion.
- Collaboration Impact: Positive feedback from cross-functional teams on research integration.
Benefits
- Competitive Salary: Aligned with industry standards and experience.
- Professional Development: Access to training in big data, cloud computing, and data integration tools.
- Comprehensive Benefits: Health insurance and flexible working options.
- Growth Opportunities: Career progression within Tookitaki’s rapidly expanding Services Delivery team.
Introducing Tookitaki
Tookitaki: The Trust Layer for Financial Services
Tookitaki is transforming financial services by building a robust trust layer that focuses on two crucial pillars: preventing fraud to build consumer trust and combating money laundering to secure institutional trust. Our trust layer leverages collaborative intelligence and a federated AI approach, delivering powerful, AI-driven solutions for real-time fraud detection and AML (Anti-Money Laundering) compliance.
How We Build Trust: Our Unique Value Propositions
- AFC Ecosystem – Community-Driven Financial Crime Protection: The Anti-Financial Crime (AFC) Ecosystem is a community-driven platform that continuously updates financial crime patterns with real-time intelligence from industry experts. This enables our clients to stay ahead of the latest money laundering and fraud tactics. Leading digital banks and payment platforms rely on Tookitaki to protect them against evolving financial crime threats. By joining this ecosystem, institutions benefit from the collective intelligence of top industry players, ensuring robust protection.
- FinCense – End-to-End Compliance Platform: Our FinCense platform is a comprehensive compliance solution that covers all aspects of AML and fraud prevention – from name screening and customer due diligence (CDD) to transaction monitoring and fraud detection. This ensures financial institutions not only meet regulatory requirements but also mitigate risks of non-compliance, providing the peace of mind they need as they scale.
Industry Recognition and Global Impact
Tookitaki’s innovative approach has been recognized by some of the leading financial entities in Asia. We have also earned accolades from key industry bodies such as FATF and received prestigious awards like the World Economic Forum Technology Pioneer, Forbes Asia 100 to Watch, and Chartis RiskTech100.
Serving some of the world’s most prominent banks and fintech companies, Tookitaki is continuously redefining the standards of financial crime detection and prevention, creating a safer and more trustworthy financial ecosystem for everyone.
Research Data Manager
Posted today
Job Description
Mission
Drive the development of intuitive, secure, and scalable front-end systems that bridge complex industrial data streams with actionable insights. Collaborate with IT and data teams to create user-friendly digital interfaces and architect cloud-based solutions that connect asset condition data (e.g. scans of equipment – wear measurement before/after) with 3D scan analytics for predictive maintenance and measurable value to industrial customers.
This newly created role is key to internalizing capabilities currently outsourced to suppliers.
Calderys Group
Calderys is a leading global solution provider for industries operating in high-temperature conditions. The Group specializes in thermal protection for industrial equipment with a wide range of refractory products, and advanced solutions to enhance steel casting, metallurgical fluxes, and molding processes.
As an international business with a presence in more than 30 countries and a strong footprint in the Americas through the brand HWI, a member of Calderys, we offer our employees a world of opportunity.
With a legacy of over 150 years, and an unwavering commitment to excellence, we continue to shape our future through teamwork, customer-centricity and a proactive mindset. We are the vital partner of all high temperature industries and our purpose places sustainability and innovation at the heart of our business. It reflects our reason for existing: to support our customers building a better world through sustainable solutions.
Our values are a driving force in this purpose: We are tenacious, accountable, multicultural and authentic.
In our company, performance is recognized and learning is promoted. Our services and solutions depend upon the expertise and commitment of our employees. So we ensure that they have the scope and opportunities to develop their potential within a diverse, inclusive and collaborative setting. It is an environment for people to grow, where every day is a new day and more exciting than the last.
Calderys - Forged in legacy. Fueled by excellence.
For more information, please visit Calderys.com
Skills Required
Predictive Maintenance
Senior Team Leader, Research & Data Analysis
Posted today
Job Description
Key Accountabilities and main responsibilities
Strategic Focus
- Lead transformation efforts, audits, and business continuity projects.
- Drive innovation, resilience, and adaptability within the team’s operations.
- Partner with global stakeholders to align delivery with business goals and strategic direction.
- Drive standardization and consistency in data collection methodologies across asset classes.
- Support quality improvement through automation, workflow enhancements, and issue resolution frameworks.
- Encourage cross-training to minimize risk and reduce dependency, building a more agile and capable team.
Operational Management
- Review market announcements, trading volumes, and intelligence for in-depth analysis.
- Analyse large datasets to identify trends, patterns, and anomalies in shareholder transactions and behaviours (a simple illustration follows this list).
- Oversee daily operations, including shift planning, floor-level support, and resource allocation.
- Monitor productivity and quality for both the team and individual contributions.
- Handle task prioritization, leave planning, and operational escalations.
- Track and report daily, weekly, and monthly operational metrics and compliance updates.
- Participate in calls with global teams and facilitate information flow between process owners and analysts.
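For illustration only: one simple way to surface unusual shareholder transactions is a per-holder z-score screen, sketched below in pandas. The column names and the 3-sigma threshold are hypothetical, and any real screen would follow the team's documented methodology and controls.

```python
# Flag shareholder transactions that deviate strongly from that holder's norm.
# Column names ("holder_id", "shares_traded") and the threshold are hypothetical.
import pandas as pd

def flag_outliers(df: pd.DataFrame, value_col: str = "shares_traded",
                  z_threshold: float = 3.0) -> pd.DataFrame:
    """Return rows whose traded volume is a z-score outlier for that holder."""
    stats = df.groupby("holder_id")[value_col].agg(["mean", "std"])
    merged = df.join(stats, on="holder_id")
    z = (merged[value_col] - merged["mean"]) / merged["std"].replace(0, float("nan"))
    return merged[z.abs() > z_threshold]

# Usage (assuming a frame with holder_id, trade_date, shares_traded columns):
# outliers = flag_outliers(transactions)
# outliers.to_csv("daily_outlier_report.csv", index=False)
```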
People Leadership
- Manage a team of 10+ members, ensuring timely delivery, process adherence, and consistent upskilling.
- Conduct performance discussions, assign goals, and support career development paths.
- Provide coaching and feedback based on individual needs and business priorities.
- Encourage a high-morale, collaborative team culture through motivation and fair communication.
- Organize knowledge-sharing sessions and support group/individual training needs.
Governance & Risk
- Ensure adherence to internal controls, risk frameworks, and documentation protocols.
- Maintain audit-ready trails of all data validation and review processes.
- Identify operational risks and work with stakeholders to proactively mitigate them.
- Promote data confidentiality and compliance with external regulatory requirements.
The above list of key accountabilities is not an exhaustive list and may change from time-to-time based on business needs.
Experience & Personal Attributes
- Bachelor’s/Master’s degree in Finance, Business, Economics, or a related discipline.
- 7+ years of research or data analysis experience, including 3+ years of team management.
- Excellent analytical and problem-solving skills, with the ability to interpret complex data and provide actionable insights.
- Hands-on experience in validating data from public domain sources such as filings, reports, and databases.
- Strong knowledge of operational workflow tools, documentation practices, and audit readiness.
- Excellent interpersonal, stakeholder management, and communication skills.
- Proficient in Excel; knowledge of VBA or system workflows is an added advantage.
- Ability to interpret data using tools like Excel, Power BI, or Tableau.
- CFA (any level) is a plus.
- Detail-oriented, structured, and capable of working within dynamic and time-sensitive delivery environments.
- Team player with the ability to lead by example and adapt to evolving business needs.
Skills Required
Excel, Power BI, Tableau, VBA, Audits, Strategic Direction
Healthcare Research & Data Analyst
Posted today
Job Description
About Clarivate
Who are you?
What will you do?
What do you know?
Requirements:
Skills:
Education:
Preferred (Good to have) skills:
Work Mode: Hybrid, Monday to Friday 12:00 pm to 8:00 pm
At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.
Human Research Data Analyst
Posted today
Job Description
Keywords Studios, established in Dublin in 1998, now has 70+ studios across Europe, North and South America, and Asia, with 11,000 employees located across 5 continents and 23 countries. The company provides a complete outsourced game art, engineering, testing, audio, and localization service for all console, PC, handheld, and mobile content to many of the biggest names in games and interactive entertainment, working on thousands of titles, including many of the best-selling titles of the past few years.
Keywords Studios is comprised of many individual brands, all with something unique to offer our clients. The studios are integrated into the Group by Service Line and use the operating systems and tools deployed by those service lines to ensure people and projects can operate across studios and across geographies.
For more info please refer to
Requirements
In this role, your responsibilities will involve using your expertise in AI to contribute to the development of optimized AI solutions.
Responsibilities:
Represent various clients’ software to provide enhanced AI solutions that boost their business productivity.
Create use-case scenarios derived from client solutions, utilizing tailored AI solutions.
Deliver your thoughts on opportunities for improvement.
Maintain and improve processes that support the creation of use cases.
Attend meetings as appropriate.
Independently identify operational inefficiencies and work to mitigate them.
Assist with other duties as needed.
Requirements:
- Master’s in Cognitive Science, Computer Science, or another degree associated with AI
- Technical aptitude or experience working with AI
- Experience with Python, SQL, TypeScript (preferred)
- Data modeling (preferred)
- Experience working with and creating Data Visualizations
- Strong attention to detail
- Strong organizational skills
- Critical thinking and problem-solving skills
- Strong Analytical skills
- Process Improvement experience
- Strong aptitude for working with Google Sheets, Zoom, and Slack
- Exemplify a proactive approach, which includes a high level of accountability, transparency, and teamwork first and foremost
- Ability to learn on the job
Role Information: IN
Location: Asia Pacific
Studio: Keywords India
Area of Work: QA Testing Services
Service: Globalize
Employment Type: Full Time
Working Pattern: Work from Office
Benefits
- Cab Facility within Hiring Zones
- Medical Insurance, Term Insurance and Accidental Insurance
- Lunch / Dinner provided at subsidized rates