RoboPatient - Robot assisted learning of constrained haptic information gain

Lead Research Organisation: Imperial College London
Department Name: Design Engineering (Dyson School)

Abstract

Primary examination of a patient by a physician often involves physical palpation to estimate the condition of internal organs in the abdomen. It is tacit knowledge among physicians that the skill to combine haptic cues with visual reaction feedback during physical examination of a patient is key to establishing trust and improving the accuracy of preliminary diagnosis. Clinical sessions in medical training involve either real patients or static rubber mannequins. Both methods are opaque to some key information - such as how variation of the physician's finger stiffness affects stress levels at organ boundaries. Moreover, feedback given in ward demonstrations is often qualitative. The ambition of this project is to investigate how a functional robotic patient can establish an efficient link between an expert demonstrator and a trainee physician to develop palpation skills that the two human counterparts cannot establish using a real patient.

In this project, we introduce the first generation of a functional robotic patient with sensorised and controllable internal organs, together with a detailed finite element-based abdominal tissue model to visualise enhanced sensor data. The human-robot interaction will be enriched using augmented reality-based facial reactions during physical examination of the robotic patient. This will help trainee physicians derive deeper insights from a tutor's demonstrations. A robotic patient would also allow us to quantify and reinforce established techniques - for example, starting from distant regions to "calibrate" the peripheral sensory system - while identifying optimal approaches, such as pressing just hard enough at the right frequency. The experiential feedback loop will, therefore, allow us to quantify the improvement.

The main scientific outcome of this project will be a deeper understanding of how an expert physician could use a robot-assisted approach to allow a trainee physician to efficiently acquire the skill of manual palpation subject to visually perceived constraints. First trials will be done at the University of Surrey, with a subsequent introduction to other General Practitioner trainers and trainees through Royal College of General Practitioners workshops. Student and tutor feedback from pilot trials will be used to improve the robo-patient design in a user-centred co-design framework.

Planned Impact

Academic impact: Unconstrained haptic information gain during manual palpation often involves indentation and velocity control of the fingers accompanied by joint stiffness regulation. The findings on constrained haptic information gain during physical examination of soft tissue will shed new light on how humans prioritise behavioural strategies to gain haptic perception. This will take research and teaching on the physical examination of patients in a new direction. We will also contribute new technologies in sensorised and controllable soft robotic phantoms, soft robotic faces that express emotions across different culture and gender combinations, and augmented reality-based real-time data visualisation techniques. This project will also make new findings in human motor learning based on demonstrations involving both external behavioural data, such as finger movement, tissue indentation, the examiner's gaze control, and the facial expressions of the patient, as well as internal variables such as lower arm muscle co-contraction. So far, learning from demonstration has been largely limited to external behaviours that can be directly observed. However, visualisation of internal variables can lead to behavioural innovations beyond merely mimicking the demonstration. Understanding the factors in the demonstration that lead to such different types of learning will introduce new possibilities for clinical training of general practitioners and other disciplines that can benefit from demonstration-based learning.

Other scientific engagements: In addition to the main areas of scientific contributions, we will bring together a diverse group of scientists in psychology, behavioural science, education theory, as well as soft robotics to explore questions about using robots to assist education that involves gender and culture implications of human-robot interaction in broad application domains such as homes, cities, and medicine.

Industrial and clinical impact: There is an increasing demand for soft robotic simulators in clinical training, surgery, museums, tele-shopping, games, and aviation. The scientific and pedagogical contributions to introduce new ways to improve constrained haptic information gain during manual palpation of soft tissue will underpin a transformation in these application domains.

Social impact: The project will contribute to a broad social discussion on the role of soft robotics in medical education and in our daily lives by engaging students and robot enthusiasts in scientific exhibitions, activities of the Imperial Robotic Forum, related activities of the Centres for Doctoral Training in the three institutes, information on our websites, and meetings via the London Robotics Network. The main focus of these discussions will be the role of robots in reducing the use of patients as test subjects in medical education.

Impact on students and PDRAs: The postgraduate students associated with this project and the PDRAs who will be directly employed will have the opportunity to improve their public engagement and teaching skills by leading workshops at major scientific events in soft robotics and human-robot interaction, as well as general public engagement events. We have budgeted for their travel to major international conferences to promote their visibility and to facilitate networking. Following our previous experiences, we have planned to integrate technological innovations from this project into the formal teaching of robotics, neuroscience, and general practitioner training.

Open access data: Following positive experiences of releasing open access data based on previous EPSRC funded projects, we will release anonymised physical examination behavioural data with proper guides to use them for research purposes. This will engage a broad academic community interested in related research areas.

Publications


Tan X (2021) A Soft Pressure Sensor Skin to Predict Contact Pressure Limit Under Hand Orthosis in IEEE Transactions on Neural Systems and Rehabilitation Engineering

Nanayakkara T (2021) Robotics: Science and Systems (RSS) 2020 in The International Journal of Robotics Research

Lalitharatne TD (2022) Face mediated human-robot interaction for remote medical examination. in Scientific reports

Lalitharatne T (2021) MorphFace: A Hybrid Morphable Face for a Robopatient in IEEE Robotics and Automation Letters

Labazanova L (2023) Self-Reconfigurable Soft-Rigid Mobile Agent With Variable Stiffness and Adaptive Morphology in IEEE Robotics and Automation Letters

He L (2023) Robotic Simulators for Tissue Examination Training With Multimodal Sensory Feedback. in IEEE reviews in biomedical engineering

Hamid E (2021) A State-Dependent Damping Method to Reduce Collision Force and Its Variability in IEEE Robotics and Automation Letters

 
Description 1. We developed a robotic face that can show the same level of pain across different gender and ethnic appearances, and asked human participants to rate the pain level expressed by the robotic face. Our initial experimental results showed that the gender and ethnicity interaction between the patient and the examiner matters in the process of interpreting the visual feedback. This finding has been a basis for us to further study whether the robotic patient can be used to train physicians to be impartial across different gender and ethnic backgrounds of the patient.
2. We developed a hardware robotic patient that has controllable and sensorised internal organs to present different physiological conditions and monitor physical examination behaviours of a trainee. Experiments show that it allows an examiner to use bare hands to examine the robotic patient while receiving haptic as well as visual feedback of pain expressions from the patient. We are now in the process of extending it to a mixed reality platform that can be scaled to allow participants to experience the patient remotely.
3. Experiments with human participants revealed the best order in which different regions of a robotic face, known as facial action units (AUs), must be triggered to produce the most accurate expression of pain.
4. Experiments with human participants helped to identify that the perception of pain expressed in a robotic face depends on specific gender and culture interactions between the examiner and the robotic patient.
5. We developed a first-generation surrogate model of abdominal tissue using deep neural networks trained on finite element method simulations.
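The surrogate-modelling idea above can be sketched in miniature: fit a small neural network to input-output samples that would, in practice, come from finite element runs. The following is a minimal, hypothetical illustration only - the input features (indentation depth, local tissue stiffness), the stand-in target function, and the network size are all assumptions, not the project's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: indentation depth and local tissue stiffness, scaled to [0, 1].
X = rng.uniform(0.0, 1.0, size=(500, 2))
# A simple nonlinear function stands in for the FEM solver output (reaction force);
# in practice these targets would be pre-computed finite element simulation results.
y = (X[:, 0] ** 2 * (1.0 + X[:, 1]))[:, None]

# One-hidden-layer MLP trained by full-batch gradient descent on mean squared error.
n_hidden = 16
W1 = rng.normal(0.0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)       # hidden activations
    pred = h @ W2 + b2             # surrogate prediction
    err = pred - y                 # residual against the stand-in FEM targets
    # Backpropagate the MSE gradient through the two layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.5f}")
```

Once trained, such a surrogate returns an approximate tissue response in microseconds rather than re-running a finite element solve, which is what makes real-time visualisation during palpation feasible.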
Exploitation Route The outcomes of this project will lead to a mixed reality patient model that can be used to teach anatomy and physical examination to medical students and physicians on an online platform. It will provide an immersive experience for learning physical examination skills by combining touch, vision, and auditory feedback in a physical examination scenario. We have already started trials with trainee physicians and medical students at the Radcliffe Hospital, Oxford University.
Sectors Digital/Communication/Information Technologies (including Software)

Education

Healthcare

URL https://www.imperial.ac.uk/morph-lab/research/robopatient-project/
 
Description The robotic interventions have been tested on medical students. The results show that our intervention helps medical students to normalise their perceptual biases towards patients of different sex and cultural backgrounds within about one hour of engagement with the robo-patient. The results have also been demonstrated at public events such as Imperial Lates. I secured a UKRI impact acceleration award (EP/X52556X/1) titled "A portable soft haptic mouse (PSHM) to enable tactile interaction with digital objects" to launch a start-up company.
First Year Of Impact 2023
Sector Education,Healthcare
Impact Types Cultural

Societal

Economic

 
Description Written evidence with my Views on a New UK Funding Agency for Science and Technology - ARIA
Geographic Reach National 
Policy Influence Type Contribution to a national consultation/review
Impact It was directed at the launch of ARIA
URL https://committees.parliament.uk/writtenevidence/6850/pdf/
 
Title Dataset for Conditioned haptic perception for 3D localization of nodules in soft tissue palpation with a variable stiffness probe 
Description These data complement the following publication: [1] N. Herzig, L. He, P. Maiolino, S-A Abad, and T. Nanayakkara, Conditioned Haptic Perception for 3D Localization of Nodules in Soft Tissue Palpation with a Variable Stiffness Probe. PLoS One. DOI: 10.1371/journal.pone.0237379. These data support our research on a variable stiffness palpation probe and its control strategy to palpate and detect the location of stiff inclusions in soft tissues. The folder contains a ReadMe file and a binary Matlab file. For more details about the content of the binary file and the data structure, please read the ReadMe file. 
Type Of Material Database/Collection of data 
Year Produced 2020 
Provided To Others? Yes  
URL https://figshare.shef.ac.uk/articles/dataset/Dataset_for_Conditioned_haptic_perception_for_3D_locali...
 
Description Collaboration with University of Oxford on AI based virtual patients 
Organisation University of Oxford
Department Oxford Hub
Country United Kingdom 
Sector Academic/University 
PI Contribution I coordinated a new EU grant proposal with partners from Oxford.
Collaborator Contribution Partners from the Institute of Biomedical Engineering, University of Oxford, contributed to a work package on AI based virtual patients as part of a new robotic patient to train medical students on physical examination.
Impact Submitted an EU grant proposal for review.
Start Year 2023
 
Description Collaboration with the medical partners at the Hammersmith Hospital 
Organisation Hammersmith Hospital
Country United Kingdom 
Sector Hospitals 
PI Contribution We provided the new robotic patient to be tested on medical students, and conducted experiments along with clinical teachers.
Collaborator Contribution The clinical collaborators at the Hammersmith hospital helped to recruit medical students for human participant experiments, and gave valuable clinical insights to test the new robotic patient to train physical examination skills in medical students. They also partnered to write a new EU grant proposal.
Impact We have submitted a new EU grant proposal with 2 more EU partners. The clinical collaborators from Hammersmith hospital made valuable contributions to write a work package.
Start Year 2021
 
Description Collaboration with the medical simulation centre, Faculty of Medicine of Masaryk University 
Organisation Masaryk University
Country Czech Republic 
Sector Academic/University 
PI Contribution I coordinated an EU grant proposal with partners from the Faculty of Medicine of Masaryk University
Collaborator Contribution The partners from Faculty of Medicine of Masaryk University invited me for a visit to see the new medical simulation centre and made valuable contributions to a work package in a new EU grant proposal.
Impact A new EU grant proposal submitted for review in February 2024.
Start Year 2023
 
Description A keynote speech to the staffers of honourable MPs in the Sri Lankan parliament on "broad trends in AI and robotics" 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Policymakers/politicians
Results and Impact I was invited to give a keynote on "broad trends in AI and robotics - how to prepare to benefit from the economic wave" to the senior staffers of the honourable MPs of the Sri Lankan parliament. I also published the speech on YouTube and LinkedIn.
Year(s) Of Engagement Activity 2024
URL https://thrishantha.blogspot.com/2024/02/a-formula-to-ignite-youth-tech-start.html
 
Description Boston Action Club, Invited talk titled A soft robotics approach to understand biological secrets of complex interactions 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact This audience was from robotics groups at Northeastern University, Harvard, MIT, and Boston University. In my talk, I focused on new results showing that a robotics approach can be taken to understand how two types of control commands - kinematic tuning commands and force/motion control commands - should work concurrently to make robots efficient in natural environments. I presented the results explained in the upcoming Springer Nature Handbook on Soft Robotics.
Year(s) Of Engagement Activity 2023
URL https://actionlab.sites.northeastern.edu/action-club/
 
Description Chaired a keynote session on Morphological Computation at IEEE International Conference on Robotics and Automation (ICRA 2023) 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact I chaired a keynote session on Morphological Computation at ICRA23, the world's largest international conference in robotics. I mentioned in my LinkedIn post that "The idea that physical structures can to a certain extent interact with their environment on their own without sensors or computers has been a fascinating area for me. Theo Jansen's walking sculptures on beaches is one example. The wind and the beach is all what the sculpture needs to walk like an animal. How Jamie Paik's modular origami inspired robots can scale up to exhibit complex behaviours was mind boggling. Fumiya Iida's robots printing their own extensions such as spoons to perform tasks better was fascinating."
Year(s) Of Engagement Activity 2023
URL https://www.linkedin.com/posts/thrishantha_i-enjoyed-chairing-the-morphological-computation-activity...
 
Description Harvard University, Invited talk titled A soft robotics approach to understand biological secrets of complex interactions 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Harvard University, Invited talk titled A soft robotics approach to understand biological secrets of complex interactions. This talk was given to roboticists at the John Paulson School of Engineering and Applied Sciences. There were special invitees from Harvard Kennedy School of Government too.
Year(s) Of Engagement Activity 2023
 
Description IEEE IAS GlobConET 2023, Keynote speech titled A robotics approach to understand how the brain tunes the body to solve dynamic interaction problems 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact IEEE IAS GlobConET 2023, Keynote speech titled "A robotics approach to understand how the brain tunes the body to solve dynamic interaction problems". This was based on results from human participant experiments in the MOTION and robo-patient projects.
Year(s) Of Engagement Activity 2023
URL https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10149910