RoboPatient - Robot assisted learning of constrained haptic information gain

Lead Research Organisation: Imperial College London
Department Name: Design Engineering (Dyson School)

Abstract

Primary assessment of a patient by a physician often involves physical examination to estimate the condition of internal organs in the abdomen. It is tacit knowledge among physicians that the skill to combine haptic cues with visual reaction feedback during physical examination of a patient is key to establishing trust and to improving the accuracy of a preliminary diagnosis. Clinical sessions in medical training involve either real patients or static rubber mannequins. Both methods obscure key information, such as how variations in the physician's finger stiffness affect stress levels at organ boundaries. Moreover, feedback given in ward demonstrations is often qualitative. The ambition of this project is to investigate how a functional robotic patient can establish an efficient link between an expert demonstrator and a trainee physician to develop palpation skills that the two human counterparts cannot establish using a real patient.

In this project, we introduce the first generation of a functional robotic patient with sensorised and controllable internal organs, together with a detailed finite element-based abdominal tissue model to visualise enhanced sensor data. The human-robot interaction will be enriched using augmented reality-based facial reactions during physical examination of the robotic patient. This will help trainee physicians to derive deeper insights from a tutor's demonstrations. A robotic patient would also allow us to quantify and reinforce established techniques, for example starting with distant regions to "calibrate" the peripheral sensory system, while learning what constitutes an optimal approach, such as pressing just deep enough at the right frequency. The experiential feedback loop will, therefore, allow us to quantify the improvement.

The main scientific outcome of this project will be a deeper understanding of how an expert physician could use a robot-assisted approach to allow a trainee physician to efficiently acquire the skill of manual palpation subject to visually perceived constraints. First trials will be done at the University of Surrey, with a subsequent introduction to other General Practitioner trainers and trainees through Royal College of General Practitioners workshops. Student and tutor feedback from pilot trials will be used to improve the robo-patient design in a user-centred co-design framework.

Planned Impact

Academic impact: Unconstrained haptic information gain during manual palpation often involves indentation and velocity control of the fingers, accompanied by joint stiffness regulation. The findings on constrained haptic information gain during physical examination of soft tissue will shed new light on how humans prioritise behavioural strategies to gain haptic perception. This will set research and teaching on the physical examination of patients on a new path. We will also contribute new technologies in sensorised and controllable soft robotic phantoms, soft robotic faces that express emotions across different culture and gender combinations, and augmented reality-based real-time data visualisation techniques. This project will also make new findings in human motor learning from demonstrations involving both external behavioural data, such as finger movement, tissue indentation, the examiner's gaze control, and the facial expressions of the patient, and internal variables, such as lower-arm muscle co-contraction. So far, learning from demonstration has largely been limited to external behaviours that can be directly observed. However, visualisation of internal variables can lead to behavioural innovations beyond merely mimicking the demonstration. Understanding the factors in a demonstration that lead to these different types of learning will introduce new possibilities for the clinical training of general practitioners and for other disciplines that can benefit from demonstration-based learning.

Other scientific engagements: In addition to the main areas of scientific contribution, we will bring together a diverse group of scientists in psychology, behavioural science, education theory, and soft robotics to explore questions about using robots to assist education, including the gender and culture implications of human-robot interaction, in broad application domains such as homes, cities, and medicine.

Industrial and clinical impact: There is an increasing demand for soft robotic simulators in clinical training, surgery, museums, tele-shopping, games, and aviation. The scientific and pedagogical contributions of this project, introducing new ways to improve constrained haptic information gain during manual palpation of soft tissue, will underpin a transformation in these application domains.

Social impact: The project will contribute to a broad social discussion on the role of soft robotics in medical education and in our daily lives by engaging students and robot enthusiasts through scientific exhibitions, activities of the Imperial Robotic Forum, related activities of the Centres for Doctoral Training at the three institutes, information on our websites, and meetings of the London Robotics Network. The main focus of these discussions will be the role of robots in reducing the use of patients as test subjects in medical education.

Impact on students and PDRAs: The postgraduate students associated with this project and the directly employed PDRAs will have the opportunity to improve their public engagement and teaching skills by leading workshops at major scientific events in soft robotics and human-robot interaction, and at general public engagement events. We have budgeted for their travel to major international conferences to promote their visibility and to facilitate networking. Building on our previous experience, we plan to integrate the technological innovations of this project into formal teaching practice in robotics, neuroscience, and general practitioner training.

Open access data: Following positive experiences of releasing open access data from previous EPSRC-funded projects, we will release anonymised physical examination behavioural data with proper guidance on using them for research purposes. This will engage a broad academic community interested in related research areas.

Publications

 
Description 1. We developed a robotic face that can express the same level of pain with different gender and ethnic appearances, and asked human participants to rate the pain level expressed by the robotic face. Our initial experimental results showed that the gender and ethnicity pairing between the patient and the examiner matters in how the visual feedback is interpreted. This finding has been the basis for us to further study whether the robotic patient can be used to train physicians to be impartial across patients of different gender and ethnic backgrounds.
2. We developed a hardware robotic patient with controllable and sensorised internal organs that can present different physiological conditions and monitor the physical examination behaviours of a trainee. Experiments show that it allows an examiner to examine the robotic patient with bare hands while receiving haptic feedback as well as visual feedback of the patient's pain expressions. We are now extending it to a mixed reality platform that can be scaled to allow participants to experience the patient remotely.
3. Experiments with human participants revealed the best order in which different regions of a robotic face, known as facial action units (AUs), must be triggered to produce the most accurate expression of pain.
4. Experiments with human participants showed that the perception of pain expressed by a robotic face depends on specific gender and culture interactions between the examiner and the robotic patient.
5. We developed a first-generation surrogate model of abdominal tissue using deep neural networks trained on finite element method simulations, as sketched below.
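To make the surrogate-modelling idea concrete, the following is a minimal illustrative sketch, not the project's actual code: a small feed-forward network is fitted to input-output pairs sampled from finite element simulations, so that tissue responses can be predicted fast enough for real-time visualisation. All array shapes, feature meanings, and hyperparameters here are assumptions.

```python
# Minimal illustrative sketch (not the project's code): a feed-forward
# surrogate that maps palpation inputs to FEM-predicted tissue responses.
# All shapes, feature meanings, and hyperparameters are assumptions.
import torch
import torch.nn as nn

# Hypothetical training pairs sampled from FEM runs:
# inputs [x, y, indentation_depth] -> outputs [surface_stress, reaction_force]
X = torch.rand(1000, 3)  # stand-in for sampled FEM input parameters
Y = torch.rand(1000, 2)  # stand-in for the corresponding FEM solutions

surrogate = nn.Sequential(
    nn.Linear(3, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 2),
)
optimiser = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(500):
    optimiser.zero_grad()
    loss = loss_fn(surrogate(X), Y)
    loss.backward()
    optimiser.step()

# The trained network evaluates in microseconds, whereas a full FEM solve
# can take seconds to minutes, which makes real-time visualisation viable.
with torch.no_grad():
    prediction = surrogate(torch.tensor([[0.5, 0.5, 0.2]]))
```

The appeal of this design is speed: once trained, the network replaces the expensive FEM solve inside the visualisation loop, at the cost of accuracy limited by how densely the simulation space was sampled.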
Exploitation Route The outcomes of this project will lead to a mixed reality patient model that can be used to teach anatomy and physical examination to medical students and physicians on an online platform. It will provide an immersive experience for learning physical examination skills by combining touch, vision, and auditory feedback in a physical examination scenario. We have already started trials with trainee physicians and medical students at the Radcliffe Hospital, Oxford University.
Sectors Digital/Communication/Information Technologies (including Software), Education, Healthcare

URL https://www.imperial.ac.uk/morph-lab/research/robopatient-project/
 
Description The robotic interventions have been tested with medical students. The results show that our intervention helps medical students to normalise their perceptual biases towards patients of different sex and culture backgrounds within about one hour of engagement with the robo-patient. The results have also been demonstrated at public events such as Imperial Lates.
First Year Of Impact 2022
Sector Education, Healthcare
Impact Types Cultural, Societal

 
Title Dataset for Conditioned haptic perception for 3D localization of nodules in soft tissue palpation with a variable stiffness probe 
Description These data complement the following publication: [1] N. Herzig, L. He, P. Maiolino, S-A Abad, and T. Nanayakkara, "Conditioned Haptic Perception for 3D Localization of Nodules in Soft Tissue Palpation with a Variable Stiffness Probe", PLoS One. DOI: 10.1371/journal.pone.0237379. These data support our research on a variable stiffness palpation probe and its control strategy to palpate and detect the location of stiff inclusions in soft tissues. The folder contains a ReadMe file and a binary Matlab file. For more details about the content of the binary file and the data structure, please read the ReadMe file.
Type Of Material Database/Collection of data 
Year Produced 2020 
Provided To Others? Yes  
URL https://figshare.shef.ac.uk/articles/dataset/Dataset_for_Conditioned_haptic_perception_for_3D_locali...
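For researchers working in Python rather than Matlab, a binary Matlab file like this one can typically be inspected with SciPy. The snippet below is illustrative only; the actual file and variable names are documented in the dataset's ReadMe, and "palpation_data.mat" is a hypothetical placeholder.

```python
# Illustrative only: inspecting the dataset's Matlab binary from Python.
# The real file and variable names are given in the dataset's ReadMe;
# "palpation_data.mat" is a hypothetical placeholder.
from scipy.io import loadmat

data = loadmat("palpation_data.mat")

# Keys beginning with "__" are file metadata added by scipy, not data.
for name, value in data.items():
    if not name.startswith("__"):
        print(name, getattr(value, "shape", type(value)))
```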
 