Pain rehabilitation: E/Motion-based automated coaching

Lead Research Organisation: University College London
Department Name: UCL Interaction Centre

Abstract

Almost 1 in 7 UK citizens suffer from chronic pain, much of which is mechanical lower back pain with no treatable pathology. Pain management rehabilitative programmes suffer from two shortcomings: (a) there are too few resources in the health care system to treat all patients face-to-face; (b) current approaches fail to integrate treatment of interrelated physiological and psychological factors. Combining expertise from engineering, clinical health sciences, and industry through a multidisciplinary team of investigators and advisors, this proposal seeks to address both shortcomings by (a) developing a set of methods for automatically recognising audiovisual cues associated with pain, behavioural patterns typical of pain, and affective states influencing pain and activity, and (b) integrating these methods into an interactive computer system that will provide appropriate feedback and prompts to the patient based on his/her behaviour measured during self-directed exercise and fitness-building sessions. This intelligent system will enable and motivate patients to continue their progress in extending activity inside (and in the longer term, outside) the clinical environment and thus will facilitate their reintegration into social and working life. In doing so, the project aims to make major contributions in a number of research areas. First, the project will significantly expand the state of the art in the field of emotion recognition by extending current methods in affective body gesture recognition, facial expression recognition and affective vocalisation recognition to deal with naturalistic rather than acted emotional expressions.
This entails theoretical and practical contributions to important challenges such as detection and tracking of human behavioural cues in real-world unconstrained environments, spatiotemporal analysis of complex dynamic stimuli such as non-linguistic vocalisations, facial expressions, and body movements, and spatiotemporal fusion of multimodal data streams in which constraints of synchronisation are relaxed. Second, the project will advance our understanding of how affect and pain-related moods, in particular fear, depression, and frustration, interact with motor behaviour not only to modify pain expressions but also to produce guarded movements that can exacerbate pain or worsen disability. This, in turn, will contribute to the body of work on cognitive behavioural models of pain to improve the diagnosis and management of pain behaviours. Finally, the project will contribute a novel user-centred approach to developing patients' understanding of their body movement during self-directed therapy; it will also identify what type of feedback best promotes engagement, encourages persistence, and offsets the negative effects of pain-related anxiety, frustration, and low mood. Patients, clinicians and members of the advisory team will be periodically involved in the design of the interface and the testing of the system through design workshops. The system will eventually be made available for long-term testing to the Pain Management Centre of the National Hospital for Neurology and Neurosurgery, London.
 
Description 1) A multifaceted, first-of-its-kind dataset has been acquired and compiled to underpin the design of much-needed affect-aware physical rehabilitation systems for chronic lower back pain. The corpus contains facial expressions (multiple high-definition video cameras), voice, full-body movement data (17 gyros), and muscle activity (from EMG), gathered from people with chronic pain and from healthy participants during physical rehabilitation activity. This corpus will be made available to the research community (upon request) as soon as full clearance is obtained (Aung et al., 2016).

2) Automatic recognition models of perceived pain levels (Olugbade et al., 2014, 2015) and pain-related behaviour (Aung et al., 2013, 2014, 2016) have been proposed and tested on the above dataset using body movement data and muscle activity data. Such algorithms can be used to personalize the run-time support that full-body sensing technology can provide to people with chronic pain during physical rehabilitation. The work has been extended to detect self-efficacy in movement (Olugbade et al., 2018).
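For illustration only (this is not the published Olugbade/Aung models; the feature set, centroid values and the nearest-centroid classifier below are hypothetical), a minimal pipeline of this kind extracts simple per-trial kinematic and muscle-activity features and assigns one of three perceived pain levels:

```python
import math

def extract_features(joint_angles, emg):
    """Compute simple per-trial features from a joint-angle series (degrees)
    and an EMG envelope series (arbitrary units). Hypothetical feature set."""
    rom = max(joint_angles) - min(joint_angles)              # range of motion
    steps = [abs(b - a) for a, b in zip(joint_angles, joint_angles[1:])]
    mean_speed = sum(steps) / len(steps)                     # mean angular speed
    mean_emg = sum(emg) / len(emg)                           # mean muscle activity
    return [rom, mean_speed, mean_emg]

def nearest_centroid_predict(x, centroids):
    """Assign a trial to the closest class centroid (low/medium/high pain)."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(centroids, key=lambda label: dist(x, centroids[label]))

# Hypothetical centroids learned from labelled trials: [ROM, speed, EMG].
centroids = {"low": [60.0, 2.0, 0.2], "medium": [40.0, 1.2, 0.5], "high": [20.0, 0.5, 0.9]}
trial = extract_features([0, 5, 12, 18, 22, 25], [0.8, 0.9, 1.0, 0.9])
print(nearest_centroid_predict(trial, centroids))  # restricted, tense movement -> "high"
```

In this sketch the guarded pattern described in the findings (reduced range of motion, slower movement, elevated muscle tension) lands in the "high" class; the published models use much richer features and learned classifiers.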

3) New algorithms (Romera-Paredes et al., 2013-2014) based on the multi-task learning paradigm have been proposed; they address the person-specific effects (idiosyncrasy) typical of datasets containing data from various subjects, each expressing emotional states in their own way. This is particularly important when dealing with limited samples from each user (e.g., a new user of the system).
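As a sketch of the general idea (a simple mean-regularised multi-task learner, not the specific algorithms of Romera-Paredes et al.), each person can be given their own linear model whose weights are shrunk toward a shared group model, so a new user with few samples inherits the group behaviour:

```python
import numpy as np

def mtl_fit(tasks, lam=1.0, iters=20):
    """Fit one linear model per person (task), regularising each weight
    vector toward a shared mean w0: min_w ||Xw - y||^2 + lam*||w - w0||^2.
    tasks: list of (X, y) pairs, one per person."""
    d = tasks[0][0].shape[1]
    w0 = np.zeros(d)
    ws = [np.zeros(d) for _ in tasks]
    for _ in range(iters):
        for t, (X, y) in enumerate(tasks):
            A = X.T @ X + lam * np.eye(d)        # ridge system per person
            ws[t] = np.linalg.solve(A, X.T @ y + lam * w0)
        w0 = np.mean(ws, axis=0)                 # re-estimate the shared model
    return w0, ws

# Two people who happen to share the same underlying mapping: both personal
# models and the shared model converge to the common weights.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = X @ np.array([2.0, 3.0])
w0, ws = mtl_fit([(X, y), (X, y)])
```

The regulariser weight `lam` controls how strongly an individual model is pulled toward the group: high for a new user with little data, lower as person-specific data accumulates.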

4) An extended set of qualitative studies with people with chronic pain (at the Pain Management Centre at the National Hospital for Neurology and Neurosurgery (NHNN) and in their homes) and with physiotherapists has led to a better understanding of people's needs and of strategies to facilitate physical activity despite pain (Singh et al., 2014). These findings have led us to reconsider the role of technology in chronic pain physical rehabilitation. We have proposed that, whereas in acute and induced pain technology is used to provide fun and draw attention away from pain, in chronic pain technology needs first and foremost to help people overcome their psychological barriers. We have proposed new personalized sonification mechanisms to help people overcome the psychological barriers they meet while engaging in physical activity when pain is a constant threat. Sonification is here designed as a way to increase patients' awareness of movement (devoid of pain) and of related processes (breathing), rather than simply to correct movement; the aim is to increase confidence and self-efficacy. Through body movement and sound, patients can design a calibrated aural space within which they can exercise and explore their body. Results showed increased awareness and confidence in movement and increased perceived self-efficacy (Singh et al., 2015).
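As a toy illustration of the calibration idea (the mapping and parameters are hypothetical, not the Singh et al. design), a joint angle can be mapped into the patient's self-calibrated comfort range and rendered as pitch, so the sound tracks progress through that range rather than "correct" form:

```python
def sonify(angle, calib_min, calib_max, f_low=220.0, f_high=880.0):
    """Map a joint angle (degrees) into the patient's own calibrated range
    and return a tone frequency (Hz). Angles outside the range clamp, so
    the sound never 'punishes' the patient for stopping short."""
    t = (angle - calib_min) / (calib_max - calib_min)   # 0..1 within the range
    t = min(1.0, max(0.0, t))
    # Exponential interpolation: equal movement steps sound like equal
    # musical intervals (two octaves across the calibrated range).
    return f_low * (f_high / f_low) ** t

print(sonify(30.0, 0.0, 60.0))  # midway through the range -> 440.0 Hz
```

Because `calib_min` and `calib_max` are set per patient (and can be re-calibrated as confidence grows), the same movement produces a personal aural space rather than a fixed target.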

5) We have designed three platforms:

(a) a full-body sensing (motion capture) platform for an in-depth study of movement in pain;
(b) a Kinect-based application (in collaboration with the University of Genoa) for coarser detection of body movement; and
(c) a wearable smartphone-based device with a minimal number of sensors (Singh et al., 2014), which won 2nd prize at the Digital Health Prize (judged the most innovative app; the 1st prize went to ready-to-deploy apps).

All three platforms integrate movement and breathing sensors (designed by my team), with the first two also including EMG sensors. A key aim of the smartphone-based device is to move physical rehabilitation beyond discrete physical activity sessions toward daily-life functioning by facilitating awareness and management of the body's physical and emotional resources. We are currently extending the wearable device with a larger set of movement sensors and EMG to facilitate not only exercising but also everyday functioning.

6) We investigated the use of the designed technology to support not only exercise but also functional activity in the home. A qualitative study with people with chronic pain over 10 days shows how the sonification and calibration framework designed for exercise indeed helps people develop strategies to facilitate everyday activity and apply self-management strategies to function in their social role. The results were accepted for publication at CHI'17.

7) We developed a low-cost, reduced-sensor-network wearable prototype for pain, confidence and anxiety detection (based on body movement and muscle activity sensors) in exercise and functional activity. The prototype is used to study automatic detection in everyday life. The work is under review at the IEEE Transactions on Affective Computing journal.
Exploitation Route 1. Our dataset will be released (as soon as full clearance is obtained) to foster new research in pain behaviour detection from facial expression, body motion, EMG and voice. It is currently open only to collaborators.
We are also making aspects of our software available to others. Ethical approval has been obtained to open the sonification app to the public. This will be our next step.

2. We are also applying for more funding to bring the current wearable closer to commercialization.

3. The sonification framework is being tested in a novel context: facilitating body movement control in children with autism (a collaboration with CICESE, Mexico).

4. The work has informed the design of affect-aware technology for stroke.
Sectors Digital/Communication/Information Technologies (including Software),Healthcare

URL http://www.emo-pain.ac.uk
 
Description - Clinicians specialized in conditions other than musculoskeletal chronic pain have seen the potential for adopting the technology to treat those conditions. A very preliminary assessment of the use of the Go-with-the-Flow device has been carried out with people with CRPS at the Royal National Orthopaedic Hospital, with positive feedback from patients. We are discussing applying for a grant to extend the design of the device to this population.
- Following their involvement in our studies, clinicians supporting our work have published a journal paper: Denneny, D., Frijdal, A., Berthouze, N., Greenwod, J., McLoughlin, R., Petersen, K., . . . Williams, A. (2019). The application of Psychologically Informed Practice: Observations of experienced physiotherapists working with people with chronic pain. Australian Journal of Physiotherapy. doi:10.1016/j.physio.2019.01.014
- Various companies have shown interest in integrating the device with their software for chronic pain.
- The Go-with-the-Flow device was also used during workshops with the theatre producer Rachel Bagshaw during the creation of a theatrical piece on chronic pain, "Where we are, where we are going" (National Theatre Studio). The aim of the piece is to facilitate understanding of what chronic pain is and what it means to live with chronic pain.
- The sonification framework is currently being tested and extended in studies helping children with autism learn and control their body movement (in collaboration with CICESE, Mexico).
First Year Of Impact 2015
Sector Digital/Communication/Information Technologies (including Software),Healthcare
Impact Types Societal,Economic

 
Description EnTimeMent - ENtrainment and synchronization at multiple TIME scales in the MENTal foundations of expressive gesture
Amount € 4,188,886 (EUR)
Funding ID GA824160 
Organisation European Commission H2020 
Sector Public
Country Belgium
Start 01/2019 
End 12/2022
 
Description HUMAN: FOF-04-2016 - Continuous adaptation of work environments with changing levels of automation in evolving production systems - as coI
Amount € 399,135,495 (EUR)
Funding ID 723737 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 10/2016 
End 09/2019
 
Description RosetreesTrust
Amount £16,000 (GBP)
Organisation Rosetrees Trust 
Sector Charity/Non Profit
Country United Kingdom
Start 04/2014 
End 03/2016
 
Description UCL Enterprise Scholarship
Amount £4,000 (GBP)
Organisation University College London 
Department UCL Advances
Sector Academic/University
Country United Kingdom
Start 10/2015 
End 12/2015
 
Description Ubi-Health Project funded by EC Marie Curie IRSES Program
Amount € 814,800 (EUR)
Organisation European Commission 
Department Seventh Framework Programme (FP7)
Sector Public
Country European Union (EU)
Start 09/2014 
End 08/2016
 
Title Emo-Pain Corpus 
Description Dataset of chronic pain and healthy subjects undergoing typical movement exercises with pain related behaviour labels. Four synchronised sensing modalities: motion capture, EMG, facial video and acoustics. 
Type Of Material Database/Collection of data 
Provided To Others? No  
Impact This multifaceted dataset will serve to foster research in chronic pain-related behaviour, mostly within the affective computing and machine learning communities. Once full clearance is obtained, the dataset will be opened to the full research community. 
 
Title Automatic detection of pain levels from body movement and muscle activity in physical rehabilitation 
Description The software extracts kinematic and muscle activity features from data captured by movement and EMG sensors and predicts pain level (3 levels) during physical rehabilitation exercise. It can be used to provide personalized support to people with chronic pain while they are engaged in physical rehabilitation. The features and models are described in Olugbade et al. (2014, 2015). The work has been extended to detect self-efficacy in movement (Olugbade et al., 2018). We are currently refining it to process information from a wearable device based on cheaper and more readily available sensors, to be deployed in the home (see Olugbade et al., 2018). 
Type Of Technology Software 
Year Produced 2015 
Impact Not yet used in the domain but led to interest from industry and connection with other research groups to further advance it. 
 
Title Automatic recognition of pain-related behaviour during physical rehabilitation 
Description The software extracts kinematic and muscle activity features from data captured by movement and EMG sensors and detects chronic pain-related behaviour during a set of specific physical rehabilitation exercises. It can be used to personalize the support needed by the patient during physical rehabilitation. The features and models are described in the related publications (Aung et al., 2013-2016) at www.emo-pain.ac.uk. We are now improving the software to use cheaper and more easily available sensors. This version will be open-source. 
Type Of Technology Software 
Year Produced 2014 
Impact The software is not yet used in the domain but it has led to new interest from companies and other research groups. 
 
Title Go-with-the-flow 
Description Go-with-the-Flow is a wearable smartphone app that senses movement and related physiological processes (e.g., breathing) and transforms these signals into sound. Smartphone sensors track movement and Arduino-based respiration chest belts detect breathing patterns related to anxiety. It was developed as part of my PhD on the Emotion and Pain project. The device was designed from user studies with people with chronic pain and with physiotherapists and psychologists (Singh et al., 2014) and validated through controlled studies and home studies. The device is described in the related publications (Singh et al., 2015). 
Type Of Technology Webtool/Application 
Year Produced 2014 
Impact Go-with-the-Flow was hailed as the most innovative app at the Festival of Digital Health, 2014, where we won the Runners-Up prize. The event celebrated the cutting edge of health apps in the fields of gamification and self-tracking and included a live showcase of apps and games demonstrating innovation from UCL academics and SMEs. More details: http://www.fdh.ucl.ac.uk/event/gamification-self-tracking-health-wellbeing/ Subsequently I also won the People's Choice award for presenting our work at the Grace Hopper London Colloquium 2015: http://academy.bcs.org/content/london-hopper-colloquium. We were also finalists at the Social Innovators Challenge, organised and hosted by Healthbox, UCLB and Numbers4Good, in collaboration with Janssen Healthcare Innovation and Trafford Housing Trust, with support from the Cabinet Office: http://www.healthsocialinnovators.org/ 
 
Title Software publicly available from my emopain papers 
Description Software from my EmoPain papers is now publicly available at https://github.com/bernard24/ConvexTensor, particularly that related to the paper "A New Convex Relaxation for Tensor Completion". 
Type Of Technology Software 
Year Produced 2013 
Open Source License? Yes  
Impact Other researchers have built on this to implement their own approaches, and to compare to this one. 
URL https://github.com/bernard24/ConvexTensor
 
Description Big Bang Fair 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Schools
Results and Impact The fairs are part of a UK-wide programme led by EngineeringUK to bring science and engineering to life for young people.
Year(s) Of Engagement Activity 2015
 
Description Bright Club stand up comedy set 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact The set was done to engage audiences with our research through stand up comedy.
Year(s) Of Engagement Activity 2012
 
Description British HCI Conference Demo 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact The Go-with-the-flow app was selected to be presented at the British HCI Conference with attendees from academia and industry.
Year(s) Of Engagement Activity 2015
 
Description Keynote at Storytelling & Space in the Virtual Age 2017 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Keynote on the use of wearable sensing technology to support physical rehabilitation.
Year(s) Of Engagement Activity 2017
URL http://digitaltag.zhdk.ch/en/conference/program/
 
Description Keynote on pain and technology at the SMART Summer School on Computational Social and Behavioural Science, Paris, France 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Keynote at a summer school. It led students to consider the body (and not just the face) as a modality for affect recognition.
Received an invitation to give a keynote at Pain Face Day 2017.
Year(s) Of Engagement Activity 2016
 
Description Keynote, Deep Learning Summit, London, UK 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Introduced the importance of body movement as an affective modality for affect-aware technology design.
Year(s) Of Engagement Activity 2016
 
Description Keynote, Disruptive, Madrid, Spain (Industry) 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Invited to give industry an understanding of affect-aware technology in five years' time.

Exposure to the Emo-Pain project and to the team.
Year(s) Of Engagement Activity 2016
 
Description Keynote, European Sensory Network (ESN) workshop, Vienna, Austria 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Body sensing technology, automatic recognition of emotions through body movement and touch, and physical rehabilitation technology.
Year(s) Of Engagement Activity 2016
 
Description Presented at Being Human Festival 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact To explore the 'sounds that move us' through multisensory science and philosophy. Featuring talks by musicians and researchers, demonstrations of sonic illusions, digital mapping, and even 'sonic shoes', this evening of talks, experiments and interactive demonstrations invites you to explore the ways in which music and sounds speak to more than just our ear: http://beinghumanfestival.org/

Run by the School of Advanced Study in partnership with the British Academy, the Arts and Humanities Research Council and the Wellcome Trust, Being Human is the UK's only national festival of the humanities. It aims to provide opportunities for non-academic audiences to experience how the humanities can inspire and enrich our everyday lives.
Year(s) Of Engagement Activity 2015
 
Description Presented at the British Pain Society Technology day 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact Presented at the Technology day for British Pain Society. Raised visibility of our research and discussion around methods.
Year(s) Of Engagement Activity 2012
 
Description Putting people at the centre of digital health. UCL Festival for Digital Health 
Form Of Engagement Activity Participation in an open day or visit at my research institution
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact About 100 people attended this event, which I co-organized and chaired at the UCL Festival for Digital Health, 30th June 2014. The event comprised short presentations and interactive demonstrations of work at our group. It sparked discussion on understanding healthcare and human error, on sensing motion/emotion in healthy and chronic pain populations, and on feedback to increase motivation, promote behaviour change and provoke changes in body perception.

Beyond the informative nature of this activity, through this event we broadened our research network and started new collaborations.
Year(s) Of Engagement Activity 2014
URL http://www.fdh.ucl.ac.uk/event/human-factors-digital-health/
 
Description Science Museum - Wearables event 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact The event generated discussion and engagement, and people reported a better understanding of wearables for conditions other than general activity tracking.
Year(s) Of Engagement Activity 2015
 
Description TEDxStMartins Talk 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? Yes
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact TEDxStMartins Talk.

The talk presented the work Berthouze's team is doing on automatic emotion recognition from body expressions. TEDx talks are by invitation only and have high visibility.

http://tedxcentralsaintmartins.com/videos/

http://www.youtube.com/watch?v=PXTrBqSw4-A
Year(s) Of Engagement Activity 2012
URL http://tedxcentralsaintmartins.com/videos/