Understanding individual differences in facial emotion perception and their association with psychiatric risk indicators

Lead Research Organisation: Queen Mary University of London
Department Name: Sch of Biological & Behavioural Sciences

Abstract

We use facial expressions to communicate our emotions, but there are large differences in how people react to these expressions. At its most severe, a failure to respond appropriately to emotional expressions is associated with many clinical conditions. For example, people with high anxiety often report that expressions other people perceive as neutral appear threatening to them. Critically, the reason why these expressions look different to certain individuals remains unknown. Without this knowledge, it is very difficult to develop targeted and effective therapies for people suffering, for example, from anxiety or other disorders. The aim of this project is to understand why some people are unable to correctly respond to, or identify, particular emotional expressions, and how this relates to indicators of psychiatric risk.

We have recently created a unique toolkit that allows us to measure how emotional expressions look to different people. We will use our toolkit to allow individuals to create the facial expressions that they associate with particular emotions. These individualised emotional expressions will allow us to understand the basis of individual differences in emotional expression perception and how they are associated with psychiatric risk indicators. We will establish whether these evolved expressions improve current clinical and pre-clinical tests of "emotional processing" using standard, pre-validated tasks. The result of this work will be an improved understanding of how emotional expressions are interpreted in typical and atypical populations, new tools for better characterisation of individuals at risk of psychiatric disorders, and therefore the potential for more precise diagnosis and treatment of these disorders.

Technical Summary

The aim of this project is to understand why atypical processing of emotional facial expressions is associated with many clinical disorders. Although it is well established that atypical processing is associated with increased psychiatric risk, the cognitive basis of atypical processing is not known, and this has prevented progress in diagnosis and treatment. Knowing why some people show atypical processing requires knowing what emotional expressions look like to different people, but technical limitations have previously made it impossible to establish what an emotional expression looks like to an individual. We have now overcome this limitation by combining ground-breaking computer graphics techniques and state-of-the-art behavioural methods. The new tools use genetic algorithms that allow individuals to create the facial expressions that they associate with particular emotions. In each iteration of the algorithm, selected characteristics are bred to create a new 'generation' of facial expressions. The process is repeated until it converges on an expression associated with a particular emotion ('happy', 'sad', 'angry').
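The iterate-select-breed loop described above can be sketched in a few lines of Python. This is a hypothetical, simplified illustration only: in the real toolkit, selection is driven by participant choices, whereas here a hidden target vector stands in for a participant's internal expression template, and all names, dimensions, and rates are invented for the example.

```python
import random

random.seed(0)
DIM, POP, GENERATIONS = 10, 20, 60  # toy sizes; the real tool uses many more parameters

# Hidden vector standing in for a participant's internal expression template.
target = [random.random() for _ in range(DIM)]

def fitness(face):
    # In the real task, "fitness" is the participant selecting faces;
    # here we score similarity to the hidden target instead.
    return -sum((a - b) ** 2 for a, b in zip(face, target))

def crossover(a, b):
    # Each parameter is inherited from one of the two parent faces.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(face, rate=0.2, scale=0.05):
    # Occasionally perturb a parameter, keeping it within [0, 1].
    return [min(1.0, max(0.0, w + random.gauss(0, scale))) if random.random() < rate else w
            for w in face]

population = [[random.random() for _ in range(DIM)] for _ in range(POP)]
initial_best = max(fitness(f) for f in population)

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]                      # the "selected" faces
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]      # bred into a new generation
    population = parents + children

best = max(population, key=fitness)
```

After enough generations the best face drifts toward the hidden template, mirroring how the toolkit converges on the expression a participant associates with a given emotion.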

We propose to use these 'evolved' faces to gain knowledge about emotion expression recognition in typical and high-trait populations, thereby improving clinical and pre-clinical studies of emotional processing. We will use the new tools to allow participants to create their own facial expressions. We will then use machine-learning techniques to quantify and characterise the facial configurations created by typical individuals and by individuals with high trait anxiety or psychopathy. We will use these evolved expressions to establish the role of perceptual differences in atypical emotional processing in high-trait groups, by using them in conventional clinical and pre-clinical emotional tests instead of standard faces. We will ensure maximum uptake of our toolkit by making it open access and easy to use.

Planned Impact

Our work will examine the source of individual differences in how people respond to emotional expressions, using new tools to uncover the mechanisms (perceptual or response) involved. This will be of substantial interest to a large proportion of the scientific and clinical community, and we will publicise our work through invited national and international talks by the PIs and PDRAs. To ensure maximum impact, both PDRAs will receive training throughout by (a) presenting at leading international conferences, (b) making multiple cross-site visits, allowing them to acquire a broad set of skills in disparate areas that will place them at a unique advantage for their future careers, (c) contributing to our two workshops at QMUL, and (d) meeting with external experts and our steering committee, whose end users' needs will inform our toolkit development.

This project will advance our knowledge about how emotional expressions are perceived and interpreted in humans, with important repercussions for the understanding of emotion processing and mental health. To achieve these goals we will build on our new tools that allow, for the first time, an individualised assessment of emotion processing. The outcomes of the proposed work will provide unique insights into typical and atypical emotion processing, and we have identified two main targets to achieve impact beyond the academic community.

Societal:
Mental health problems are on the rise, and understanding the mechanisms underlying some of these problems is critical for the development of precision treatments / therapies and will inform the development of mechanistic models of emotion / affective processing. We will ensure that our results feed into practice and policy sectors via our workshops, interactions with the Steering Committee, and co-I Viding, who chairs the Child and Youth Mental Health Research Strategy for the Department of Health. Notably, the assessment of disordered populations would benefit from greater precision regarding the underlying processing biases that contribute to symptom development. This increased precision would enable more personalised approaches for prevention and treatment.

Digital technology:
The new tools that we will develop and distribute represent a significant advance on previous methods for studying and classifying emotion processing. To promote widespread uptake by the scientific and medical communities as well as educators and industry, we will advertise the toolkit at conferences, at our workshops, and during invited talks. The toolkit will be an open-access, stand-alone, easily downloadable application that will promote transparency and replication by standardising methodology. We have identified the following primary beneficiaries of these new tools:
1) Scientific Community: Current methods for testing participants require preconfigured images of actors. Our toolkit will allow a rapid, individualised approach such that participants can create the facial expressions that match their internalised representations. Importantly, these tailored stimuli can then be used in any experiment. Our method is versatile (we can create tailored facial expressions using different identities), and we therefore expect it to be taken up in a wide range of research areas, from prosopagnosia (impaired face recognition) to research investigating human-computer interaction.
2) Healthcare Community: Clinical tests are limited by the amount of time available with individual participants. The new toolkit will allow individualised emotional stimuli to be created very quickly and will therefore be particularly well suited for use by children and clinical populations where time and participant interest are often limited.
3) Industrial: Our toolkit is likely to have wide impact in broader society and industry. The procedures that we will develop can be used to guide the design of robots or avatars, such as automated carers, that replicate emotional expressions to promote acceptability.

Publications


Binetti N (2022) Genetic algorithms reveal profound individual differences in emotion recognition. In Proceedings of the National Academy of Sciences of the United States of America.

 
Description Adapting a transformational education intervention in a humanitarian crisis.
Amount £359,490 (GBP)
Organisation The British Academy 
Sector Academic/University
Country United Kingdom
Start 03/2020 
End 03/2022
 
Description CAMERA 2.0
Amount £3,401,653 (GBP)
Funding ID EP/T022523/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 11/2020 
End 10/2025
 
Description European Regional Development Fund - Research and Innovation: call in West of England (OC37R17P 0696)
Amount £1,801,368 (GBP)
Funding ID 37R18P02612 
Organisation European Union 
Sector Public
Country European Union (EU)
Start 12/2019 
End 12/2022
 
Description Leverhulme Trust
Amount £299,589 (GBP)
Funding ID RPG-2020-024 
Organisation Queen Mary University of London 
Sector Academic/University
Country United Kingdom
Start 01/2021 
End 01/2024
 
Title Improved GA tool for facial expression evolution 
Description Method for creating facial expressions with a simple user interface 
Type Of Material Improvements to research infrastructure 
Year Produced 2020 
Provided To Others? Yes  
Impact Multiple papers published / in press / in preparation. 
 
Title Data set stored on Open Science Framework 
Description This is GA data collected on 293 people and forms the basis of a paper currently under review at PNAS. Each participant evolved four facial expressions using our new (GA) toolkit, and their data are stored as the blend shape weights (150 points that define the face). 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact This dataset demonstrates that the GA toolkit that we developed produces realistic faces that can be used for research or experimental purposes. 
URL https://osf.io/dyfau/
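To make the stored format concrete, the sketch below illustrates how blend-shape weights parameterise a face: each weight scales a displacement ("delta") added to a neutral mesh. The shape names, the toy three-coordinate "mesh", and all numbers here are invented for illustration; the dataset itself stores 150 such weights per evolved face.

```python
# Toy 1-vertex "mesh" (x, y, z); a real mesh has thousands of vertices.
NEUTRAL = [0.0, 0.0, 0.0]

# Hypothetical blend shapes: each is a displacement from the neutral mesh.
DELTAS = {
    "smile":      [0.2, 0.1, 0.0],
    "brow_raise": [0.0, 0.3, 0.1],
}

def apply_blendshapes(neutral, deltas, weights):
    # Final face = neutral mesh + sum of weighted displacements.
    face = list(neutral)
    for name, w in weights.items():
        for i, d in enumerate(deltas[name]):
            face[i] += w * d
    return face

face = apply_blendshapes(NEUTRAL, DELTAS, {"smile": 0.5, "brow_raise": 1.0})
```

Under this scheme, a single vector of weights fully specifies an expression, which is why each participant's evolved face can be archived compactly as 150 numbers.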
 
Title Facial Evolution Tool 
Description We have created software that allows users to create faces using genetic algorithms. Moving forward, we will allow the software to be used on a web platform for mass user testing. Open-source versions will follow later in the project. 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2020 
Impact Multiple papers under preparation 
 
Description Centre for Brain and Cognitive Development Birkbeck invited talk 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Postgraduate students
Results and Impact 30 postgraduate students and approximately 5 academics attended this talk about the GA toolkit and how it could be used to measure emotion recognition in children.
Year(s) Of Engagement Activity 2022
URL https://cbcd.bbk.ac.uk/events/seminar_series_timetable
 
Description Dissemination event 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other audiences
Results and Impact This activity was part of our planned final dissemination event / workshop. We invited local and international speakers to talk about new developments in emotion research and used the opportunity to present the outputs of the grant (notably the new software on the OSF website) as well.
Year(s) Of Engagement Activity 2023
URL https://www.eventbrite.co.uk/myevent?eid=460162657897
 
Description Keynote Speaker - ACM CVMP 2021 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Keynote talk at ACM conference
Year(s) Of Engagement Activity 2021
URL https://www.cvmp-conference.org/2021/keynotes/
 
Description Microsoft Invited Talk 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Invited talk at Microsoft on Digital Humans for Mixed Reality
Year(s) Of Engagement Activity 2021
 
Description University of Leuven invited talk 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Postgraduate students
Results and Impact The purpose was to make people aware of our new method to measure emotion recognition (available on the OSF website).
Year(s) Of Engagement Activity 2022