Mapping the Cultural Landscape of Emotions for Social Interaction

Lead Research Organisation: University of Glasgow
Department Name: School of Psychology

Abstract

With rapid globalisation, cross-cultural communication is integral to modern society, with mutual understanding of emotions key to successful social interaction. Yet, some cultures misinterpret facial expressions, challenging the widely accepted 'universal language of emotion.' My research aims to understand the complexities of cross-cultural emotion communication.
Focusing on the decoding of facial expressions, I showed in Current Biology that East Asian (EA) groups misinterpret facial expressions due to a culture-specific decoding strategy that selects ambiguous eye information. Refuting universality, my work elicited worldwide interest (Discovery Channel Magazine, National Geographic, BBC News Front Page).
In Journal of Experimental Psychology: General, I then showed that Western Caucasian (WC) and EA groups mentally represent emotions using distinct expressive features. Building on these data, I used state-of-the-art facial animation technology to construct 4-D mental models of the 6 basic facial expressions of emotion in each culture. Here, I show culture-specific patterns of facial signals (e.g., EA signal emotional intensity with early eye activity) and that emotion is not universally organised into 6 basic categories. These data refute universality, highlighting the need to bridge knowledge gaps.
During the grant, I aim to bridge these gaps by conducting the following set of experiments:
1. Conceptual landscape of emotions. The 6 basic emotions are not universal, but WC-specific. Using hierarchical semantic network reconstruction tools, I will map the conceptual organisation of emotion words to reveal the main emotion categories central to social interaction in WC and EA culture (see the analysis sketch after this list).
2. Spectrum of culture-specific facial expressions. Using the emotion terms above and the 4-D facial animation platform, I will reconstruct 4-D models of a spectrum of socially relevant facial expressions in each culture. Using established analysis tools I will precisely identify the facial signals that characterise culture-specific facial expressions.
3. Cross-cultural emotion recognition. Current facial expression stimuli are reliably recognised only by WC cultures, so they do not universally represent emotion, limiting knowledge of cross-cultural emotion communication. Providing a broad spectrum of culture-specific 4-D facial expression models will greatly improve current stimuli, including the first ever set of EA-valid facial expressions. Thus, I will conduct a previously impossible, fully balanced examination of same- and other-culture facial expression recognition, using eye-tracking to identify the facial signals that support emotion recognition and those that create confusion across cultural boundaries.
4. In-group advantage. The 4-D stimuli can flexibly interchange the race of the face with cultural emotions (e.g., an EA emotion on a WC face). By isolating the effects of two group identifiers (race and cultural emotion) on emotion recognition, I will address key questions in in-group advantage theory.
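To make the planned analysis in (1) concrete, the sketch below illustrates one standard way to recover emotion categories from pairwise similarity ratings of emotion words, using hierarchical clustering in Python/SciPy. The word list, similarity values, and cluster count are illustrative assumptions, not the grant's actual materials.

    # A minimal sketch of the planned semantic-network analysis (experiment 1),
    # assuming pairwise similarity ratings between emotion words have been
    # collected in each culture. All values below are illustrative.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    emotion_words = ["happy", "sad", "fear", "surprise", "disgust", "anger"]

    # Hypothetical mean similarity ratings (1 = same meaning, 0 = unrelated).
    similarity = np.array([
        [1.0, 0.1, 0.1, 0.4, 0.1, 0.1],
        [0.1, 1.0, 0.3, 0.1, 0.2, 0.3],
        [0.1, 0.3, 1.0, 0.6, 0.2, 0.3],
        [0.4, 0.1, 0.6, 1.0, 0.1, 0.1],
        [0.1, 0.2, 0.2, 0.1, 1.0, 0.6],
        [0.1, 0.3, 0.3, 0.1, 0.6, 1.0],
    ])

    # Convert similarities to distances and build the semantic hierarchy.
    distance = 1.0 - similarity
    np.fill_diagonal(distance, 0.0)
    tree = linkage(squareform(distance), method="average")

    # Cut the tree to recover the main emotion categories for one culture;
    # repeating per culture reveals culture-specific conceptual groupings.
    labels = fcluster(tree, t=4, criterion="maxclust")
    for word, label in zip(emotion_words, labels):
        print(word, "-> category", label)

Run once per culture, the same pipeline yields directly comparable WC and EA category structures.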
Applications & Benefits. My data will have broad implications for science and society.
Science. Providing a full account of cross-cultural emotion communication will bridge knowledge gaps and re-ignite key Social Psychology topics. Improved stimuli will advance research on emotion processing at the behavioural and brain level. More broadly, Computer Science (e.g., Human-Robot Interaction) will gain practical benefits from improved models of human emotion, extending to the commercial gaming industry and the digital economy in developing avatars and companion robots.
Society. Worldwide interest in my research shows that society highly values cross-cultural interactions. Providing advice on culture-specific emotions will improve social communication, fostering healthy relations across society. My data will make a timely contribution to the rapidly evolving communication needs of society with great benefits for the economic performance of the UK.

Planned Impact

With rapid globalisation and cultural integration, cross-cultural communication is crucial in modern society. Yet, consistent cultural differences in recognising facial expressions refute notions of a universal language of emotion. My research aims to provide a comprehensive examination of cross-cultural emotion communication, with considerable impact for science and society.

SOCIAL PSYCHOLOGY. Knowledge of emotion is largely limited to a single, static set of facial expressions, restricted in emotional range and ecological validity (i.e., Western-specific; naturalistic dynamic signals absent). By providing a spectrum of 4-D culture-specific facial expression models, I will advance knowledge of emotion communication beyond the limits of the universality hypothesis. My work will have significant impact on key theories of emotion communication, raise new questions, and guide future research directions. As high-grade stimuli, the 4-D models can also address outstanding questions, including those raised by key brain-imaging studies [13].

COGNITIVE NEUROSCIENCE/VISUAL COGNITION. By demonstrating how culture shapes thought (mental models, conceptual landscapes) and biological systems (eye movements), my data will advance knowledge of the functional relationship between high-level cognition and visual systems. Culture-specific information sampling and representations also likely alter neural signals, revealing further biological differences of interest to the brain-imaging community.

CLINICAL NEUROSCIENCE. Abnormal emotion processing creates major impairments in social relations. Modelling mental representations of facial expressions would have great impact for assessment (i.e., precisely identifying the absence of facial signals) and rehabilitation (i.e., targeting deficits in emotion representation) in various groups at risk of social isolation (e.g., autism, depression, anxiety). To this aim, I will continue liaising with my collaborator, Ralph Adolphs (Caltech, USA) - a world leader in emotion processing in clinical groups.

COMMUNICATION TECHNOLOGY. Companion robots are a major focus of European research funding. Advancing knowledge of cultural emotion communication will have great impact on the development of communication avatars and companion robots (designed to recognise and express emotions) in different cultures, as reflected by the broad reach of my previous work in Computer Science [45] and Human-Robot Interaction [46].

INDUSTRIAL PARTNERSHIPS. The digital economy relies on a precise understanding of social signals for automated recognition and expression. While the 3-D movie/animation industry relies on point-light displays, my 4-D models will generalise more broadly. With my mentor, Prof. Schyns, I am now liaising with Microsoft Research Labs (http://research.microsoft.com/en-us/labs/cambridge/), Dimensional Imaging, Scotland (http://www.di3d.com/index.php), and VicarVision (SMR Groep, http://www.smr.nl/), Netherlands.

MUSEUMS/GALLERIES. My work has featured in international public forums (e.g., National Geographic, Discovery Channel Magazine, BBC News Front Page). The curator of the Hunterian Museum and Gallery plans to create an exhibit focusing on historical, biological, anthropological, medical and cultural aspects of facial expressions, using my 4-D models of facial expressions as a centrepiece. Thus, my data will continue to make central contributions to public knowledge of cross-cultural emotion communication.

SOCIETY. Cross-cultural interactions are common in local communities (e.g. schools, businesses, social services). Providing knowledge on cultural emotions will improve communication, fostering healthy relations across society. My results will make a timely contribution to the rapidly evolving communication needs of society, benefiting the economic performance of the UK.
Thus, my data will increase awareness of cultural differences, with the long-term benefits realised by applying knowledge during social interactions.

Description Facial expressions are widely considered "the universal language of emotion." Yet, many cultures misinterpret basic facial expressions, questioning their validity as universal signals. Here, we aimed to understand how different cultures signal basic emotions.

First, using a state-of-the-art Generative Face Grammar, I reconstructed 3-D dynamic models of the 6 basic facial expressions in two cultures, revealing clear cultural specificity in both facial expression signals and the conceptual organisation of basic emotions. My data raised new questions: how are emotions conceptually organised in different cultures? How do cultures represent these emotions as facial expressions?
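For illustration, the sketch below captures the reverse-correlation logic underlying this kind of model reconstruction: random combinations of facial action units (AUs) are animated, an observer categorises each animation, and each emotion's model is estimated from the AUs that predict its responses. The trial counts, AU set, and simulated responses are assumptions for demonstration only; the actual Generative Face Grammar generates photorealistic dynamic stimuli and also randomises each AU's temporal parameters.

    # A simplified sketch of reverse correlation with random action units.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, n_aus = 2400, 42      # e.g., 42 facial action units
    emotions = ["happy", "surprise", "fear", "disgust", "anger", "sad"]

    # Each trial activates a random subset of AUs (binary mask for simplicity).
    stimuli = rng.random((n_trials, n_aus)) < 0.15

    # Hypothetical observer responses: an emotion label, or None ("don't know").
    responses = rng.choice(np.array(emotions + [None], dtype=object),
                           size=n_trials)

    # Reverse correlation: an AU enters an emotion's model if it appears more
    # often on trials assigned to that emotion than expected by chance.
    for emotion in emotions:
        chosen = stimuli[responses == emotion]
        diagnostic = chosen.mean(axis=0) - stimuli.mean(axis=0)
        top_aus = np.argsort(diagnostic)[-3:][::-1]
        print(emotion, "-> most diagnostic AUs:", top_aus)

With real observer responses, the per-emotion AU profiles constitute the culture-specific dynamic models analysed below.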

To address these questions, I used semantic network reconstruction tools to map the conceptual organisation of a broad spectrum of emotions in two distinct cultures (Western and East Asian). I then used clustering methods to identify culture-specific conceptual groupings of emotions. For each emotion word in each culture, I then modelled and validated its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. I pooled the resulting models and applied a multivariate data reduction technique to reveal four culturally common facial expression patterns. I also showed that each facial expression pattern communicates specific combinations of valence, arousal and dominance. Finally, I revealed the face movements that accentuate each culturally common facial expression pattern to create complex facial expressions. These data question the widely held view that six facial expression patterns are universal, instead suggesting four latent expressive patterns, with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics.
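As a concrete illustration of the data-reduction step, the sketch below applies principal component analysis to a matrix of expression models (rows) by AU activations (columns). PCA stands in here for the multivariate reduction technique, and the matrix is simulated; the real analysis pooled the 60+ validated cultural models.

    # Illustrative latent-pattern extraction from pooled expression models.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    models = rng.random((60, 42))   # hypothetical: 60 models x 42 AUs

    pca = PCA(n_components=4)       # four latent expressive patterns
    scores = pca.fit_transform(models)
    print("variance explained:", pca.explained_variance_ratio_.round(2))

    # Each component's loadings identify the AUs that define one culturally
    # common facial expression pattern.
    for i, component in enumerate(pca.components_):
        print(f"pattern {i}: top AUs ->", np.argsort(component)[-3:][::-1])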

I also examined the dynamic nature of facial expressions of emotion using information theory and Bayesian classifiers applied to models of the six basic facial expressions of emotion. Here, I showed that dynamic facial expressions transmit an evolving hierarchy of signals over time, where early, simpler face signals support the discrimination of four emotion categories - i.e., (i) happy, (ii) sad, (iii) fear/surprise, and (iv) disgust/anger - whereas later, more complex signals support discrimination of all six emotions. My data question the widely accepted notion that human emotion comprises six basic categories, instead suggesting four. This finding won the international Innovation Award 2016 from the Social and Affective Neuroscience Society.
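The schematic below (not the published analysis) shows how a Bayesian classifier can expose this temporal hierarchy: decoding emotion from simulated face-movement features in an early versus a late time window reveals which categories remain confusable early on. The feature construction, in which fear/surprise and disgust/anger share their early signals, is an assumption built in purely for illustration.

    # Schematic: early signals discriminate 4 categories, late signals all 6.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(2)
    labels = np.repeat(np.arange(6), 100)   # 6 emotions x 100 trials

    def features(window, labels):
        # Hypothetical AU features: in the early window, fear/surprise (2, 3)
        # and disgust/anger (4, 5) transmit the same diagnostic signals.
        merged = labels.copy()
        if window == "early":
            merged[merged == 3] = 2
            merged[merged == 5] = 4
        return np.eye(6)[merged] + rng.normal(0, 0.4, (len(labels), 6))

    for window in ["early", "late"]:
        X = features(window, labels)
        clf = GaussianNB().fit(X, labels)
        print(window, "confusions:\n", confusion_matrix(labels, clf.predict(X)))

In the early window the classifier systematically confuses fear with surprise and disgust with anger, mirroring the four-category stage; in the late window all six categories separate.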

My results have broad implications for science and modern society. My data advance knowledge of emotion communication beyond the limited set of six Western-specific static face images, thereby informing emotion communication to foster healthy societal relations, including for clinical groups. My improved facial expression models provide practical benefits for Social Science research, Computer Science, Human-Robot Interaction, the commercial gaming industry, and the digital economy (e.g., the development of avatars and companion robots). Together, my data make a timely contribution to the rapidly evolving communication needs of society, with great benefits for the economic performance of the UK.
Exploitation Route My results have broad implications for science and modern society. My data advance knowledge of emotion communication beyond the limited set of six Western-specific static face images, thereby informing emotion communication to foster healthy societal relations across different cultures, including for clinical groups. My improved facial expression models provide practical benefits for Social Science research, Computer Science, Human-Robot Interaction, the commercial gaming industry, and the digital economy (e.g., the development of avatars and companion robots). Together, my data make a timely contribution to the rapidly evolving communication needs of society, with great benefits for the economic performance of the UK.

For example, my dynamic facial expression models are now being transferred to virtual humans by my international collaborator, Prof. Stacy Marsella (http://www.ccs.neu.edu/people/faculty/member/marsella/) - a world leader in the design of virtual humans - with a view to improving human user interaction. I am also applying my dynamic facial expression models to my recently acquired FurHat robot head (http://www.furhatrobotics.com/) to design social robots of the future that display realistic behaviours and reliably influence human users' social judgments.

We showcased our results and their many applications at the prestigious Royal Society Summer Science Exhibition 2015, which allowed me to disseminate my results to the public and engage their interest in psychological research.
Sectors Communities and Social Services/Policy; Digital/Communication/Information Technologies (including Software); Culture, Heritage, Museums and Collections

URL http://www.altmetric.com/details.php?citation_id=2016990
 
Description SOCIETAL/PUBLIC INTEREST/IMPACT

MEDIA COVERAGE. My work has received considerable public interest, as reflected by activity on blogs, Twitter, Weibo, Facebook, Google+, LinkedIn, and YouTube. For Jack et al., 2014 (Current Biology), see the following link for activity in each domain: http://www.altmetric.com/details.php?citation_id=2016990. The colour-coded key on the left shows the level of activity in each domain. See also the DEMOGRAPHICS tab on that page, which shows that 73% of Twitter posts were made by the public, 20% by scientists, and 5% by practitioners. My work has also featured in a number of high-profile media outlets (e.g., Time Magazine, Smithsonian Magazine, Royal Society Summer Science Exhibition) and various international news outlets (e.g., BBC, Associated Press, Press Trust of India). Jack et al., 2014 (Current Biology) was the best-performing story of the month, internationally. Below is a representative (by no means exhaustive) sample of media outlets where my work has featured. Interest from across the globe: BBC, Associated Press, Press Trust of India, Xinhua News Agency, NPR (USA), News Talk radio (Ireland), Las Ultimas Noticias (Chile), and National News London. Featured in: Time Magazine, Smithsonian Magazine, BBC Online, The Scotsman, Daily Mail, Daily Express, Metro, The Herald, Courier & Advertiser, Red Orbit, UPI, New Indian Express, The Atlantic, Yahoo (US), Geobeats, Education UK, People's Daily, Global Times, Nature World News, Times of India, New Straits Times, IBN Live, Biospace, Psychics Universe, Medical News Today, Brightsurf.com, Medical Xpress, UK Wired News, Kelowna Now, World News, WSB-TV (Atlanta), Dayton Daily News, News Channel 4 (Oklahoma), The Onion, EurekAlert, Business Standard. Together, the extensive international coverage reflects the immediate public interest in my work and its value to society.

MUSEUMS/GALLERIES. The curator of the Hunterian Museum and Gallery plans to create an exhibit focusing on historical, biological, anthropological, medical and cultural aspects of facial expressions, using my 4-D models of facial expressions as a centrepiece. Thus, my data will continue to make central contributions to public knowledge of cross-cultural emotion communication.

ACADEMIC BENEFICIARIES. As shown under "Score in Context" at http://www.altmetric.com/details.php?citation_id=2016990, Jack et al. (2014) is amongst the highest ever scored in the journal (IF: 10.227), ranked 6th among all articles of the same age, 38th (out of 3,613) among all articles in the journal, and in the 99th percentile of articles across all journals. Mendeley's readership includes Psychology, Computer and Information Science, Biological Sciences, Medicine, and Humanities. My work has been cited in a broad range of peer-reviewed journals, including brain imaging (e.g., Vecchiato et al., 2014), behavioural methods in psychology for visual (e.g., Naples et al., 2014) and auditory stimuli (e.g., Choi et al., 2014), and Artificial Intelligence (e.g., Kotsakis et al., 2014; Kalliris et al., 2014), as well as postgraduate theses in Engineering (e.g., Sandel, A., 2014), undergraduate theses in Psychology (e.g., Sheppard, 2014), and a book on communication skills for care workers (van Alphen, 2014), and has featured in the BPS Research Digest (July 2014).
Based on the success of my work, I have developed a number of international and national collaborations to build dynamic models of facial expression signals: in clinical groups such as Autism Spectrum Disorder with Prof. Ralph Adolphs (Caltech, USA) and Prof. Jim Tanaka (U. Victoria, Canada) - both world leaders in emotion processing in Autism Spectrum Disorder groups; in high suicide risk with Prof. Rory O'Connor (U. Glasgow, UK); across different cultures (e.g., Japan - Dr. Michiko Koeda, Nippon Medical School; China - Prof. Hongmei Yan, U. Electronic Science and Technology of China; Philippines - Miss Erin Mercado, The Mind Museum; Mozambique - Prof. Jose-Miguel Fernandez-Dols, U. Madrid, Spain); in other socially relevant face signals such as smile types (Prof. Paula Niedenthal - Wisconsin-Madison, USA), emotional intensity (Prof. Daniel Messinger - Miami, USA), the circumplex model of emotions (Prof. Jim Russell - Boston, USA), and pain and pleasure (Prof. Jose-Miguel Fernandez-Dols, U. Madrid, Spain); and in gamers (Prof. Daphne Bavelier - Rochester, USA/Geneva). Together, these collaborations demonstrate the immediate impact my work has had on the academic community by providing new opportunities to examine dynamic facial expression signals across a broad range of fields.

My work has also featured as a case study by Dimensional Imaging 4D (http://www.di4d.com/case-glasgow-uni.html), demonstrating how we combine computer graphics with subjective perception and psychophysics to gain a competitive edge in advancing knowledge of dynamic social signals. Our results are now being used to design virtual humans by my international collaborator, Prof. Stacy Marsella (http://www.ccs.neu.edu/people/faculty/member/marsella/) - a world leader in the design of virtual humans. I am also applying my dynamic facial expression models to my recently acquired FurHat robot head (http://www.furhatrobotics.com/) to design social robots of the future that display realistic behaviours and reliably influence human users' social judgments. We have now published these results in the Proceedings of the 13th IEEE International Conference on Automatic Face & Gesture Recognition (Chen et al., in press).

We showcased our results and their many applications at the prestigious Royal Society Summer Science Exhibition 2015, which allowed me to disseminate my results to the public and engage their interest in psychological research.
First Year Of Impact 2014
Sector Digital/Communication/Information Technologies (including Software); Leisure Activities, including Sports, Recreation and Tourism; Culture, Heritage, Museums and Collections
Impact Types Cultural, Societal

 
Description BA/Leverhulme Small Research Grants
Amount £9,735 (GBP)
Funding ID SG171783 
Organisation The Leverhulme Trust 
Sector Charity/Non Profit
Country United Kingdom
Start 09/2017 
End 09/2019
 
Description EPSRC Institutional Sponsorship Fund
Amount £20,000 (GBP)
Organisation University of Glasgow 
Sector Academic/University
Country United Kingdom
Start 06/2015 
End 06/2016
 
Description ESRC Impact Accelerator Fund
Amount £1,000 (GBP)
Organisation University of Glasgow 
Sector Academic/University
Country United Kingdom
Start 05/2015 
End 05/2016
 
Description ESRC collaborative studentships
Amount £80,000 (GBP)
Organisation Economic and Social Research Council 
Sector Public
Country United Kingdom
Start 10/2017 
End 09/2021
 
Description Early Career Researcher, Rewards for Excellence
Amount £18,990 (GBP)
Organisation University of Glasgow 
Sector Academic/University
Country United Kingdom
Start 06/2015 
End 06/2016
 
Description European Research Council Starting Grant
Amount £137,265,641 (GBP)
Funding ID FACESYNTAX - GAP-759796 
Organisation European Research Council (ERC) 
Sector Public
Country Belgium
Start 09/2018 
End 09/2023
 
Description Glasgow Knowledge Exchange Fund
Amount £2,000 (GBP)
Organisation University of Glasgow 
Sector Academic/University
Country United Kingdom
Start 05/2015 
End 05/2015
 
Title Mapping the cultural landscape of emotions for social interaction 
Description  
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
 
Description Building a Culturally Flexible Generative Model of Face Signalling for Social Robots 
Organisation Dimensional Imaging Limited
Country United Kingdom 
Sector Private 
PI Contribution We provide research questions, data collection, analysis tools, feedback on products, connections with academia, links to outreach projects (e.g., public engagement events).
Collaborator Contribution Our collaborators provide expert technical advice, and hardware and software demos.
Impact This collaboration has underpinned most of my work on modelling dynamic facial expressions, from Jack et al. (2012, PNAS) to the most recent publication (Chen et al., 2018) and a review article in Annual Review of Psychology (2017). We now hold a fully ESRC-funded 1+3 MSc+PhD collaborative studentship with DI4D. They also supported our exhibit at the 2015 Royal Society Summer Science Exhibition.
Start Year 2012
 
Description Dynamic Modelling of Face Signals of Social Smiles 
Organisation University of Wisconsin-Madison
Country United States 
Sector Academic/University 
PI Contribution I provided my collaborators - Prof Paula Niedenthal (University of Wisconsin - Madison, USA), Dr. Magdalena Rychlowska (Cardiff University, UK), and Dr. Jared D. Martin (University of Wisconsin - Madison, USA) - with methodological and analysis tools, performed data analysis, created the figures, and contributed to study design and writing of the manuscript (now published in Psychological Science).
Collaborator Contribution My collaborators provided a research question, contributed to the study design, data analysis, and writing of the manuscript. They collected the data.
Impact Manuscript published in Psychological Science.
Start Year 2013
 
Description Modelling Dynamic Facial Expressions of Pain and Pleasure Across Cultures 
Organisation Autonomous University of Madrid
Country Spain 
Sector Academic/University 
PI Contribution We contributed the study design, data collection, analysis, writing of the manuscript, and methodological tools.
Collaborator Contribution They provided a research question, contributed to the study design, writing of the manuscript, and data collection.
Impact We published a paper in PNAS and gave an oral presentation on this work at an international conference in the US.
Start Year 2015
 
Description Modelling Mental Representations of Facial Expressions of Emotions in Gamers 
Organisation University of Geneva
Country Switzerland 
Sector Academic/University 
PI Contribution We contributed to the study design and writing of the manuscript, performed data analysis, provided methodological and analysis tools, created figures.
Collaborator Contribution They provided a research question, contributed to the study design, data analysis, and writing of the manuscript. They collected the data.
Impact We have submitted a manuscript to a peer reviewed scientific journal.
Start Year 2013
 
Description Art of Possible - Rise of the Robots 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Industry/Business
Results and Impact Showcased the novel techniques I use to derive formal models of face signals. The event brings together Scotland's creatives, policy-makers, and STEM professionals to consider civic uses for world-leading emerging technologies, and is run by Glasgow City Council, The Lighthouse, Glasgow City of Science & Innovation, Cultural Enterprise Office, and Technology Scotland. The talk sparked questions, discussion, and interest, both during the session and at an interactive event afterwards.
Year(s) Of Engagement Activity 2018
URL https://www.technologyscotland.scot/events-on-ts/art-of-possible-rise-of-the-robots/
 
Description Catalyst Magazine 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Schools
Results and Impact Magazine is aimed at at teenagers studying biology at school.
Showcased our research on dynamic facial expressions across cultures.
Year(s) Of Engagement Activity 2015
 
Description European Researchers Night 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact Exhibition called 'Talking to Robots', which attracts >2,000 visitors.
Showcased our recent work on modelling facial expressions and its impact for social robotics.
Attendees learned more about our research, which generated discussion and interest.
Year(s) Of Engagement Activity 2016
 
Description Glasgow Science Festival 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Public/other audiences
Results and Impact Glasgow Science Festival exhibition on modelling dynamic facial expressions.
Attendees learned of the research and its potential impact, which generated discussion and interest.
Year(s) Of Engagement Activity 2014
 
Description Glow (Scottish Schools National Intranet) 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Schools
Results and Impact Broadcast to primary school children a demonstration of what I do as a scientist and how I do it. A Q&A followed.
Year(s) Of Engagement Activity 2015
URL https://connect.glowscotland.org.uk/
 
Description Our Dynamic Earth 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Public/other audiences
Results and Impact Exhibition called 'Making Robot Faces' at Our Dynamic Earth, which attracts >4m visitors per year.
Showcased the novel techniques I use to derive formal models of face signals.
Attendees learned about the research, which generated discussion and interest.
Year(s) Of Engagement Activity 2016
 
Description Royal Society Summer Science Exhibition 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact The University of Glasgow & Dimensional Imaging Reveal the Hidden Information in Faces.

The University of Glasgow's Institute of Neuroscience and Psychology was one of the key exhibitors at the highly prestigious Royal Society Summer Science Exhibition with its 'Face Facts' exhibit.

The Royal Society Summer Science Exhibition is an annual display of the most exciting cutting-edge science and technology in the UK. The week-long festival features 22 exhibits from the forefront of innovation. Further information about the event can be found at: http://sse.royalsociety.org/2015. Over 10,000 members of the public attended.

Showcased cutting-edge research to 12k+ attendees (including policy makers) via a 7-day interactive exhibit - 1 of 22 exhibits selected, and 1 of 4 featured. Exhibited computational methods for studying social face perception, including techniques developed with an external industrial partner (Dimensional Imaging 4D; www.di4d.com).

Using a 3D facial imaging system from pioneering Scottish technology company Dimensional Imaging (DI4D™), the exhibit highlighted how computer graphics can be used to reveal the information hidden in faces. We scanned over 700 people's faces at the event, ranging from a 10-month-old baby girl to a 100-year-old woman. Once each face was scanned and processed, participants were able to interact with their own face in 3D on the touchscreen - rotating the face to view different angles, or animating it with different expressions using facial muscle 'sliders'.

Participants could also see the location of different face movements using specific colour maps. We also used our computer graphics techniques to make composite faces with the average shape and colour information of the women, men, girls and boys we scanned. Attendees had fun while being surprised to learn that our research on dynamic facial expressions has shown that cultures interpret facial expressions differently, and they loved having their faces scanned in 3D. Participants could also upload their animations to the 'Face Facts' website, enabling them to play around with their face and even create new ones: 218 people uploaded at least one video to the website, and 476 videos in total had been uploaded by the end of the event.

I used these visible platforms to establish industrial partnerships and successfully apply my research-based knowledge of social face signals to the design of social robots via two ongoing collaborations, which I am preparing as impact stories post-REF 2020.
Year(s) Of Engagement Activity 2015
URL https://royalsociety.org/science-events-and-lectures/summer-science-exhibition/