MAPTRAITS: MACHINE ANALYSIS OF PERSONALITY TRAITS IN HUMAN VIRTUAL AGENT INTERACTIONS

Lead Research Organisation: Queen Mary University of London
Department Name: Sch of Electronic Eng & Computer Science

Abstract

Research findings suggest that personality traits such as extraversion, agreeableness, and openness to experience are tightly coupled with human abilities and behaviours encountered in daily life: emotional expression, linguistic production, success in interpersonal tasks, leadership ability, general job performance, teacher effectiveness, academic ability, as well as interaction with technology. In fact, human users tend to anthropomorphise computers and virtual agents, treating them as social beings and interpreting their behaviour much as they would in everyday human-human interactions.

Assessing people's personality is important for multiple research and business domains, such as computer-mediated staff assessment and training, human-computer interaction, and human-robot interaction. Despite a growing interest in personality traits and their effects on human life in general, and recent advances in machine analysis of human behavioural signals (e.g., vocal expressions and physiological reactions), pioneering efforts on machine analysis of personality traits have started to emerge only recently: (i) there exist only a small number of efforts based on unimodal cues such as written text, audio/speech, or static facial features; (ii) despite tentative efforts on multimodal personality trait analysis, the dynamics (duration, speed, etc.) of multiple cues, which have been shown to be important in human judgements of personality, have mostly been neglected; (iii) although personality research suggests that each trait exists in all people to a greater or lesser degree (i.e., a person can lie anywhere on a continuum ranging from introversion to extraversion), none of the proposed efforts have attempted to assess personality traits continuously in time and space (i.e., how a person can be rated along multiple trait dimensions at a given interaction time and context); and (iv) how machine (automatic) trait analysis can be utilised for personalised, social and adaptive human-virtual agent interaction has not been investigated.

Overall, both common everyday technology (e.g., personal computers, smartphones) and the more sophisticated systems people use nowadays (e.g., computer games, assistive technologies, embodied virtual agents) lack the capability of understanding their human users' personality and behaviour, and of providing socially intelligent, adaptive and engaging human-computer interaction.

To address these issues and limitations, the MAPTRAITS project will deliver a set of audio-visual tools that can analyse and predict human personality traits dynamically from multiple nonverbal cues and channels (i.e., upper body, head, face, voice and their dynamics) in continuous time and trait space. There is no prospect of building a perfect system for automatic analysis of personality traits that can be used in all possible application domains in 12 months' time. Therefore, as a proof of concept, the MAPTRAITS technology will be developed for automatic matching of virtual agent and user personalities, modelling which types of users prefer to engage with which types of virtual agents, with the aim of enhancing user engagement. The motivation for choosing this application area lies in its significance: (i) research has shown that people's attitudes toward machines and conversational agents are based on the perceived personality of the agent as well as on their own personality, and (ii) humans are social beings whose everyday life increasingly revolves around interacting with computers, virtual agents and robots, which are becoming ever more popular as companions, coaches, user interfaces to smart homes, and household robots.

Planned Impact

The MAPTRAITS project will focus on machine analysis of personality traits. It will deliver a set of audio-visual tools to analyse and predict human personality traits dynamically from multiple nonverbal cues and channels in continuous time and trait space (represented with multiple trait dimensions), in order to match virtual agents with human users.

The line of research targeted in this funding application is highly interdisciplinary, and the research outcomes can be translated to wider society in various ways. Relevant beneficiaries are identified below according to the expected term of impact.

In the short term, the output of the proposed research will benefit national and international research groups working in the areas of social signal processing, human behaviour analysis and interpretation, and the embodied and relational agent community and its applications. The most immediate application of the proposed research is in embodied conversational agent and relational agent design: addressing issues related to personalised, adaptive and engaging user-agent interaction, and providing such agents with the means to analyse and understand the nonverbal behaviour and personality of their users, to guide the agent's social interaction and personalisation ability. This in turn will improve their effectiveness in various application domains related to learning and health care, namely virtual tutoring, elderly care, cyber coaching and cyber therapy. From an interdisciplinary point of view, the knowledge gained will advance understanding of the role of personality traits in human-agent interactions.

In the medium term, a system capable of assessing human personality traits displayed in naturalistic and spontaneous settings (nonverbal behaviours expressed through facial expressions, body gestures and vocal cues) has the potential to significantly alter other existing technologies, and to become applicable and expandable to multiple domains. Examples of such domains are: (1) recommending interfaces to people (matching people's personality traits with household robots or virtual agents); (2) personal wellness technologies (prescription of exercise programmes that take the patient's personality into account, prescription of therapies for people suffering from personality disorders, and guided therapy); (3) e-learning (assessing a student's personality and engaging the student in the mode of study best suited to his or her personality); and (4) pre-employment assessment (matching personality traits with job requirements).
 
Description The most recent findings of the MAPTRAITS project, published online in Jan 2016, show that: (1) varying situational context causes the manifestation of different facets of people's personality; (2) human raters' impressions change little from one context to another for conscientiousness, openness and likeability, compared with those for agreeableness, extraversion and facial attractiveness; and (3) time-continuous prediction is better suited to the dimensions of agreeableness, conscientiousness, neuroticism, openness, vocal attractiveness and likeability (Celiktutan & Gunes, IEEE TAC 2016). In addition: (4) assessments provided by each rater vary, but we can measure and model the credibility of each rater (Joshi, Gunes and Goecke, ICPR'14); and (5) facial attractiveness perception and computation is affected not only by static facial features but also by the subject's behaviour (Kalayci, Ekenel and Gunes, ICIP'14).
We extended our research from the MAPTRAITS project to the Being There project and conducted comparative experiments with extraverted versus introverted robot conditions, showing that: (1) perceived enjoyment with the NAO robot is significantly correlated with participants' extraversion trait (which validates the similarity rule); (2) the NAO robot's perceived empathy positively correlates with participants' extraversion trait: extraverted people feel more control over their interactions and judge them as more intimate and less incompatible; (3) perceived enjoyment with the robot is highly correlated with participants' agreeableness trait; and (4) a significant relationship also exists between the robot's perceived realism and participants' neuroticism trait: people who score high on neuroticism tend to perceive their interactions as forced and strained, so artificial behaviours of the robot might appear realistic to them (Celiktutan and Gunes, IEEE RO-MAN 2016).
Exploitation Route The findings of the MAPTRAITS project are now being incorporated into the design of the EPSRC 'Being There: Humans and Robots in Public Space' project (EP/L00416X/1) for telepresence and human-robot interaction. These findings are currently being used in human-robot interaction research to create socially intelligent humanoid robots that can personalise and adapt to their users' personality traits for an appropriate level of engagement.
Sectors Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Financial Services, and Management Consultancy; Healthcare; Leisure Activities, including Sports, Recreation and Tourism

URL https://maptraits.wordpress.com/key-findings/
 
Description Adaptive Robotic EQ for Well-being (ARoEQ)
Amount £871,444 (GBP)
Funding ID EP/R030782/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 03/2019 
End 03/2024
 
Description College PhD studentship
Amount £90,000 (GBP)
Organisation Queen Mary University of London 
Department Barts and The London School of Medicine and Dentistry
Sector Academic/University
Country United Kingdom
Start 11/2012 
End 12/2015
 
Description EPSRC Digital Personhood Sandpit
Amount £400,000 (GBP)
Funding ID ECSA1N4R/S 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 10/2013 
End 09/2016
 
Description Enhancing User Experience in Retail
Amount £562,000 (GBP)
Organisation Innovate UK 
Sector Public
Country United Kingdom
Start 04/2016 
End 03/2018
 
Title MAPTRAITS Continuous Dataset 
Description The MAPTRAITS Continuous Dataset has been created for continuous prediction of traits in time and in space. The dataset comprises 30 audio-visual clips of 10 subjects taking part in dyadic interactions. The raters used an annotation tool to view each clip and to continuously provide scores over time by moving a slider between 0 and 100. Approximately 32-40 visual-only annotations per clip were obtained for the five dimensions of the Big Five model, as well as the social dimensions of engagement, likability and facial attractiveness; an additional 25 audio-visual annotations per clip were obtained for the dimensions of agreeableness, conscientiousness, openness, engagement and vocal attractiveness. (An illustrative sketch of how such continuous annotations can be aggregated follows this entry.)
Type Of Material Database/Collection of data 
Year Produced 2014 
Provided To Others? Yes  
Impact The development of this database made it possible to submit a proposal to organise the 1st International Audio/Visual Mapping Personality Traits Challenge and Workshop, and to have it accepted at the 16th ACM International Conference on Multimodal Interaction (ACM ICMI'14). The MAPTRAITS Continuous Dataset forms the basis for the Sub-challenge on Continuous Personality Prediction.
URL https://maptraits.wordpress.com/datasets/
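The following is a minimal Python sketch of how time-continuous trait annotations of this kind could be aggregated across raters. It is not the project's released tooling; the sampling step and the (time, score) layout are assumptions.

    import numpy as np

    def aggregate_traces(times, scores, step=0.04):
        """Average several raters' (time, score) traces on a shared grid.

        times:  list of 1-D timestamp arrays (seconds), one per rater
        scores: list of matching 1-D score arrays (values in [0, 100])
        step:   grid spacing in seconds (0.04 s ~ 25 fps; an assumption)
        """
        t_end = min(t[-1] for t in times)             # overlap of all raters
        grid = np.arange(0.0, t_end, step)
        resampled = [np.interp(grid, t, s) for t, s in zip(times, scores)]
        return grid, np.mean(resampled, axis=0)       # per-frame mean score

    # Toy usage with two raters annotating the same clip:
    # t = np.array([0.0, 1.0, 2.0])
    # grid, consensus = aggregate_traces([t, t],
    #                                    [np.array([50, 60, 55]),
    #                                     np.array([40, 70, 65])])

A per-frame consensus trace of this kind is one common way to obtain a single ground-truth signal per clip and dimension for time-continuous prediction.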
 
Title MAPTRAITS Quantized Dataset 
Description The MAPTRAITS Quantized Dataset comprises audio-visual interaction clips of 11 different subjects. These clips have been assessed by 6 raters along the five dimensions of the Big Five personality model, namely extraversion, agreeableness, conscientiousness, neuroticism, and openness, and the four additional dimensions of engagement, facial attractiveness, vocal attractiveness, and likability. Each dimension was scored on a Likert scale with ten possible values, from strongly disagree to strongly agree, mapped onto the range [1,10]. (A brief sketch of how such quantised scores can be summarised follows this entry.)
Type Of Material Database/Collection of data 
Year Produced 2014 
Provided To Others? Yes  
Impact The development of this database made it possible to submit a proposal to organise the 1st International Audio/Visual Mapping Personality Traits Challenge and Workshop, and to have it accepted at the 16th ACM International Conference on Multimodal Interaction (ACM ICMI'14). The MAPTRAITS Quantized Dataset forms the basis for the Sub-challenge on Quantised Personality Prediction.
URL https://maptraits.wordpress.com/datasets/
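As a minimal Python sketch (with a hypothetical data layout, not the released annotation pipeline), the quantised scores can be summarised per clip, with a simple agreement proxy computed across the 6 raters.

    import numpy as np

    def mean_scores(ratings):
        """Per-clip consensus. ratings: (n_clips, n_raters) array of
        Likert scores in [1, 10]."""
        r = np.asarray(ratings, dtype=float)
        assert np.all((r >= 1) & (r <= 10)), "scores must lie in [1, 10]"
        return r.mean(axis=1)

    def mean_pairwise_correlation(ratings):
        """Average Pearson correlation over all rater pairs: a simple
        proxy for inter-rater agreement on one dimension."""
        r = np.asarray(ratings, dtype=float)
        c = np.corrcoef(r.T)                     # (n_raters, n_raters)
        iu = np.triu_indices_from(c, k=1)        # upper triangle, no diagonal
        return c[iu].mean()

    # e.g., 11 clips rated by 6 raters on one dimension:
    # ratings = np.random.randint(1, 11, size=(11, 6))
    # print(mean_scores(ratings), mean_pairwise_correlation(ratings))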
 
Description ACM MM'14 Panel 
Organisation Australian National University (ANU)
Country Australia 
Sector Academic/University 
PI Contribution We submitted a panel proposal to the ACM Multimedia'14 Conference titled 'Looking for emotional and social signals in multimedia: Where art thou?'. The panel took place on 4 November 2014 in Orlando (USA), where prominent academics, researchers from industry, and conference participants from around the world discussed the significance of emotional and social signals for multimedia research. The panel was a great success in drawing the attention of researchers working in other fields to the analysis of personality, social signals and emotions.
Collaborator Contribution The partners worked on the proposal, contributed by suggesting topics and panellists, and helped run the panel.
Impact We were invited to launch a new area at ACM Multimedia'14 on 'Emotional and Social Signals in Multimedia' and to act as the three Area Chairs to promote this emerging area. The collaboration is multidisciplinary, as it brings together the disciplines of multimedia, audio processing, computer vision, affective computing, and social signal processing.
Start Year 2013
 
Description ACM MM'14 Panel 
Organisation Delft University of Technology (TU Delft)
Country Netherlands 
Sector Academic/University 
PI Contribution We submitted a panel proposal to the ACM Multimedia'14 Conference titled 'Looking for emotional and social signals in multimedia: Where art thou?'. The panel took place on 4 November 2014 in Orlando (USA), where prominent academics, researchers from industry, and conference participants from around the world discussed the significance of emotional and social signals for multimedia research. The panel was a great success in drawing the attention of researchers working in other fields to the analysis of personality, social signals and emotions.
Collaborator Contribution The partners worked on the proposal, contributed by suggesting topics and panellists, and helped run the panel.
Impact We were invited to launch a new area at ACM Multimedia'14 on 'Emotional and Social Signals in Multimedia' and to act as the three Area Chairs to promote this emerging area. The collaboration is multidisciplinary, as it brings together the disciplines of multimedia, audio processing, computer vision, affective computing, and social signal processing.
Start Year 2013
 
Description ACM MM'14 Panel 
Organisation Technical University of Munich
Country Germany 
Sector Academic/University 
PI Contribution We submitted a panel proposal to the ACM Multimedia'14 Conference titled 'Looking for emotional and social signals in multimedia: Where art thou?'. The panel took place on 4 November 2014 in Orlando (USA), where prominent academics, researchers from industry, and conference participants from around the world discussed the significance of emotional and social signals for multimedia research. The panel was a great success in drawing the attention of researchers working in other fields to the analysis of personality, social signals and emotions.
Collaborator Contribution The partners worked on the proposal, contributed by suggesting topics and panellists, and helped run the panel.
Impact We were invited to launch a new area at ACM Multimedia'14 on 'Emotional and Social Signals in Multimedia' and to act as the three Area Chairs to promote this emerging area. The collaboration is multidisciplinary, as it brings together the disciplines of multimedia, audio processing, computer vision, affective computing, and social signal processing.
Start Year 2013
 
Description ITU student internship 
Organisation Istanbul Technical University
Country Turkey 
Sector Academic/University 
PI Contribution QMUL provided desk and lab space for visiting students from Istanbul Technical University. I initiated the internship, provided the project idea on automatic analysis of beauty, and provided the dataset together with the relevant annotations. I closely supervised the project during the visit and actively worked on writing the conference paper.
Collaborator Contribution Sacide Kalayci from Istanbul Technical University visited my team at QMUL from July to September 2013. ITU supported the visit and allowed their students to conduct research at QMUL during the summer of 2013.
Impact The student visit enabled the development of international research collaborations between QMUL and ITU. The internship also enabled the visiting student to improve her communication and teamwork skills. One publication was jointly generated as part of this collaboration: S. Kalayci, H. K. Ekenel, and H. Gunes, "Automatic Analysis of Facial Attractiveness from Video", Proc. of the IEEE International Conference on Image Processing (IEEE ICIP'14), Oct. 2014, Paris, France, pp. 4191-4195.
Start Year 2013
 
Description MAPTRAITS Workshop 
Organisation Technical University of Munich
Country Germany 
Sector Academic/University 
PI Contribution In March 2014, we (QMUL) initiated the Mapping Personality Traits Challenge and Workshop (MAPTRAITS) series as a competition event aimed at comparing signal processing and machine learning methods for automatic visual, vocal and/or audio-visual analysis of traits and social dimensions. The 1st International Audio/Visual Mapping Personality Traits Challenge and Workshop was organised in conjunction with the 16th ACM International Conference on Multimodal Interaction on 12 Nov. 2014 in Istanbul.
Collaborator Contribution The TUM team co-organised the 1st International Audio/Visual Mapping Personality Traits Challenge and Workshop with us. They extracted informative audio and vocal features and trained models to provide baseline personality and social state prediction results, which were provided to the Challenge participants. The TUM team also contributed to the conference paper we wrote on this topic.
Impact The collaboration resulted in one conference publication: O. Celiktutan, F. Eyben, E. Sariyanidi, H. Gunes, and B. Schuller, "MAPTRAITS 2014: The First Audio/Visual Mapping Personality Traits Challenge", Proc. of the ACM Int'l Conf. on Multimodal Interaction (ACM ICMI'14), Nov. 2014, Istanbul, Turkey. The collaboration is multidisciplinary, as it brings together the disciplines of audio processing, computer vision, affective computing, social signal processing, and multimodal interaction.
Start Year 2013
 
Description UoC student visit 
Organisation University of Canberra
Country Australia 
Sector Academic/University 
PI Contribution QMUL provided desk and lab space for the visiting student. I provided the project idea, the dataset and the annotations to be used for creating an automatic personality analyser, closely supervised the project during the visit and actively worked on writing the conference paper.
Collaborator Contribution Jyoti Joshi from University of Canberra (UoC) visited my team at QMUL in the period of May-July 2013. Jyoti worked on extracting visual features and using machine learning techniques for automatic prediction of perceived personality traits using visual cues. Jyoti also worked on the conference paper we wrote on this topic.
Impact This enabled the development of international research collaborations between QMUL and UoC. The internship also enabled the visiting student to get to know London and QMUL, as well as to improve her teamwork skills. One publication was jointly generated as part of this collaboration: J. Joshi, H. Gunes and R. Goecke, "Automatic Prediction of Perceived Traits using Visual Cues under Varied Situational Context", Proc. of the International Conference on Pattern Recognition (ICPR'14), pp. 2855-2860.
Start Year 2013
 
Title Annotation Master 
Description This is a Java-based annotation tool that can be used to annotate human affective and social signals displayed in audio-visual clips in a continuous manner. Annotation Master was developed at Queen Mary University of London by Bhavisha Motichande as a final year undergraduate project. 
Type Of Technology Software 
Year Produced 2013 
Impact Annotation Master has been used for obtaining continuous annotations for the MAPTRAITS Continuous Dataset that has been subsequently used for the MAPTRAITS'14 Challenge. 
URL http://maptraits.wordpress.com/software/
 
Title PSTR 
Description Probabilistic Subpixel Temporal Registration (PSTR) implements the PSTR framework in C++ as presented in: E. Sariyanidi, H. Gunes and A. Cavallaro, Probabilistic Subpixel Temporal Registration for Facial Expression Analysis, Proc. of the Asian Computer Vision Conference (ACCV'14), Singapore, Nov. 2014 (accepted for oral presentation). 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2014 
Impact This robust and powerful registration technique is expected to improve the recognition accuracy of the techniques we have developed for both automatic personality prediction and automatic affect recognition. 
URL https://maptraits.wordpress.com/software/
 
Title QLZM 
Description QLZM implements the Quantised Local Zernike Moments (QLZM) image representation in C++ as presented in the following conference paper: E. Sariyanidi, H. Gunes, M. Gokmen, and A. Cavallaro, "Local Zernike Moment Representation for Facial Affect Recognition", Proc. of the British Machine Vision Conference (BMVC'13), Bristol, UK, Sep. 2013. QLZM has been used for visual feature extraction in the MAPTRAITS'14 Challenge. (A simplified illustrative sketch of the representation follows this entry.)
Type Of Technology New/Improved Technique/Technology 
Year Produced 2013 
Impact QLZM has been released to the research community and has already been used for extracting and representing the visual features of the MAPTRAITS'14 Challenge data. 
URL https://maptraits.wordpress.com/software/
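To give a flavour of the representation, the following is a heavily simplified Python sketch of the QLZM idea: complex Zernike moments are computed on small local patches, and the signs of their real and imaginary parts are binarised into an integer code whose histogram forms the descriptor. This is not the released C++ implementation; the patch size, the moment set, the non-overlapping grid and the single global histogram are all assumptions made for brevity (the full method in the paper includes further steps).

    import numpy as np
    from math import factorial

    def zernike_basis(n, m, size):
        """Complex Zernike polynomial V_{n,m} sampled on a size x size patch."""
        y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
        rho, theta = np.hypot(x, y), np.arctan2(y, x)
        R = np.zeros_like(rho)
        for k in range((n - abs(m)) // 2 + 1):   # radial polynomial R_{n,m}
            c = ((-1) ** k * factorial(n - k)
                 / (factorial(k) * factorial((n + abs(m)) // 2 - k)
                    * factorial((n - abs(m)) // 2 - k)))
            R += c * rho ** (n - 2 * k)
        V = R * np.exp(1j * m * theta)
        V[rho > 1] = 0                           # restrict to the unit disk
        return V

    def qlzm_histogram(img, patch=7, moments=((1, 1), (2, 2), (3, 1))):
        """Binarise local Zernike moment signs into codes and histogram them."""
        bases = [zernike_basis(n, m, patch) for n, m in moments]
        codes = []
        h, w = img.shape
        for i in range(0, h - patch + 1, patch):
            for j in range(0, w - patch + 1, patch):
                p = img[i:i + patch, j:j + patch].astype(float)
                p -= p.mean()                    # discard the local mean
                code = 0
                for b in bases:                  # two sign bits per moment
                    a = (p * np.conj(b)).sum()
                    code = (code << 2) | (int(a.real > 0) << 1) | int(a.imag > 0)
                codes.append(code)
        return np.bincount(np.asarray(codes, dtype=int),
                           minlength=4 ** len(moments))

    # e.g., qlzm_histogram(np.random.rand(49, 49)) -> 64-bin descriptor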
 
Description Keynote talk on 'Affective and Social Signal Processing for Human-Computer-Robot Interactions' at the 6th Audio/Visual Emotion Challenge and Workshop (AVEC 2016) 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Dr Hatice Gunes gave an invited keynote talk titled 'Affective and Social Signal Processing for Human-Computer-Robot Interactions' at the 6th Audio/Visual Emotion Challenge and Workshop (AVEC 2016), organised in conjunction with ACM Multimedia 2016 in Amsterdam, the Netherlands, on 16 October 2016. The purpose of the invitation and the talk was to expose the participants to research topics somewhat outside their area of expertise and the scope of the workshop: personality computing and human-robot interaction. The audience was very engaged, asking questions about HRI and offering suggestions. This keynote talk led to Dr Hatice Gunes being invited to give further seminars and talks on similar topics.
Year(s) Of Engagement Activity 2016
URL http://dl.acm.org/citation.cfm?doid=2988257.2988271
 
Description Live Demo at the IEEE Automatic Face and Gesture Recognition Conference
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact We gave a live demo with the NAO robot at the IEEE Automatic Face and Gesture Recognition Conference in Ljubljana, Slovenia. Participants interacted with the robot, one at a time, answering questions about their thoughts, feelings, and experiences. The system then predicted their personality based on the way they interacted with the robot.
This live demo received a lot of attention from the Slovenian media, both national radio and TV, as well as from other conference participants.
Year(s) Of Engagement Activity 2015
URL http://4d.rtvslo.si/arhiv/prispevki-in-izjave-slovenska-kronika/174334071
 
Description One-day lecture on 'Affective and Social Signal Processing' with hands-on group activities at the Visum 2016 Summer School 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Dr Hatice Gunes gave a one-day lecture on 'Affective and Social Signal Processing', with hands-on group activities, at the Visum 2016 Summer School on 7 July 2016. Visum 2016 was the 4th VISion Understanding and Machine intelligence (visum) Summer School, which gathered Ph.D. candidates, post-doctoral scholars and researchers from academia and industry with research interests in computer vision and machine intelligence at Universidade do Porto, Portugal. Considering the existing gap between the most fundamental concepts of computer vision and their application in real-world scenarios, the school sought to bridge these two key domains. By creating an expert multicultural environment, it aimed to foster junior researchers' awareness of computer vision topics and to enhance all attendees' knowledge of the state of the art, provided by leading international experts in the field. Visum comprised three main tracks (fundamental, industrial and application topics), each with extensive practical 'hands-on' sessions. The goal was to enable questioning and cross-fertilisation of ideas through exposure to multidisciplinary topics and tasks, potentially leading to significant breakthroughs in the development of the student participants' theses. Participants reported that this was indeed their experience during the summer school.
Year(s) Of Engagement Activity 2016
URL http://visum.inesctec.pt/visum-2016-4th-edition/#1479315639507-7900b722-1060
 
Description QMUL Research Open Day 
Form Of Engagement Activity Participation in an open day or visit at my research institution
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Industry/Business
Results and Impact I gave a lightning talk titled 'Machine Understanding of Human Social Signals' at the EECS 2015 Research Showcase at Queen Mary University of London on 22 Apr. 2015. The talk explained the methodologies we developed to make humanoid robots understand the expressions and personality of the people they interact with.
We also had a live demo in the foyer, where participants could interact with a NAO robot and see for themselves their personality as predicted by the system.
Year(s) Of Engagement Activity 2015
URL http://www.eecs.qmul.ac.uk/research-showcase-2015/demonstrations
 
Description Invited talk on 'Affective and Social Signal Processing for Human-Computer and Human-Robot Interaction' at the Hong Kong Polytechnic University (PolyU)
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Postgraduate students
Results and Impact Dr Hatice Gunes was invited to give a research seminar on 'Affective and Social Signal Processing for Human-Computer and Human-Robot Interaction' at the Hong Kong Polytechnic University (PolyU) on 22 June 2016.
Year(s) Of Engagement Activity 2016
URL https://www.comp.polyu.edu.hk/files/research_seminar_20160622.pdf