Analysing Dynamic Change in Faces

Lead Research Organisation: Queen Mary University of London
Department Name: Computer Science

Abstract

Humans are very good at understanding and interpreting the motion of other people's faces. We can effortlessly recognise emotions and interpret subtle facial behaviours such as sardonic smiles, thoughtful frowns or questioning looks, but the question remains: how do we do this? We need new computer-based tools to explore this fascinating area of psychology. In this project we will develop a new form of three-dimensional camera system that will allow us to record the movements of people's faces and then process this video information to discover the components of movement that make them up. Once we can discover the parts of movement that add together to make familiar facial expressions, we can use this to create new faces; in much the same way as a music mixing desk allows you to blend together different sounds, we will have software that allows us to mix new faces with whatever expressions we select. Using this new tool we can then carry out experiments to look at how we process faces and imitate other people's facial movement. We will examine how observing the movement in one person's face can be translated into movements of our own face to imitate the action. Because the faces we use are created in the computer, we can manipulate them in any way we like. This new technology will allow us to address a large set of basic questions. Can we imitate a person if the face is seen only from the side, or if it is shown upside down? Do we do better when we imitate ourselves, a friend or a stranger? We can even create caricatures of faces, where we exaggerate particular movements, to evaluate how these facial gestures are represented in the human face processing system. A better understanding of how imitation works will help us understand social behaviours and their development, and also help in developing computer systems that can both recognise and react to our facial expressions. The new face mixing software will also have commercial applications, for example in the computer games and entertainment industry. Movements from one person's face can be used as instructions to make another person's face perform the same movement. This will allow, for example, a voice actor to control the movements of a character's face in addition to providing the expressive dialogue, the generation of high-quality realistic synthetic actors, or faster, more efficient video conferencing over a mobile phone.

Publications

Description The objectives were: to develop new tools for 3D facial motion capture and analysis; to characterise modes of variation in meaningful segments of facial motion; to use artificially generated stimuli and novel measurement techniques to enable psychophysical experiments exploring the perception of facial action and the imitation of human facial motion; and to deliver interdisciplinary, science-based educational outreach to UK schools.

Key findings: To build a working system we developed a new approach to image motion analysis that characterises the bright-dark, yellow-blue and red-green opponent channels of the human colour vision system as chromatic derivatives. We incorporated these chromatic derivatives into our existing spatio-temporal brightness derivative method for motion and binocular disparity calculation and demonstrated improved performance.
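As a concrete illustration, the sketch below pools the standard gradient constraint from all three opponent channels in a least-squares patch-flow estimate. It is a minimal sketch, not the project's implementation: the RGB-to-opponent matrix, the patch size and the equal weighting of channels are assumptions made for the example.

```python
import numpy as np

# Illustrative RGB -> opponent transform (not the published coefficients).
OPPONENT = np.array([[1/3,  1/3,  1/3],   # bright-dark (luminance)
                     [1/2, -1/2,  0.0],   # red-green
                     [1/4,  1/4, -1/2]])  # yellow-blue

def opponent_channels(rgb):
    """Map an HxWx3 RGB frame into three opponent channels (HxWx3)."""
    return rgb @ OPPONENT.T

def patch_flow(frame0, frame1, y, x, r=7):
    """Least-squares motion for one patch, pooling the gradient constraint
    Ix*u + Iy*v + It = 0 over the pixels of all three opponent channels."""
    a, b = opponent_channels(frame0), opponent_channels(frame1)
    A_rows, rhs = [], []
    for c in range(3):
        p0 = a[y - r:y + r + 1, x - r:x + r + 1, c]
        p1 = b[y - r:y + r + 1, x - r:x + r + 1, c]
        iy, ix = np.gradient((p0 + p1) / 2.0)   # spatial derivatives
        it = p1 - p0                            # temporal derivative
        A_rows.append(np.stack([ix.ravel(), iy.ravel()], axis=1))
        rhs.append(-it.ravel())
    A = np.vstack(A_rows)
    (u, v), *_ = np.linalg.lstsq(A, np.concatenate(rhs), rcond=None)
    return u, v  # horizontal and vertical motion, pixels per frame
```

One intuition for the improved performance is simply that the chromatic constraints give the solver more, partly independent, measurements per patch.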

We built PCA spaces across individuals rather than across expressions to investigate "family resemblance" between different classes of face. Using a novel technique, we mapped the vector representing a male face's deviation from the male mean into a female face space, producing a female "sibling". We showed that these "sibling pairs" looked more alike than random pairings, indicating that "family resemblance" may be encoded by similar vectors referenced to the averages of classes of faces.
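A minimal sketch of this mapping, assuming faces are vectorised into fixed-length arrays; the function names and the use of an SVD basis are illustrative assumptions rather than the project's actual pipeline.

```python
import numpy as np

def build_space(faces):
    """Build a PCA 'face space' from an (n_faces, n_dims) array:
    returns the mean face and the principal-component basis (rows)."""
    mean = faces.mean(axis=0)
    _, _, basis = np.linalg.svd(faces - mean, full_matrices=False)
    return mean, basis

def female_sibling(male_face, male_space, female_space):
    """Express a male face's deviation from the male mean in PCA
    coordinates, then replay that deviation about the female mean."""
    m_mean, m_basis = male_space
    f_mean, f_basis = female_space
    coeffs = m_basis @ (male_face - m_mean)   # deviation, male coordinates
    k = min(len(coeffs), f_basis.shape[0])    # shared number of components
    return f_mean + f_basis[:k].T @ coeffs[:k]
```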

The same technology can be used to visualise our prejudices. We found that the average faces of Conservative and Labour MPs were indistinguishable. However, average faces rated as strongly Labour or strongly Conservative did look distinctively different, and were correctly matched to their stereotypical category by participants in a follow-up experiment.

We tested the ability of participants to identify a facial action projected onto a computer graphic avatar as being generated by themselves, a friend or another person. If recognition were based on visual experience, participants should find it easiest to recognise a friend. However, we found participants could recognise both themselves and their friends from the motion alone when the face was upright, but only themselves when it was upside down. Disrupting the timing of the motion showed that self-recognition was based on rhythmic cues we have about our own facial motion.
Exploitation Route We have developed links with the SME Black Swan Data and have licensed an element of the above research for development into a product. A secondment of an RA was made possible with support from an EPSRC Impact Acceleration Account award to the College.

The technological background of 'Building the Emotion Machine' is based on the EPSRC-funded research 'Analysing Dynamic Change in Faces - EP/F037384/1' by McOwan, using biologically inspired methods and computer vision techniques to develop systems capable of tracking faces and recognising facial expressions, and so supporting affective computing applications. This work was extended to extract contextual cues and visual features of a user, allowing detection of their level of engagement within a human-machine interaction system.
Building on a 3D distribution of extracted facial features, we can track a face and recover the valence of emotion and contextual interest to infer the user's emotional state, and appropriately modify the system's contextual behaviour to enhance interaction quality. The feeling index fuses data on facial expression, head direction and contextual cues from the interaction to predict the user's engagement accurately and robustly. The system has been successfully tested in long-term interactions in real social scenarios as part of the Framework 7 project LIREC (Living with Robots and Interactive Companions), where an extended empirical evaluation was undertaken in real social settings to validate the theory, technology framework and outputs developed.
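As a hedged illustration of the fusion step only, the sketch below combines per-frame cues into a single engagement score; the cue names, value ranges and weights are assumptions for the example, not the published feeling index.

```python
from dataclasses import dataclass

@dataclass
class FrameCues:
    expression_valence: float  # -1 (negative) .. +1 (positive)
    head_on_target: float      # 0 .. 1, head/gaze directed at the system
    context_interest: float    # 0 .. 1, task-level interest cue

def feeling_index(cues: FrameCues,
                  w_valence: float = 0.4,
                  w_head: float = 0.3,
                  w_context: float = 0.3) -> float:
    """Weighted fusion of per-frame cues into one engagement score in [0, 1]."""
    score = (w_valence * (cues.expression_valence + 1) / 2  # rescale to 0..1
             + w_head * cues.head_on_target
             + w_context * cues.context_interest)
    return max(0.0, min(1.0, score))
```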

Proof Of Concept App
• An iOS iPad application
• Face detection/tracking with discrete and continuous affective state detection.
• Measured the effectiveness and accuracy of the face detection solution in detecting defined Action Units (AUs).
• Built the company's understanding of the background technology to explore potential methodologies and commercial proof-of-concept demos.
• Developed an application to demonstrate the technology to potential clients.
• The POC was also used for demonstration purposes at science outreach events.
• The accuracy and reliability of the face detection were enhanced.
The Affective Sensing & Visualiser POC work delivered:
• iOS iPad application
• A proof-of-concept app demonstrating: a GUI enabling detection of users' affective states whilst watching video content; collection and aggregation of discrete and continuous state data; export of affective data to a backend platform; import of affective data from the backend platform to a secondary application; and creative design and visualisation of affective data.
• Demonstrated the POC to potential commercial brands to gather feedback.
• Internal evaluation of the POC, implementation of a new face detection solution (Visage Technologies) and updated AU detection.
Outcome
• Developed the POC applications to demonstrate the technology, with demonstrations to a film company and high street retail chains.
• Developed a better understanding of potential routes to market and strategic approaches for developing further POC applications.

3. Immediate impact
This section describes the immediate impact of the project on users and stakeholders compared to how things were before the project.

The project has increased awareness within Black Swan Data (including its clients) of face detection and affective sensing technology as an important area of innovation. Development and collaboration are ongoing.
Sectors Creative Economy, Digital/Communication/Information Technologies (including Software), Education, Leisure Activities (including Sports, Recreation and Tourism), Manufacturing (including Industrial Biotechnology), Culture, Heritage, Museums and Collections, Retail, Security and Diplomacy

 
Description Ideas first developed in this project for systems tracking facial feature expressions are currently under development via a licensing deal with a company interested in this face and emotion tracking method. The system has undergone user tests with a number of companies, who have provided feedback to help enhance the user interface. Further funding from the EPSRC Impact Acceleration Account has allowed a secondment of an RA to the company to develop the commercial version of the system.
First Year Of Impact 2013
Sector Creative Economy, Digital/Communication/Information Technologies (including Software), Retail
Impact Types Economic

 
Description Royal Society of London
Amount £7,500 (GBP)
Funding ID IE110952 
Organisation The Royal Society 
Sector Charity/Non Profit
Country United Kingdom
Start 03/2012 
End 03/2014
 
Title IMAGE PROCESSING METHOD, APPARATUS AND COMPUTER PROGRAM 
Description The present invention provides a novel algorithm for salience detection based on a dual-rail antagonistic structure to predict where people look in images in a free-viewing condition. The proposed algorithm can be applied effectively to both still and moving images in visual media in real time, without any parameter tuning. (An illustrative sketch of a generic dual-rail operator follows after this record.)
IP Reference WO2018142110 
Protection Patent application published
Year Protection Granted 2018
Licensed Commercial In Confidence
Impact Existing impact (since August 2013): Supported by initial funding from the College's EPSRC-funded Impact Acceleration Account, Hamit Soyel undertook a secondment at Black Swan Data to help develop the product. Dragonfly soon took off within the company, attracting considerable client interest. As a result, Black Swan Data quickly had to create a dedicated product team, with me becoming a key member. The team is made up of Development, Quality Assurance, Technical Architecture, Product
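For intuition about the dual-rail antagonistic structure named in the patent description above, the following sketch implements a generic dual-rail (ON/OFF) centre-surround operator; it is emphatically not the patented algorithm, and the Gaussian scales are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dual_rail_saliency(gray, sigma_centre=2.0, sigma_surround=8.0):
    """Split the centre-surround response into antagonistic ON and OFF
    rails, then recombine them into a single normalised saliency map."""
    centre = gaussian_filter(gray.astype(float), sigma_centre)
    surround = gaussian_filter(gray.astype(float), sigma_surround)
    diff = centre - surround
    on_rail = np.maximum(diff, 0.0)    # brighter-than-surround structure
    off_rail = np.maximum(-diff, 0.0)  # darker-than-surround structure
    saliency = on_rail + off_rail
    return saliency / (saliency.max() + 1e-9)  # normalise to [0, 1]
```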