Performance based expressive virtual characters

Lead Research Organisation: Goldsmiths University of London
Department Name: Computing Department

Abstract

Creating believable, expressive interactive characters is one of the great, and largely unsolved, technical challenges of interactive media. Human-like characters appear throughout interactive media, virtual worlds and games and are vital to the social and narrative aspects of these media, but they rarely have the psychological depth of expression found in other media. This proposal is for research into a new approach to creating interactive characters, which identifies the central problem of current methods as the fact that creating the interactive behaviour, or Artificial Intelligence (AI), of a character is still primarily a programming task, and therefore in the hands of people with a technical rather than an artistic training. Our hypothesis is that actors' artistic understanding of human behaviour will bring an individuality, subtlety and nuance to the character that would be difficult to create in hand-authored models. This will help interactive media represent more nuanced social interaction, thus broadening their range of application. The proposed research will use information from an actor's performance to determine the parameters of a character's behaviour software. We will use motion capture to record an actor interacting with another person. The recorded data will be used as input to a machine learning algorithm that will infer the parameters of a behavioural control model for the character. This model will then be used to control a real-time animated character in interaction with a person. The interaction will be a full-body interaction involving motion tracking of posture and/or gestures, and voice input. In entertainment this method will enable more social genres and help broaden the currently limited demographic. It will also enable a number of new applications in education, rehabilitation, media and marketing.
Putting actors in charge of creating character AI will also make production pipelines more efficient by requiring less input from programmers. This project is timely in that it brings together a number of active and developing research fields, including expressive virtual characters, motion capture based animation and machine learning. It has the potential to transform current research in expressive virtual characters and to present new research problems for machine learning and motion capture based animation. It is novel in that it proposes a fundamentally new approach to creating interactive characters, and it combines disciplines such as animation, statistical machine learning, performance, affective computing, human-computer interaction and psychology. The use of both machine learning and performance for virtual characters is particularly novel.

Planned Impact

Who will benefit? The primary beneficiaries outside of academia will be the creators of 3D interactive media, particularly computer games. Users of similar technology in specific areas such as education, psychotherapy and the arts will also benefit, as will academics in areas such as computer animation and virtual reality.

How will they benefit? There are two major benefits of the proposed method: 1. Integrating artists more directly into the content pipeline for character AI, thus improving the efficiency of the production process. 2. Creating more expressive and subtle behaviour in characters, thus increasing the range of genres and markets available to interactive media.

What will be done to ensure they benefit? Our strategy will be threefold: 1. Knowledge dissemination: primarily through academic journals and conferences, but also potentially directly to industry through training programmes (Gillies has participated in training schemes for Electronic Arts). 2. Further development: as the proposed research is still new, it is unlikely that the results will be ready for commercial exploitation by the end of this project. Therefore, if the results are positive, further funding will be sought to develop the proposed method in an academic research context. 3. Direct exploitation: in the longer term, when the method is sufficiently developed, commercial exploitation will be sought. The most likely route will be through a partnership with an existing company working in games or interactive media, via knowledge transfer or commercialisation funding. Intermediary software may also be released to the community if this does not conflict with commercial exploitation.

Publications

 
Description This project has developed a new approach to designing video game characters that can respond to our body movements and body language. Rather than trying to program explicit rules for behaviour, which would make it hard to capture the subtleties of body language, our software allows people to design movements directly by moving and interacting.



We have developed two game environments in which people can customize characters' behaviour based on their own movements.



The first is a 3D version of the classic video game Pong, in which players control the paddles and their avatars with their own movements. They were able to customize their avatars' responses to winning or losing points based on their own movements. In our user tests, participants found that performing the actions themselves helped them understand the game better and made it easier for them to design the avatar's responses.



The second example is a 3D character that responds to a player's body movements via the Microsoft Kinect motion tracking system. The character's responses are designed based on motion capture of a real interaction between people. Two people play the roles of the video game character and the player, showing how the character should respond by acting out the movements themselves. This allows them to design movements in a natural way, by moving, rather than having to think about mathematical rules. The motions of both participants are recorded and synchronized. This data is then used as input to a machine learning algorithm, which learns a model for automatically controlling a video game character so that it responds in the same way as the people who designed it.
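The record-synchronize-learn pipeline described above can be sketched in a much simplified form. The snippet below stands in for the learning step with a plain nearest-neighbour lookup over synchronized pose pairs; the pose features, class names and the choice of learner are illustrative assumptions, not the project's actual method or data:

```python
# Sketch: learning a character controller from synchronized motion data
# of two performers. Frame i of the "player" performer's recording is
# paired with frame i of the "character" performer's response. At
# runtime the character responds to a live player pose by recalling the
# response recorded for the most similar training pose.

def distance(a, b):
    """Euclidean distance between two pose feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class NearestNeighbourController:
    def __init__(self, player_poses, character_poses):
        # Synchronized recordings: one (player pose, character response)
        # pair per frame.
        self.pairs = list(zip(player_poses, character_poses))

    def respond(self, live_pose):
        # Recall the character pose recorded for the closest player pose.
        _, response = min(self.pairs, key=lambda p: distance(p[0], live_pose))
        return response

# Toy data: 2-D "pose features" (e.g. lean, arm height) per frame.
player = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
character = [(0.1, 0.1), (0.9, 0.2), (0.2, 0.8)]
controller = NearestNeighbourController(player, character)
print(controller.respond((0.9, 0.1)))  # closest player pose is (1.0, 0.0) -> (0.9, 0.2)
```

A real system would use richer pose features from the Kinect skeleton and a temporal model rather than per-frame lookup, but the structure, demonstrated pairs in, a responsive controller out, is the same.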



This style of design is particularly well suited to actors and performers, who have a deep understanding of movement and body language. We conducted an in-depth case study with physical theatre performer Emanuele Nargi, who used our software to design an interactive character based on his interactions with a number of members of the public.
Exploitation Route The use of interactive machine learning could be integrated within Virtual Reality development tools such as Unity3D. I am currently working on a project to do so. This would provide better tools for SMEs to create interactive, social VR experiences, creating better experiences more cheaply.
Sectors Creative Economy, Digital/Communication/Information Technologies (including Software), Education

 
Description PRAISE: Practice and peRformance Analysis Inspiring Social Education
Amount £2,500,000 (GBP)
Funding ID 318770 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 09/2012 
End 09/2015
 
 
Description Research project grant
Amount £314,885 (GBP)
Organisation The Leverhulme Trust 
Sector Charity/Non Profit
Country United Kingdom
Start 01/2017 
End 12/2020
 
Title Gestyour 
Description Gestyour is a plugin for the Unity3D game engine that aims to make designing movement and gesture interfaces easier. It uses interactive machine learning with a visualisation to support users in debugging their interfaces. 
Type Of Technology Software 
Year Produced 2014 
URL https://www.doc.gold.ac.uk/~mas02mg/gestyour/
 
Title HMMWeka 
Description A Hidden Markov Model Library for the Weka Machine Learning Toolkit 
Type Of Technology Software 
Year Produced 2011 
URL http://doc.gold.ac.uk/~mas02mg/software/hmmweka/
 
Description A variety of talks at Games industry events 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Industry/Business
Results and Impact I gave a number of talks at games and VR industry events that included research from the project. They were held at PlayHubs (Somerset House), Bossa Studios and the DevelopVR conference.
Year(s) Of Engagement Activity 2016
URL http://www.developvr.co.uk
 
Description Body Language Based Gameplay 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact A short documentary film was produced about the final study of the research project, which involved working with physical performer Emanuele Nargi. The video has attracted considerable attention: as of December 2012 it had 1,971 views on YouTube, rising to 2,518 views by October 2014. It has been profiled on a number of popular technology sites, including Wired, Develop (the UK games development magazine) and Kurzweil Accelerating Intelligence.

This has provided both a means to reach a wide audience (particularly via articles in various online media) and a way of easily communicating the research to other researchers and possible collaborators. For example, presenting part of the video at the recent London Virtual Social Interaction workshop (September 2014) resulted in two possible new collaborative projects.
Year(s) Of Engagement Activity 2012
URL https://www.youtube.com/watch?v=2nqhSwhsWOs
 
Description Virtual Reality MOOC 
Form Of Engagement Activity Engagement focused website, blog or social media channel
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact We developed a MOOC (Massive Open Online Course) about Virtual Reality. This aimed to be an introduction to the field of virtual reality and covered general topics, including interactive virtual characters. While the aim was a broad overview, we were able to include outcomes from this project in the curriculum.
Year(s) Of Engagement Activity 2017,2018
URL https://www.coursera.org/specializations/virtual-reality
 
Description Wired Article: Actors teach game characters the subtleties of body language 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? Yes
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Following a press release by Goldsmiths and the dissemination of a video of the work, a number of articles were published online about the results of the research project. The most notable, resulting from an interview given by Marco Gillies, appeared in the popular technology magazine Wired:

http://www.wired.co.uk/news/archive/2012-08/15/goldsmiths-motion-behaviour

This enabled the research to reach a wide audience.
Year(s) Of Engagement Activity 2012
URL http://www.wired.co.uk/news/archive/2012-08/15/goldsmiths-motion-behaviour