Topology-based Motion Synthesis

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Informatics

Abstract

One of the major drivers of research in humanoid robotics is the desire to achieve motions involving close contact between robots and the environment or people, such as carrying an injured person or handling flexible objects such as the straps of a knapsack or clothes. Currently, these applications lie beyond the ability of existing motion synthesis techniques due to the underlying computational complexity in an open-ended environment. Traditional methods for motion synthesis suffer from two major bottlenecks. Firstly, a significant amount of computation is required for collision detection and obstacle avoidance in the presence of numerous close contacts between manipulator segments and objects. Secondly, any particular computed solution can easily become invalid as the environment changes. For instance, if the robot were handling an object such as a knapsack, even small deformations of this flexible object and minor changes in object dimensions (e.g., between an empty bag and a stuffed bag) might require complete re-planning under the current way of solving the problem.

Similar issues arise in computer animation, where there is a need for real-time control of characters, moving away from static sequences of pre-programmed motion. Although this world may seem much more contained, since it is created by an animation designer, there is in fact a strong desire to create games and simulation systems where users interact with the world continually and expect the animation system to react accordingly. This calls for the same kind of advances in motion synthesis techniques as outlined above.

The fundamental problem lies in the representation of the state of the world and the robot. Typically, motion is synthesized in a complete configuration or state space represented at the level of generalized coordinates, enumerating all joint angles and their 3D location/orientation with respect to some world reference frame. This implies the need for large amounts of collision checking and randomized exploration in a very large search space. Moreover, it is very hard to encode higher-level, semantic specifications at this level of description, as the individual values of the generalized coordinates tell us nothing unless further calculations are carried out to ensure satisfaction of the relevant constraints. This is particularly inconvenient when searching for a motion in a large database.

The focus of this research is to alleviate these problems by developing methods that exploit the underlying topological structure of these problems, e.g., in the space of postures. This allows us to define a new search space whose coordinates are based on topological relationships, such as those between link segments; we refer to this space in terms of 'topology coordinates'. In preliminary work, we have shown the utility of this viewpoint for efficient motion synthesis with characters that are in close contact, and we have demonstrated that this approach is more efficient for categorizing semantically similar motions. In this project, we will develop a more general framework of such techniques that will be applicable to a large class of tasks carried out by autonomous humanoid robots and virtual animated characters. Moreover, we will implement our techniques on industrially relevant platforms, through our collaborators at Honda Research Institute Europe GmbH and Namco Bandai, Japan.
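To give an intuition for what a topological coordinate between two link segments can look like, the sketch below numerically approximates the Gauss linking integral between two polylines: a single number that measures how much the curves wind around each other, independent of the exact joint-angle values. This is a hypothetical illustration written for this summary, not the project's implementation; the function name and discretization scheme are our own assumptions.

```python
import numpy as np

def gauss_linking_integral(curve_a, curve_b):
    """Approximate the Gauss linking integral between two polylines,
    each given as an (N, 3) array of 3D points.

    For two closed, non-intersecting curves the value converges to the
    integer linking number; for open chains (e.g. limb segments) it gives
    a continuous measure of 'tangledness'."""
    # Midpoints and tangent vectors (direction scaled by segment length)
    mid_a = 0.5 * (curve_a[:-1] + curve_a[1:])
    mid_b = 0.5 * (curve_b[:-1] + curve_b[1:])
    d_a = curve_a[1:] - curve_a[:-1]
    d_b = curve_b[1:] - curve_b[:-1]

    total = 0.0
    for i in range(len(d_a)):
        r = mid_a[i] - mid_b                 # (M, 3) offsets between midpoints
        dist3 = np.linalg.norm(r, axis=1) ** 3
        cross = np.cross(d_a[i], d_b)        # (M, 3) pairwise cross products
        # Midpoint-rule sum of (da x db) . r / |r|^3 over all segment pairs
        total += np.sum(np.einsum('ij,ij->i', cross, r) / dist3)
    return total / (4.0 * np.pi)

# Two circles forming a Hopf link: |linking integral| should be close to 1.
theta = np.linspace(0.0, 2.0 * np.pi, 201)
ring_a = np.stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)], axis=1)
ring_b = np.stack([1.0 + np.cos(theta), np.zeros_like(theta), np.sin(theta)], axis=1)
lk = gauss_linking_integral(ring_a, ring_b)
```

Coordinates of this kind change smoothly with posture and stay meaningful under small deformations of the geometry, which is what makes them attractive as a search space compared with raw generalized coordinates.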

Publications

 
Description - We have developed a representation to describe complex interactions between characters, or between a character and objects or the environment, which is applicable for animation production.
- We have developed a method to interactively control characters and robots in a dynamic environment.
- We have developed a representation to describe the spatial relations of smooth surfaces in the 3D world.
- We have developed a method to classify and extract similar spatial relations.
- We have proposed a quantitative measure of how much an object is "wrapped" by a 3D surface such as a bag or cloth.
- We have proposed a method to control cloth to guide movements such as wrapping and knotting.
- We have proposed a method to compute the dense correspondence of objects with different topology.
Exploitation Route - Motion retargeting of characters in computer animation and games.
- Robots to manipulate deformable objects such as cloth.
- Computing the corresponding tissues in volumetric MRI data.
- Searching a database of similar scenes.
Sectors Digital/Communication/Information Technologies (including Software); Healthcare; Manufacturing, including Industrial Biotechnology; Culture, Heritage, Museums and Collections

URL http://homepages.inf.ed.ac.uk/tkomura/research_dir/research.html
 
Description - One of the findings, the use of spatial relationships as a representation, was applied to animation synthesis for the movie "Space Pirate Captain Harlock" (2013) (see DigiPro 2013: Jun Saito, "Smooth Contact-Aware Facial Blendshapes Transfer").
- Another finding is currently being used to edit character movement in the film industry (in collaboration with Marza Animation Planet, www.marza.com) for the movie "Robodog", to be released in 2015.
First Year Of Impact 2011
Sector Digital/Communication/Information Technologies (including Software)
Impact Types Cultural

 
Description Marza Studentship
Amount £121,103 (GBP)
Funding ID RB0264 
Organisation Marza Animation Planet 
Sector Private
Country Japan
Start 12/2012 
End 05/2016
 
Description Royal Society Industry Fellowship
Amount £127,296 (GBP)
Funding ID IF130118 
Organisation The Royal Society 
Sector Academic/University
Country United Kingdom
Start 10/2014 
End 09/2018
 
Title Edinburgh Interaction Database 
Description A database of human interaction with objects, captured using a magnetic motion capture system. 
Type Of Material Database/Collection of data 
Provided To Others? No  
Impact We have made the data available to research groups interested in interactions. 
URL http://www.ipab.inf.ed.ac.uk/cgvu/InteractionDatabase/interactiondb.html
 
Description Collaboration with Advanced Institute of Science and Technology, Japan 
Organisation National Institute of Advanced Industrial Science and Technology
Country Japan 
Sector Public 
PI Contribution We provided the research team at AIST, Japan with the algorithm and prototype software for mapping human movements to characters and robots of arbitrary topology.
Collaborator Contribution Dr. Shinichiro Nakaoka from AIST joined our research group at the School of Informatics, University of Edinburgh, as a visiting researcher in 2011-12. He developed a toolkit to transfer human motion capture data to the humanoid robot HRP-4 in Japan. The system was brought back to Japan and is used for synthesizing motion primitives for the HRP-4.
Impact The work on transferring human motion to the humanoid robot HRP-4 was published in the following conference paper: Shinichiro Nakaoka, Taku Komura, "Interaction Mesh Based Motion Adaptation for Biped Humanoid Robots", IEEE Humanoids 2012. The system is also used at AIST to physically evaluate the workload of various human movements.
Start Year 2011
 
Description Collaboration with Disney Research 
Organisation Disney Research
Country United States 
Sector Private 
PI Contribution The PI was awarded a Royal Society Industry Fellowship to investigate techniques that make use of spatial relationships for digital acting. The project starts in October 2014, and Taku Komura will work half-time at Disney Research until September 2018.
Collaborator Contribution The Royal Society is providing half of the PI's salary between October 2014 and September 2018. Disney Research will provide the working environment for the PI at its Edinburgh office.
Impact This collaboration has only just started, so there is no output yet. We are looking into further grants and the development of IP through this collaboration.
Start Year 2014
 
Description Collaboration with Disney Research 
Organisation The Royal Society
Country United Kingdom 
Sector Academic/University 
PI Contribution The PI was awarded a Royal Society Industry Fellowship to investigate techniques that make use of spatial relationships for digital acting. The project starts in October 2014, and Taku Komura will work half-time at Disney Research until September 2018.
Collaborator Contribution The Royal Society is providing half of the PI's salary between October 2014 and September 2018. Disney Research will provide the working environment for the PI at its Edinburgh office.
Impact This collaboration has only just started, so there is no output yet. We are looking into further grants and the development of IP through this collaboration.
Start Year 2014
 
Description Collaboration with the Film Industry (Marza Animation Planet) 
Organisation Marza Animation Planet
Country Japan 
Sector Private 
PI Contribution We have provided ideas and algorithms for mapping facial movements from one character to multiple characters. This tool was deployed in the movie "Space Pirate Captain Harlock", released in 2013; the details of the technique are described in the paper "Smooth Contact-Aware Facial Blendshapes Transfer", DigiPro 2013. Marza Animation Planet has also provided research funding for us to further explore the idea of using spatial relationships for animation production. A new student, Daniel Holden, has started his PhD funded by this grant, and we are currently developing algorithms and techniques to be used in the production of the film "Robodog".
Collaborator Contribution Jun Saito of Marza Animation Planet is co-supervising the PhD student Daniel Holden. He has taught the student basic animation techniques and industry programming practice, and we hold weekly meetings to review the student's progress.
Impact The initial collaboration resulted in the following publication: Jun Saito, "Smooth Contact-Aware Facial Blendshapes Transfer", DigiPro 2013. The paper is not co-authored by the PI for commercial reasons, but the PI contributed a significant amount of the ideas and effort behind its core (as can be seen in the references). This collaboration also led to the new grant starting in 2012.
Start Year 2012