The application of motion-capture technology in telematic and virtual dance performance through a framework for long-distance remote communication

Lead Research Organisation: Goldsmiths, University of London
Department Name: Media and Communications

Abstract

This project involves a unique use case: an experimental exploration of the affordances of new marker-less motion-capture systems such as Perception Neuron and PoseNet in the remote creation, rehearsal, teaching, and performance of dance work. Aspects of this are:

Affordable and accessible motion-capture technology as a creative tool for choreography
Real-time telematic dance communication
Remote collaborative devising of dance choreography
Online and interactive performance
Remote rehearsal of performance
Virtual choreography in virtual spaces
Choreography with intelligent machines (machine learning)
Remote teaching of dance choreography in three dimensions
Abstracted generative aesthetic visualisation of dance movement for performance

We will deliver:

A framework for long-distance remote communication of movement data for dancers (and audio-visual creatives) to practise, rehearse and perform remotely but together. This framework includes:
A code template starter-kit for creatives to design a) custom movement visualisations and b) appearances and experiences for virtual performance (a sketch of such a visualisation client follows this list)
A controllable interface for online audiences to interact with performances through a digital device
A database of movements and gestures that can be used to train movement models for machine-learning-based interaction between dancers and intelligent machines
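
As an illustration of the starter-kit deliverable, the sketch below shows the general shape of a custom movement visualisation client in TypeScript, assuming skeleton frames arrive as JSON over a WebSocket; the frame format, relay URL, joint names, and the drawFrame function are illustrative assumptions rather than the starter-kit's actual API.

```typescript
// Minimal sketch of a custom movement visualisation, assuming the framework
// delivers skeleton frames as JSON over a WebSocket in the browser. The frame
// shape, endpoint URL, and joint names are illustrative, not the starter-kit's API.

interface Joint {
  name: string;                               // e.g. "hips", "leftHand"
  position: [number, number, number];         // metres, in the shared virtual space
  rotation: [number, number, number, number]; // quaternion (x, y, z, w)
}

interface SkeletonFrame {
  performerId: string; // which remote dancer this frame belongs to
  timestamp: number;   // capture time in milliseconds
  joints: Joint[];
}

// Hypothetical relay endpoint; a real deployment would substitute its own server.
const socket = new WebSocket("wss://relay.example.org/stream");

socket.onmessage = (event: MessageEvent<string>) => {
  const frame: SkeletonFrame = JSON.parse(event.data);
  drawFrame(frame);
};

// A custom visualisation maps joints to whatever aesthetic the creative chooses:
// points, trails, particle systems, or full avatars.
function drawFrame(frame: SkeletonFrame): void {
  for (const joint of frame.joints) {
    const [x, y, z] = joint.position;
    console.log(`${frame.performerId}/${joint.name}: ${x.toFixed(2)} ${y.toFixed(2)} ${z.toFixed(2)}`);
  }
}
```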

This project addresses an urgent need for usable technology that allows dance companies to move beyond the limitations of video-conferencing platforms. As these companies pivot from touring and the production of new work to other activities, we will make a strong case for investing time and resources in research into potential futures of choreographic practice that can be shared digitally worldwide.

 
Description In this AHRC-funded 'Covid-19' project (2021) we successfully undertook the development, testing, and implementation of an open-source software tool for the real-time streaming of motion capture data. This streaming tool has allowed us to bring dancers from different geographic locations into a shared virtual space to dance together with a convincing sense of physical copresence, direct interaction, and virtual touch. We call it Goldsmiths Mocap Streamer, and in 2021 we had the opportunity to demonstrate the framework and share our findings with an international audience through a series of livestreams, workshops, and festival events.
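
At its core, a tool of this kind follows a simple relay pattern: each performer's capture system sends skeleton frames to a cloud server, which rebroadcasts them to every other participant in the session. The sketch below illustrates that pattern as a minimal Node.js/TypeScript WebSocket relay using the 'ws' package; it is an assumption about the general architecture rather than the published Goldsmiths Mocap Streamer code.

```typescript
// Minimal sketch of the relay pattern described above: a cloud server that
// accepts motion-capture frames from any connected performer and rebroadcasts
// them to every other client in the shared session.

import { WebSocketServer, WebSocket } from "ws"; // npm package "ws"

const PORT = 8080; // illustrative port
const wss = new WebSocketServer({ port: PORT });

wss.on("connection", (client: WebSocket) => {
  client.on("message", (frame) => {
    // Fan each frame out to all other participants; skeleton frames are small,
    // so the priority is low latency rather than bandwidth.
    for (const peer of wss.clients) {
      if (peer !== client && peer.readyState === WebSocket.OPEN) {
        peer.send(frame);
      }
    }
  });
});

console.log(`Mocap relay listening on ws://0.0.0.0:${PORT}`);
```

A relay of this kind can run on any cloud host; for the sense of copresence and virtual touch described above, the design priority is low end-to-end latency, since motion data is small but highly time-sensitive.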

In our own performance showcases in May 2021 and November 2021 (for the UK Being Human Festival) we evidenced not only that a sense of liveness could be achieved in the virtual space, but also that this could be done with a set of generative and aesthetic visual effects that render the dance in radically new ways: by turns abstract, narrative, and illusionistic. In June 2021 at VRHAM! (Hamburg) and in October 2021 for the BFI London Film Festival, we also brought multiple, remotely connected, and fully embodied dancers into a Virtual Reality (VR) performance environment in real time, in partnership with INVR Space GmbH (INVR) and Alexander Whitley Dance Company (AWDC).

MAY 2021 LIVESTREAM https://www.youtube.com/watch?v=2RXzz_p6LvI&t=407s
NOV 2021 LIVESTREAM https://youtu.be/aNg-gqZNYR0
BFI LFF LIVESTREAM https://youtu.be/VKvLQqJ0Po8

Goldsmiths Mocap Streamer was initially developed to address the isolation and physical distancing of the Covid-19 restrictions: first by connecting dancers remotely through a framework for motion-capture data streaming, and then by seeing what kinds of emotional, aesthetic, and affective connections could be made within virtual spaces. These concerns were framed and led by our collaborating artists, and honed and developed through a series of practice-led research workshops with artists both local to Goldsmiths in London and as far away as Singapore (LASALLE), New York (Gowanus Loft, Brooklyn), and Hong Kong (HKAPA).

However, we knew that beyond the immediate concerns of the pandemic this tool had a much broader applicability, possibly offering itself as a critical turning point in a 50-year history of the aesthetic and technological evolution of telematic performance practice. Now effectively framed by the publicity surrounding Facebook's conceptualisation of a 'Metaverse' interactive VR space, we see our research as a step towards the modes of fully embodied interaction that can and will take place within this kind of environment. In many ways it seems inevitable that dancers and performers will work in this way in the future, and we want to develop and position our research as a significant part of working towards that reality.

We have also published two articles, with a third article and two book chapters forthcoming.
Exploitation Route While we have successfully applied for follow-on funding (FoF) for impact and engagement, we also maintain an open-source GitHub repository where we freely share our software. This repository, together with a set of asset starter kits that we have created, will come strongly into play as we disseminate our research through a global network of dance practitioners within the scope of our next FoF project.

We are working closely with our new partner Noitom (maker of the Perception Neuron motion capture system) towards full integration of our tool with their system, for applications in Android VR and in the Unreal Engine games engine.

Our dance partner Alexander Whitley Dance Company has taken up our mocap streamer tool for teaching and creation with a dance school in Hong Kong, and we are currently developing a new collaborative VR performance, 'Future Rites', with virtual venue creators INVR Space (Berlin). We hope to showcase this performance later in 2022, and another FoF proposal for this project is currently submitted.

Throughout this project we have raised global awareness of the kind of high-quality and high-integrity work that can be made with our framework for motion capture data streaming, and through our Q&A activities, practical workshops (at the Being Human Festival 2021), and authored articles we have critically analysed both the strengths and weaknesses of working with such a system. These findings, shared in this way, will be of great use to others moving into this field of remote digital 'telematic' performance.
Sectors Digital/Communication/Information Technologies (including Software); Leisure Activities, including Sports, Recreation and Tourism; Culture, Heritage, Museums and Collections

URL http://mocapstreamer.live
 
Description Through several panel discussions, workshops, engagement events, and showcases both nationally and internationally, we have provided an extremely strong use case for motion capture in the creation of remote telematic dance work. In 2022 this evidence was used in a global open call for artist residency projects for our current follow-on funding project, which received 160 applications from all over the world. This shows that we have created strong global awareness of the potential of the framework for dance communication that we have built. PI Dan Strutt was invited and funded by the organisation British Underground to attend the South by Southwest (SXSW) Festival in Austin, Texas in March 2022, to speak on a panel about the future of dance in virtual environments. This serves as evidence of the impact of the research in the public realm, and also testifies to the international positioning of British research. More anecdotally, we are contacted daily about new projects and collaborations around the mocap streamer, to such an extent that we cannot respond to all enquiries. This shows that we have successfully positioned ourselves within a burgeoning area of creative practice and research, and we hope to sustain our work in the field for many years to come.
First Year Of Impact 2021
Sector Creative Economy; Digital/Communication/Information Technologies (including Software); Culture, Heritage, Museums and Collections
Impact Types Cultural

 
Description British Academy Innovation Fellowships Scheme 2022-23 (Route A: Researcher-led)
Amount £110,507 (GBP)
Funding ID IF2223\230051 
Organisation The British Academy 
Sector Academic/University
Country United Kingdom
Start 03/2023 
End 02/2024
 
Description Building an international network for virtual dance collaboration: Deploying Goldsmiths MoCap Streamer tool for inclusive and sustainable development
Amount £120,791 (GBP)
Funding ID AH/W006863/1 
Organisation Arts & Humanities Research Council (AHRC) 
Sector Public
Country United Kingdom
Start 11/2021 
End 01/2023
 
Title Goldsmiths Mocap Streamer 
Description This software streams motion capture data omnidirectionally via a cloud server to any location. It is currently compatible only with the Perception Neuron motion capture system (a sketch of the sender-side forwarding pattern follows this record). 
Type Of Technology Software 
Year Produced 2021 
Open Source License? Yes  
Impact This tool has been used to create and share dance work internationally, and is currently also being used by our project partners in their own practice. Soon it will be rolled out globally through our FoF project 'Building an international network for virtual dance collaboration: Deploying Goldsmiths MoCap Streamer tool for inclusive and sustainable development' 
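
As context for how a Perception Neuron rig might feed such a cloud relay, the sketch below shows a sender-side forwarder in TypeScript, assuming Axis Neuron (Noitom's capture software) is broadcasting BVH frames over a local TCP port; the port number, relay URL, and framing are illustrative assumptions rather than the tool's actual configuration.

```typescript
// Minimal Node.js sketch of a sender-side forwarder, assuming Axis Neuron is
// broadcasting BVH frames over TCP on the local machine. Port, relay URL, and
// framing are illustrative assumptions, not the tool's actual configuration.

import * as net from "net";
import WebSocket from "ws"; // npm package "ws"

const AXIS_NEURON_PORT = 7002;                  // illustrative; the broadcast port is configurable in Axis Neuron
const RELAY_URL = "wss://relay.example.org/up"; // hypothetical cloud relay endpoint

const relay = new WebSocket(RELAY_URL);

relay.on("open", () => {
  // Connect to the local BVH broadcast once the uplink to the relay is ready.
  const local = net.connect({ host: "127.0.0.1", port: AXIS_NEURON_PORT });

  // Forward each chunk of motion data upstream; the relay then fans it out
  // to every connected visualisation or VR client, wherever they are.
  local.on("data", (chunk: Buffer) => {
    if (relay.readyState === WebSocket.OPEN) {
      relay.send(chunk);
    }
  });

  local.on("error", (err) => console.error("Local mocap connection failed:", err.message));
});
```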
 
Description BFI London Film Festival 2021 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact For the BFI London Film Festival 'Expanded' programme we developed a work in VR that could be joined from anywhere in the world. This was in partnership with Alexander Whitley Dance Company, and was followed by a discussion with INVR Space, the makers of the virtual 'Expanse' venue, and with the choreographer Alexander Whitley.
Year(s) Of Engagement Activity 2021
 
Description May 2021 Performance Showcase 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact This was a one-hour performance showcase designed to demonstrate the functionality of our mocap streamer tool. With one dancer in the room, we brought a further two dancers into a virtual space, so that we could see three avatar bodies interacting in real time. The audience was fully online. This was our first showcase, presented with our project partner Mavin Khoo, and was followed by a lively Q&A.
Year(s) Of Engagement Activity 2021
 
Description November 2021 Performance Showcase for Being Human Festival 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact In November 2021, for Being Human Festival and presented from Goldsmiths, we offered a one-hour showcase performance to show the final results of our research process. Here we had two dancers with us at Goldsmiths, and one dancer connected remotely from New York. This performance was simultaneously livestreamed in the USA through Midheaven Network.
Year(s) Of Engagement Activity 2021
 
Description Pandemic and Beyond Podcasts 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Daniel Strutt and other team members took part in two podcast discussions with the Pandemic and Beyond team at the University of Exeter. https://pandemicandbeyond.exeter.ac.uk/media/podcasts/
Year(s) Of Engagement Activity 2021
 
Description SXSW 2022 - Panel Mocap Streaming and remote dance collaboration in VR 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact This was a panel discussion at the South by Southwest (SXSW) Festival 2022, addressing the technical, creative, and conceptual issues around distributed performance experiences in VR. The conversation focused on the VR Mocap Streamer project, a partnership between Alexander Whitley Dance Company and the Department of Media and Communications at Goldsmiths, University of London, which attempts to bring live, fully embodied interactive performance from remote locations into a multi-user VR experience based on the seminal ballet The Rite of Spring.

A lively discussion followed. This has directly led to the inclusion of one of our project outputs, 'Figural Bodies', in the SXSW 2023 XR Experience Spotlight programme (10th to 14th March). We also have a panel discussion, 'Dance in the Metaverse: Inclusion and Accessibility'. These activities are supported through the Future Art and Culture programme, produced by British Underground at SXSW and funded by Arts Council England, with additional support in 2023 from the British Council.
Year(s) Of Engagement Activity 2022
URL https://schedule.sxsw.com/2022/events/PP116279
 
Description VRHAM! 2021 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact At this Virtual Reality arts festival, which took place both in person (in Hamburg) and in VR, we presented a VR dance work with Alexander Whitley Dance Company. I also participated in a panel discussion, held in VR, about virtual reality performance.
Year(s) Of Engagement Activity 2021