vRSP: virtual (Re)Sounding of Place

Lead Research Organisation: University of Surrey
Department Name: Music


The vRSP project explores new, responsive, immersive, and interactive methods of experiencing a performance, as well as allowing users to explore the unique heritage places for which these performances have been created. The project brings significant expertise and assets to this area, in terms of both the production and consumption of performance. It builds on the team's expertise in site-specific performance, 3D audio recording and (re)construction, performance capture, live imagery, video game development tools, and virtual/augmented/mixed reality.

The core project team includes Michael Price, one of the UK's most sought-after composers, known for Emmy-winning compositions for BBC's Sherlock and Unforgotten as well as work on films including Peter Jackson's The Lord of the Rings trilogy, Richard Curtis' Love Actually, Bridget Jones: The Edge of Reason and Alfonso Cuarón's Children of Men. Alongside his film/TV work, Price records with the label Erased Tapes Records. vRSP will allow Price to work together with University of Surrey academics Kirk Woolford and Tony Myatt. Woolford is an expert in interactive media content creation, with more than 150 interactive works created, 25 years' experience developing immersive performances, 20 years developing VR performances, and 7 years' work with AR. Myatt, head of Music and Media at the University of Surrey and founding director of the University of York's Music Research Centre, is one of the foremost authorities on spatial audio. All three core partners are active creative practitioners and perform their works worldwide.

vRSP will allow the partners to address issues around the development of immersive audio together with BAFTA, Surrey Engineering's Centre for Vision, Speech and Signal Processing, the 5G Innovation Centre (including the 5G Gaming Initiative and DCMS hub), and Guildford-based games and VR development companies (nDreams, Supermassive, EA/Criterion). The network will be led by challenges emerging from creative productions, and its findings disseminated through well-developed networks.

vRSP will develop outputs that are exemplars of a wholly new genre of art work, designed for virtual space but based on notable and rare historic places, for which music will be written to integrate musical and social memory and to exploit unique architectural and acoustic environments. Live performances of newly composed works will be captured and disseminated to viewers and listeners as VR content through the application of a series of novel production techniques. These will address specific difficulties presented by current technologies, as described below, by applying interdisciplinary knowledge that combines research findings from human perception, neuroscience, musical composition, animation, visual surface-mapping techniques and visual signal processing, stereoscopic 360-degree video, and three-dimensional audio recording and reproduction methods. The network will draw together experienced practitioners from all fields to inform the creation of a new work that will portray the intrinsic qualities of the historic performance space through site-specific live performance and a combination of contemporary video- and animation-based techniques for VR generation.

Planned Impact

vRSP will generate impact through the development of new performance methodologies, best practice and tools for creating compelling digital actor performances in 360-degree VR/AR environments. VR is a major opportunity for the UK's world-leading creative industry - worth £84.1 billion and employing 2.8m people - with investment of $4bn in immersive displays and a predicted global market size of $150bn for AR/VR content by 2020. vRSP collaborators, including creative leaders in music, film, broadcast, video games and digital arts, are ideally placed to exploit research advances for commercial, economic and cultural impact. Techniques for developing immersive 3D audio will have widespread impact beyond VR/AR, for video games, movie production, education, training and simulation.

Commercial exploitation of vRSP's groundbreaking outputs will be led by stakeholder partners in music content creation and performance for VR, games, film and broadcast. vRSP partners will exploit outputs in TV, movie and video game production, and in research projects with external partners in the creative industries and beyond. This capability will be directly exploited by partners involved in content production (Coney, Figment, Imaginarium, Framestore, DoubleNegative, SuperMassive, BBC) and creative tools (The Foundry, Imagineer, IKinema, DoubleMe). Economic impact will be achieved both through additional production capabilities for new marketplaces and by incorporating the novel tools arising from vRSP into new products (such as The Foundry's CARA VR for Nuke, and IKinema LiveAction). Engagement with acting organisations in co-production (Guildford School of Acting (GSA)) will ensure that outputs are developed around end-user requirements, increasing uptake, helping shape the skillsets of future actors and directors, and informing new curricula.

One of the core societal impacts of vRSP will be to initiate a rethinking of our means of conducting research between Academia and the Creative Economy. Rather than focusing solely on developing new technologies, vRSP aims to demonstrate that valuable insights can be gathered through close integration of Arts and Science.


Title The Golden Line immersive performance 
Description Interactive VR performance with volumetric audio and video. Tony Myatt (concept, sound recording and audio programming); Michael Price (concept, art direction, music composition); Kirk Woolford (concept, visual direction and visual programming); Sam Zlajka (concept, 360 video and audio recording); Jon Smart (Iskra Strings, 1st violin); James Underwood (Iskra Strings, 2nd violin); Laurie Anderson (Iskra Strings, viola); Charlotte Eksteen (Iskra Strings, cello); Peter Gregson (solo cello); Heloise Werner (solo soprano)
Type Of Art Artefact (including digital) 
Year Produced 2018 
Impact The creation of this artefact explored existing tools and working methods for the creation of volumetric video and the use of ambisonic audio in immersive productions.
URL http://resounding.place/#!/Credits
Title The Golden Line video 
Description 360 stereo video with ambisonic audio. Tony Myatt (concept, sound recording and audio programming); Michael Price (concept, art direction, music composition); Kirk Woolford (concept, visual direction and visual programming); Sam Zlajka (concept, 360 video and audio recording); Jon Smart (Iskra Strings, 1st violin); James Underwood (Iskra Strings, 2nd violin); Laurie Anderson (Iskra Strings, viola); Charlotte Eksteen (Iskra Strings, cello); Peter Gregson (solo cello); Heloise Werner (solo soprano)
Type Of Art Film/Video/Animation 
Year Produced 2018 
Impact The creation of the video tested playback and dissemination forms for ambisonic audio 
URL http://resounding.place
Description The funded period began with the creation of a higher-specification recording of a performance than would normally be possible under typical budgetary, equipment and time constraints. Audio recording equipment from The Control Room and the University of Surrey's Institute of Sound Recording was combined to create three spatial recordings and a 24-channel mix from the central listening location. The project team made multiple video and interactive outputs testing development and distribution tools for ambisonic (spatial) audio playback.
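The spatial recordings and ambisonic outputs described above rest on B-format representation of sound directions. As a minimal, hedged sketch of the underlying maths (not the project's actual pipeline, which captured these components with microphone arrays; the function name and FuMa-style normalisation are illustrative assumptions):

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into first-order B-format (W, X, Y, Z).

    Illustrative only: assumes FuMa-style W weighting (1/sqrt(2));
    real spatial recordings capture these components directly.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2.0)               # omnidirectional component
    x = sample * math.cos(az) * math.cos(el)  # front-back figure-of-eight
    y = sample * math.sin(az) * math.cos(el)  # left-right figure-of-eight
    z = sample * math.sin(el)                 # up-down figure-of-eight
    return w, x, y, z
```

A source directly ahead (azimuth 0, elevation 0) encodes entirely into W and X, which is what makes later rotation and loudspeaker decoding of the sound field straightforward.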

The next strand of the project focused on recreation of the performance space. This included the creation of a 3D point cloud scan of St Giles Cripplegate Church, where the performance was recorded, as well as 3D stereo video recordings of the musicians and a 3D volumetric recording of the singer. All these elements were brought together with a game engine (Unity3D) to create an interactive performance.

Finally, tools developed by the Institut de Recherche et de Coordination Acoustique/Musique (IRCAM) were combined with the game engine to position the audio recordings in the captured space and allow the audience to freely explore the model, and hear the performance from different locations within the church.
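Free exploration of this kind depends on rotating the recorded sound field with the listener's orientation and attenuating sources with distance. A hedged first-order sketch of the idea (IRCAM's actual tools apply full 3D rotation at higher ambisonic orders; the function names and the inverse-distance model are illustrative assumptions):

```python
import math

def rotate_bformat_yaw(w, x, y, z, yaw_deg):
    """Rotate a first-order B-format frame for listener yaw, so sources
    stay fixed in the virtual space as the listener turns. W and Z are
    invariant under rotation about the vertical axis."""
    psi = math.radians(yaw_deg)
    xr = x * math.cos(psi) + y * math.sin(psi)
    yr = y * math.cos(psi) - x * math.sin(psi)
    return w, xr, yr, z

def distance_gain(listener_pos, source_pos, ref_dist=1.0):
    """Simple inverse-distance gain: a performer twice as far away is
    heard at half the amplitude (clamped at the reference distance)."""
    d = math.dist(listener_pos, source_pos)
    return ref_dist / max(d, ref_dist)
```

Turning the listener 90 degrees toward a source on their left brings that source to the front of the rotated field, which is the behaviour an interactive walk-through of the captured church requires.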

The most significant achievements from the award include:

- The development of a new method of recording and replaying volumetric audio
- Formally testing existing tools for playback and dissemination of ambisonic audio recordings
- Bringing together researchers and leading industry practitioners

All key objectives were met. The project team have successfully secured follow-on funding from the AHRC to continue the volumetric audio and video development with new partners, and to host a public presentation at one of the most respected audio recording studios in the world (Abbey Road) in order to reach industry colleagues interested in taking the findings forward.
Exploitation Route The project demonstrates a novel method of working with spatial audio in an immersive environment. It may lead to new approaches to spatial audio recording and recreation.
Sectors Creative Economy; Digital/Communication/Information Technologies (including Software); Culture, Heritage, Museums and Collections

URL http://resounding.place
Description The industry partners, Michael Price and The Control Room, have increased their knowledge and experience in recording spatial audio and 360 video. This will become embedded in future recording projects. The project has been awarded follow-on funding to increase future non-academic impact.
First Year Of Impact 2019
Sector Creative Economy; Digital/Communication/Information Technologies (including Software); Leisure Activities, including Sports, Recreation and Tourism; Culture, Heritage, Museums and Collections
Impact Types Cultural, Economic

Description vRSP Follow-on
Amount £32,168 (GBP)
Funding ID AH/S010610/1 
Organisation Arts & Humanities Research Council (AHRC) 
Sector Public
Country United Kingdom
Start 01/2019 
End 05/2019
Description Woolford, K., Myatt, T., Price, M., (2018), virtually (Re)Sounding Place (vRSP), AHRC Next Generation Immersive Experiences Showcase, University of York 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact AHRC Immersive Experiences Showcase - York, Dec 2018

Projects funded by the AHRC and EPSRC to explore the future of immersive experiences were showcased at an event in York in December 2018, hosted by Creative Media Labs and University of York, Department of Theatre, Film and Television.
Year(s) Of Engagement Activity 2018
URL https://ceprogramme.com/immersive-experiences/
Description Woolford, K., Myatt, T., Price, M., (2018), virtually (Re)Sounding Place (vRSP), Audio Day: UK Acoustics Network (UKAN), Audio Engineering Society (AES), Engineering and Physical Sciences Research Council (EPSRC), Centre for Vision Speech and Signal Processing (CVSSP), University of Surrey
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Demonstration presented at the UK Audio Day co-hosted by the UK Acoustics Network (UKAN), the Audio Engineering Society (AES), the Engineering and Physical Sciences Research Council (EPSRC), and the Centre for Vision Speech and Signal Processing (CVSSP), University of Surrey

This Audio Day brings together researchers and collaborators engaged in audio-related research projects linked to the University of Surrey, including projects on musical audio repurposing using source separation; spatial audio in the home; making sense of sounds; and creative commons audio. Many of these are collaborative projects, including members of the Centre for Vision, Speech and Signal Processing (CVSSP), Institute of Sound Recording (IoSR), Digital World Research Centre (DWRC) and Centre for Digital Economy (CoDE) at Surrey, plus many other partners including the University of Salford, the University of Southampton, BBC R&D, and Audio Analytic.
Year(s) Of Engagement Activity 2018
URL https://cvssp.org/events/audio_day_2018/