vRSP Follow-on
Lead Research Organisation:
University of Surrey
Department Name: Media
Abstract
The existing vRSP project explores new, responsive, immersive, and interactive methods of experiencing a recorded performance, while allowing users to explore the unique heritage places for which these performances were created. The existing vRSP network has created a prototype demonstrator of a new form of immersive site-specific music performance. In the existing prototype, a solo singer moves through the space; currently, the audience can only rotate in place to watch and listen to this singer. If audience members move too close to the singer, the illusion breaks down because of the quality and resolution of the capture. This follow-on bid proposes to work with a new industry partner, Dimension Studios, to polish the demonstrator to the highest possible standard, allowing audiences to physically explore a virtual space and walk around inside a live music performance, giving them a more direct experience than is currently possible.
One of the most experimental aspects of the existing vRSP project has been the integration of 360 video, mapped environment capture, performance, sound and volumetric multi-dimensional (aka holographic) video. The latter aspect has only recently become a viable 'audience proposition', as the recreation of a realistic yet synthesised human performer needs to adequately bridge the 'uncanny valley' to create a believable and acceptable experience for the viewer/listener/interactor. Indeed, one of the few examples of this integration so far is the Hold the World project for Sky/Natural History Museum, featuring a virtually recreated David Attenborough demonstrating interactive animated exhibits in person. Successful volumetric video capture could only be achieved in that project with the full power of a large consortium of professional production partners, plus the full technological might of Microsoft's volumetric capture studios in Redmond and their UK incarnation at Dimension Studios in Wimbledon. Carrying out this process to a general, audience-ready standard is time-consuming, complicated and resource-intensive. For the initial phase of vRSP, we needed to experiment with the mix of immersive media and see whether the overall impact was as hoped, particularly in terms of the harmonisation of imagery (and sound). We therefore adopted a 'proof of concept' approach to 4D volumetric capture, using resources within the University of Surrey's CVSSP to create a low-resolution proxy of the final result. Having done this, we saw that the overall effect was absolutely worth pursuing, but only with the higher, 'professional' resolution version of the capture, which was at that point beyond our means and scope. The partners have agreed that, to make the work ready for a mass audience, we need to recreate the initial capture using the kind of technological resources that were available to the Hold the World project. These are fortunately relatively close at hand within Dimension Studios, whom we have now signed up as a project contributor. The network can now benefit enormously from their expertise, and with their input we can ensure that the outputs are a worthy match for Price's musical tapestries and a high-production-value audience experience.
Planned Impact
VR is a major opportunity for the UK's world-leading creative industries (already worth £84.1 billion and employing 2.8M people), with investment of $4bn in immersive displays and a predicted global market size of $150bn for AR/VR content by 2020. vRSP collaborators, including creative leaders in Film, Broadcast, Video Games and Music, are ideally placed to exploit research advances for commercial, economic and cultural impact. Technologies for the creation of engaging digital actors with autonomous interactive behaviour will have widespread impact beyond VR/AR, across video games, film production, education, training and simulation. To realise impact on the creative industries, vRSP has established a strong stakeholder partner group with leading expertise spanning live performance and 3D audio through to VR production and creative tools.
As an intrinsically collaborative, co-created research network, vRSP anticipates developing a series of outputs which - individually and as a whole - will generate significant wider impact. Our impact strategy has initially identified independent impact opportunities spanning the societal, commercial and academic sectors, with our project partner helping to maximise these opportunities. The strategy will be re-assessed at the beginning and end of the project to ensure it meets both our research objectives and partner/wider industry need. Impacts and successes will be reported to the AHRC and vRSP partners as they develop. Our strategy will focus on 1) how best to identify and realise accessible benefit for vRSP partners from the creative tools and techniques introduced, 2) implementation of experimental co-production of 3 prototypes with vRSP partners to demonstrate first-hand the step change in capability for VR/AR and immersive audio content, and 3) dissemination of these prototypes through the partners' academic and industry networks to gather feedback and identify opportunities for future research. These core goals will be facilitated through a hands-on workshop with stakeholders, presentations to the AHRC and at the culminating G3 Futures games conference. The prototypes will also be submitted to major film/VR festivals (Cannes Marché du Film NEXT, Sundance NEXT, Tribeca Film Festival, Ars Electronica) and creative industry forums (SIGGRAPH, FMX, IBC). These presentations, as well as network discussions, will be used to gather evaluation feedback for inclusion in final publications, which will be submitted to music, digital arts, and VR publications.
The inclusion of the new partner, Dimension Studios, will extend the existing vRSP network, currently focused on audio research, into the visual domain, and create new links into the Digital Catapult network. This will greatly extend the UK reach and visibility of the project. The new volumetric video assets will be integrated into the existing prototype. This will require Dimension Studios' engineers to work directly with the vRSP researchers in order to exchange image and sound experience and techniques. This will benefit Dimension Studios as much as the vRSP network.
The new version of the experience will be presented to industry partners and invited guests at Abbey Road Studios in London. This is a professional setting, and one of the most highly respected recording studios in the world. Presenting the work at Abbey Road will have a much greater impact than presenting it on a university campus, and is more likely to reach future partners and distributors.
One of the core societal impacts of vRSP will be to initiate a rethinking of our means of conducting research between Academia and the Creative Economy. Rather than focusing solely on developing new technologies, vRSP aims to demonstrate that valuable insights can be gathered through close integration of Arts and Science.
Title | The Golden Line immersive performance |
Description | Interactive VR performance with volumetric audio and video. Credits: Tony Myatt (Concept, sound recording and audio programming); Michael Price (Concept, art direction, music composition); Kirk Woolford (Concept, visual direction and visual programming); Sam Zlajka (Concept, 360 video and audio recording); Jon Smart (Iskra Strings, 1st Violin); James Underwood (Iskra Strings, 2nd Violin); Laurie Anderson (Iskra Strings, Viola); Charlotte Eksteen (Iskra Strings, Cello); Peter Gregson (Solo Cello); Heloise Werner (Solo Soprano) |
Type Of Art | Artefact (including digital) |
Year Produced | 2019 |
Impact | The creation of this artefact explored existing tools and working methods for the creation of volumetric video and the use of ambisonic audio in immersive productions |
URL | http://resounding.place/#!/Credits |
Title | The Golden Line video |
Description | 360 stereo video with ambisonic audio. Credits: Tony Myatt (Concept, sound recording and audio programming); Michael Price (Concept, art direction, music composition); Kirk Woolford (Concept, visual direction and visual programming); Sam Zlajka (Concept, 360 video and audio recording); Jon Smart (Iskra Strings, 1st Violin); James Underwood (Iskra Strings, 2nd Violin); Laurie Anderson (Iskra Strings, Viola); Charlotte Eksteen (Iskra Strings, Cello); Peter Gregson (Solo Cello); Heloise Werner (Solo Soprano) |
Type Of Art | Film/Video/Animation |
Year Produced | 2019 |
Impact | The creation of the video tested playback and dissemination formats for ambisonic audio; an illustrative decoding sketch follows this record |
URL | http://resounding.place |
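For readers unfamiliar with how ambisonic audio is played back in a 360 video, the following is a minimal, purely illustrative sketch rather than the project's actual pipeline: it yaw-rotates a first-order FuMa B-format sound field so the audio stays world-fixed as the viewer turns their head, then decodes it to stereo with two virtual cardioid microphones. The function name, the FuMa convention, the rotation sign, and the ±30 degree decode angles are all assumptions made for illustration.

```python
import numpy as np

def rotate_and_decode_foa(w, x, y, z, yaw_rad):
    """Illustrative first-order ambisonic playback sketch (assumed FuMa B-format).

    w, x, y, z : 1-D sample arrays (the four B-format channels)
    yaw_rad    : viewer head yaw in radians
    Returns (left, right) stereo sample arrays.
    """
    # Counter-rotate the horizontal components so the sound field stays
    # world-fixed as the head turns; the Z (height) channel is ignored in
    # this simple horizontal-only stereo decode.
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_r = c * x + s * y
    y_r = -s * x + c * y

    def virtual_cardioid(azimuth_rad):
        # Cardioid response aimed at `azimuth_rad` in the horizontal plane.
        return 0.5 * (np.sqrt(2.0) * w
                      + x_r * np.cos(azimuth_rad)
                      + y_r * np.sin(azimuth_rad))

    left = virtual_cardioid(np.radians(30.0))
    right = virtual_cardioid(np.radians(-30.0))
    return left, right
```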
Title | The Golden Line volumetric performance (2019) |
Description | The Golden Line Volumetric performance builds on the original Golden Line performance with the development of a volumetric audio engine, capable of mixing audio from multiple ambisonic sound fields. The full performance allows the audience to move around the volumetric scan of St Giles' Cathedral, and listen to the performance from different locations. |
Type Of Art | Artefact (including digital) |
Year Produced | 2019 |
Impact | The development of this project brought together multiple academic and industry partners (Dimension Studios and Control Room) for the first time. |
URL | https://b.bhaptic.net/vrsp/ |
Description | The research team successfully captured volumetric video at Dimension Studios and integrated it into the prototype. A new volumetric audio system has been created, capable of live mixing from multiple ambisonic sound fields as the listener moves; a minimal illustrative sketch of such position-based mixing follows this record. |
Exploitation Route | The project demonstrates a distinctive way of using multiple sound fields to reproduce sound in a virtual environment, which can be used to generate greater immersion in a virtual experience. |
Sectors | Creative Economy; Digital/Communication/Information Technologies (including Software); Culture, Heritage, Museums and Collections |
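As a rough illustration of the approach described above (live mixing of several ambisonic sound fields as the listener moves around the virtual space), the sketch below blends first-order B-format recordings captured at known positions using normalised inverse-distance weights. This is a hypothetical example rather than the project's actual engine; the function name, array shapes, and weighting scheme are assumptions.

```python
import numpy as np

def blend_ambisonic_fields(fields, field_positions, listener_position, eps=1e-3):
    """Illustrative position-based blend of several first-order ambisonic
    (B-format) recordings.

    fields            : (n_fields, 4, n_samples) array of W, X, Y, Z channels
    field_positions   : (n_fields, 3) capture positions in metres
    listener_position : (3,) current listener position
    Returns a single (4, n_samples) B-format stream for further decoding.
    """
    fields = np.asarray(fields, dtype=float)
    positions = np.asarray(field_positions, dtype=float)

    # Closer sound fields contribute more; eps avoids division by zero when
    # the listener stands exactly on a capture point.
    distances = np.linalg.norm(positions - np.asarray(listener_position), axis=1)
    weights = 1.0 / (distances + eps)
    weights /= weights.sum()  # normalise so overall loudness stays stable

    # Weighted sum over fields, broadcast across channels and samples.
    return np.einsum('f,fcs->cs', weights, fields)

# Hypothetical usage: three sound fields around a virtual nave, listener
# standing close to the first capture point.
# mixed = blend_ambisonic_fields(fields, [[0, 0, 0], [5, 0, 0], [0, 5, 0]], [1, 0, 0])
```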
Description | This award led to the creation of new volumetric video assets with Dimension Studios (a division of Hammerhead Interactive) in spring 2019. A new immersive performance was created for both desktop VR and the Oculus (now Meta) Quest, with live performances planned for early 2020. Unfortunately, the PI was made redundant in June 2019, and the Covid-19 lockdowns in 2020 made it impossible to present the work to a live audience. The work has since been presented to new audiences through the "Art//Tech//Play" collaboration in partnership with Anglia Ruskin University, Norwich University of the Arts, Collusion Cambridge, Cambridge Junction, Dance East, and Norwich Arts Centre. |
First Year Of Impact | 2019 |
Sector | Creative Economy; Culture, Heritage, Museums and Collections |
Impact Types | Cultural |
Description | Art//Tech//Play Open Day, Anglia Ruskin |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | Open presentation to arts practitioners and curious audiences as part of the Arts Council England-funded Art//Tech//Play event. The Art//Tech//Play programme supports artists and creatives in building confidence and knowledge of creative technologies in a practice-led way, leading to the development of new creative practice, ideas, and networks; artists in any artform, at any stage of their career or level of experience with tech, can get involved through online and in-person events, introductory workshops, and artist-led videos on how technology is used in their practice. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.collusion.org.uk/art-tech-play/ |
Description | Art//Tech//Play presentation at Norwich Theatre |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | Presentation and discussion of project to performing arts professionals as part of the Art//Tech//Play mentoring programme in partnership with Collusion Cambridge, Norwich University of the Arts, Norwich Theatre, The Junction Cambridge, Norwich Arts Centre, and Dance East. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.collusion.org.uk/art-tech-play/ |