Metamorphosis: Audiencing Atmospheres: Transforming Perceptions of Place in a Gesture-Driven Ambisonic Environment.

Lead Research Organisation: Queen Mary University of London
Department Name: Drama

Abstract

This project seeks to produce a new, atmospheric experience of immersive performance involving the motion-capture of participants' gestures as a means of altering the parameters of recorded or synthesised sound across different locations.

Metamorphosis: Audiencing Atmospheres (M:AA) proposes a collaboration between world-class researchers in affect, audience studies, atmosphere and social interaction at QMUL and one of the leading immersive theatre companies in the UK: RIFT ('horribly and impressively intense'; 'exhilarating'; 'beautiful'; 'gorgeously expressive'; press reviews for STYX, July 2015). This combination of skills, expertise and insight, together with access to cutting-edge motion-capture and ambisonic technology, equips the collaborative team to push the envelope of sensory, immersive, audience-driven performance experience. The planned performance outcome will be compelling, powerful and unlike any work yet seen in the UK. The collaborative process, audience reception and impact will yield data and insights into affect, sound, audience creativity and responsivity, atmospherics, and the parameters of this new technology for creative arts work that will break new ground.

Metamorphosis: Audiencing Atmospheres seeks to take advantage of motion-capture, ambisonic decoding and data-sonification technologies to create new interactive immersive performances across remote locations. This will support London-based theatre company RIFT in the development of narrative, spatial and atmospheric aesthetics for new immersive performances. The project seeks to uncover new possibilities for interaction that make use of gestural responses between participants. Much recent inquiry concerned with gesture in the performing arts and elsewhere assumes intersubjective knowledge based around mirroring, whether cognitive, corporeal or neurological (pace mirror-neurons), informing an empathic, relational account of experience. By contrast, this project looks to capitalise on aesthetic and ecological accounts of human experience that premise subjects' co-presence as it occurs within a sensory plenum, such as sound. If both gestural and sound production are, to some degree, actions on and within the ambient medium of air, by what technological and aesthetic means might they be brought into correspondence? More specifically, how can we use a participant's gestures in response to ambisonic movement in one location to generate an atmospheric condition in another, and to generate a feedback loop in which they are affected in turn?
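As a purely conceptual illustration of the feedback loop posed by this question (and not part of the proposal itself), the following minimal sketch uses invented names, values and mappings: gesture data captured at one site drives a sound parameter at a second site, and the second site's resulting sound state is fed back to modulate playback at the first.

# Conceptual sketch only: gesture at site A shapes a sound parameter at
# site B, and B's resulting state feeds back to condition what A hears.
# All classes, mappings and values are invented for illustration and make
# no claim about the project's eventual implementation.
from dataclasses import dataclass


@dataclass
class Site:
    name: str
    sound_level: float = 0.5   # normalised 0..1 playback intensity


def gesture_to_parameter(gesture_speed: float) -> float:
    """Map a gesture speed (metres per second) to a normalised parameter."""
    return max(0.0, min(1.0, gesture_speed / 2.0))


def step(a: Site, b: Site, gesture_speed_at_a: float) -> None:
    """One cycle of the loop: A's gesture drives B's sound, and B's sound
    level feeds back to modulate A's own playback."""
    b.sound_level = gesture_to_parameter(gesture_speed_at_a)
    a.sound_level = 0.5 * a.sound_level + 0.5 * b.sound_level


if __name__ == "__main__":
    london, remote = Site("London"), Site("Remote")
    for speed in (0.2, 1.0, 1.8, 0.6):          # simulated gesture speeds
        step(london, remote, speed)
        print(f"{london.sound_level:.2f} {remote.sound_level:.2f}")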

Immersive experience in performance has to date tended towards an all-consuming engagement with representation, for which the cinematic mise-en-scene remains a lodestone. By contrast, this project seeks to create and experiment within a sensory plenum which, whilst equally complete, is less concerned with direct access to a scene or setting. The development and accessibility of digital audio technologies have led to sound-design and soundscapes becoming widespread across differing scales of theatrical production. Arguably, however, these tend primarily to produce auditory experiences that again draw on the cinematic mise-en-scene: sound is used to make an appeal to the actions and events of an imagined elsewhere. Yet sound also has the capacity to draw audiences into a rich, sensory engagement with their immediate situation, as electronic dance music demonstrates. From Lee 'Scratch' Perry to dubstep to Chris Watson, artists have used bass frequencies to engage not only the ear but also the viscera of audiences. New digital technologies allow the production of infrasonic frequencies, too low to be heard but 'loud' enough to be felt. Sound can be used to afford an atmospheric, aesthetic sense of place, as well as its representation.

Planned Impact

Metamorphosis: Audiencing Atmospheres (M:AA) will have immediate benefits to those directly participating in its research, but the project team will share them with a wider set of constituencies:
1. RIFT theatre company will receive direct benefit from the project via: a) the enhancement of their creative process in M:AA's development of new aesthetic experiences in the creation of a gesture-driven ambisonic environment (GAE); b) support in the creation of an immersive performance linking sound and gesture in participatory engagements with place and narrative; c) a framework for developing aesthetic atmospheres in collaboration with audiences; d) a process for drawing remote locations and activities together in a virtual space, and designing a responsive atmospheric aesthetic; e) a longitudinal relationship with audiences that both supports and feeds into the creative process and offers insight into qualitative experience, supporting future development and works; f) the enhancement and development of skills amongst their creative team, including sound designer Thomas Wilson and writers Annie Jenkins and Rebecca Lenkiewicz, through access to technical materials and specialist and academic knowledge that would otherwise be foreclosed to this scale of artistic production.

2. QMUL researchers will gain access to a 'real world' context within which their research will be embedded. Directly bringing academic expertise and knowledge to bear on a creative process will provide a clear articulation of 'impact', and critical and empirical thinking will be made accessible to artists in an MSME context, and to audiences. The project will highlight the significance of the co-creative artistic and academic processes in marketing and publicity associated with RIFT's performance Void facilitated by M:AA, and in the longitudinal study of audience experience supporting it. HCI and Theatre and Performance researchers will access qualitative and quantitative data concerning one-to-one audience interactions with aesthetic atmospheres, and gain insight into audience experience of spatial location, and its dynamic and sensory construction in displaced perceptual feedback of sound and movement.

3. Audiences will gain a means of productive agency in interaction with an immersive, atmospheric environment, and an embodied, creative relationship with others elsewhere. They will have an opportunity to be involved in the creative process through a 'Scratch' methodology of curated responses to work-in-progress, and be engaged in a longitudinal survey of experiences leading up to, and following the GAE's testing in RIFT's performance Void at the 2018 Vault Festival.

4. Theatre companies, producers and creative industries organisations will benefit from access to the creative toolkit that M:AA will develop and make available for future R&D. M:AA will host a public sharing to demonstrate the GAE and the atmospheric and interactive environment produced for RIFT's Void, and to share the audience feedback process. QMUL Drama will offer the GAE as a research and development platform, allowing MSME theatre and creative industries companies to explore the use of sound and motion-capture technologies in the production of immersive experiences.

5. The development of the GAE and its use in RIFT's Void, together with the work with audiences that will support them, will be of benefit to a number of different areas of academic research interest, including: affective atmospheres, intermedial performance, immersive and participatory performance, audiences, sound studies, augmented reality, human-computer interaction, interaction psychology, and embodied cognition. Details of the public sharing will be circulated on academic networks, and the project's process and findings will be shared in a joint paper written by Welton (PI) and Healey (CI). Details of M:AA's audience research method will be available in ebook form on QMUL Drama's webpages.

Publications

 
Title VOID 
Description VOID was an interactive and immersive theatre performance designed for single audience members attending its separate elements in sequence. Commissioned for the 2018 Vaults festival in Waterloo, the performance provided a developmental context for the design of interactive sound for immersive theatre, and a public format for testing it. Furthermore, as the performance required the movement of audience members from one self-contained immersive environment to another, passing through public space, it also allowed an experimental approach to the recording of audience experience in situ, and a means of mapping the movement or transfer of atmospheric affects. By moving and tracking audience members across differing experiential contexts, the research team were afforded both real-time and reflective insight into their experience of atmospheres of place as transformed within, or in relation to, performance design - particularly in respect of sound. 
Type Of Art Performance (Music, Dance, Drama, etc) 
Year Produced 2018 
Impact VOID was awarded the 2018 Vaults festival Innovation Award. The production was favourably reviewed by the theatre blog A Younger Theatre. 
URL https://www.ayoungertheatre.com/review-void-the-vaults/
 
Description The most significant aspect of this research has been the development of a prototype Gesture-Driven Ambisonic Environment (GAE). This has implications for immersive sound-design and for participatory models of theatre performance. The GAE uses VR and motion-tracking to allow audiences to alter the parameters of sound-design playback. This enables them to 'sculpt' the atmospherics of what they listen to by shifting sound location, pitch, frequency and so on, or, in response to changes in sound parameters as they move, to initiate further cues within the design by engaging 'triggers' in a virtual environment mapped onto the performance space. As the latter is not visible to the audience, but serves as a tool producing the data that enable the design, audiences experience an atmospheric soundscape in direct relation to the real spaces they occupy. In the version produced for RIFT's VOID, this was within a completely darkened space, allowing an aesthetic of immersion in sound in the absence of visual content. Subsequent development of the GAE suggests that exploratory responses to sound-design are equally possible in lit environments. In addition, the VR design is now being developed to allow users in remote locations to collaborate in altering ambisonic playback to trigger narrative cues. This has potential to produce shared experience for geographically separate participants, and to support an approach to participatory/immersive performance that is not premised on 'live' co-presence. An application to Innovate UK's Audience of the Future to pursue this in a follow-on project with Battersea Arts Centre was unsuccessful, but further possibilities are being actively explored.
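By way of illustration, a minimal sketch of the 'trigger' logic described above might look like the following. It is a conceptual reconstruction rather than the project's code (which implemented triggers inside the Unreal VR scene): the zone coordinates, radii and cue names are invented for the example.

# Conceptual sketch of invisible trigger zones mapped over a performance
# space: when the tracked audience position enters a zone, a cue fires.
# Zone positions, radii and cue names are invented for illustration; the
# project implemented this logic inside an Unreal VR scene.
from dataclasses import dataclass
import math


@dataclass
class TriggerZone:
    name: str          # cue this zone fires (hypothetical label)
    x: float           # zone centre, metres, in room coordinates
    y: float
    radius: float      # activation radius in metres
    fired: bool = False


ZONES = [
    TriggerZone("cue_wind_swell", x=1.5, y=0.0, radius=0.6),
    TriggerZone("cue_voice_whisper", x=-1.0, y=2.0, radius=0.8),
]


def fire_cue(name: str) -> None:
    # In the installation this would dispatch to the sound-design software;
    # here it simply reports the event.
    print(f"trigger fired: {name}")


def update(x: float, y: float) -> None:
    """Check the tracked position against every zone and fire each zone's
    cue the first time the listener steps inside it."""
    for zone in ZONES:
        if not zone.fired and math.hypot(x - zone.x, y - zone.y) <= zone.radius:
            zone.fired = True
            fire_cue(zone.name)


if __name__ == "__main__":
    # Simulated audience path crossing both zones.
    for x, y in [(0.0, 0.0), (1.4, 0.1), (-0.5, 1.5), (-1.1, 1.9)]:
        update(x, y)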
The project produced a considerable amount of data on audience use and experience of the GAE in the presentation of a play written by Annie Jenkins, and on their affective engagement in the elements of VOID that followed - a walk down a city street to a second location and attendance at a second play, performed by Joe Brown. Data produced from movement tracking within the GAE and from GoPro cameras show the details of audience movement and behavioural response. This, together with corresponding interviews, provided further contextual and affective feedback yielding both expected and unexpected responses. Firstly, and unexpectedly, audiences found the possibility of manipulating sound engaging, but chose to remain still when presented with a content-rich sound design, limiting the opportunities for exploratory interaction. Secondly, immersion in sound in the first performance affected audiences' experience of their walk to the second. Despite there being no formal performance content on the street (Lower Marsh), many anticipated or assumed it. This affective overflow also fed into the second play of VOID. Although touching on a similar thematic concern relating to sexual consent, this made use of a 'live' actor and a fully realised visual setting. As well as drawing thematic connections, audiences also experienced a continuation or overflow of the affects of their earlier experiences. The shift in their atmospheric experience between and amongst performance spaces was expected to a certain degree, but not to the extent that their reports suggest.
Exploitation Route The partnership of QMUL and RIFT has provided a model for collaborative production between an independent theatre company and an HEI, which can support both aesthetic and research outcomes. The GAE, the technological infrastructure this produced, has application beyond the immediate production of VOID, as it can be used to support a variety of different actions and has the potential to trigger not only sound but also lighting and other cues. Further exploration of how it might network remote locations would benefit a range of different cultural productions; this is being pursued by the research team with Battersea Arts Centre, as well as, in relation to dance, in discussions with Alexander Whitley Dance Company. A planned sharing for invited creative industry professionals in summer 2018 was not possible owing to staff illness. In October 2019, discussion began with Battersea Arts Centre and The Old Market Theatre directed towards an application for follow-on funding to develop the GAE further. The research team are working towards publications, including a chapter by the PI for the forthcoming Routledge Handbook on Atmospheres drawing directly on the project. As part of the follow-on funding, additional dissemination projects are planned that will make these materials available to researchers and to interested arts and creative industries practitioners.
Sectors Creative Economy; Culture, Heritage, Museums and Collections

 
Description The Gesture-driven Ambisonic Environment (GAE), which draws together VR, Ambisonic playback and motion-capture technologies, was used in a research and development process by theatre producers Fuel in July 2022. The PI is now in the early stages of preparing an AHRC follow-on funding bid to support further developments on the basis of this R&D.
First Year Of Impact 2022
Sector Creative Economy
Impact Types Cultural

 
Description VOID - immersive theatre performance with RIFT 
Organisation RIFT
Country United Kingdom 
Sector Private 
PI Contribution Welton (PI) - I provided project management and dramaturgical oversight to the development of a new work of immersive theatre performance made by RIFT for the 2018 Vaults festival, and contributed to the design and implementation of an audience research study embedded within it. The former involved working with the company and with QMUL Drama's technical manager to design and install a Gesture-Driven Ambisonic Environment (GAE) within the context of a professionally commissioned theatre performance. The latter involved a design process with Healey and Woods (CIs), embedded within the production process, which could be carried out with minimal disruption to the performance per se.
Collaborator Contribution RIFT secured a professional and therefore 'real world' context for the design and testing of the various elements of the project. Directors Felix Barrett and Joshua Nawras commissioned two writers - Annie Jenkins and Joe Kerridge - to write two interlinked plays to be presented for solo audience members in shipping containers situated at either end of Lower Marsh market in London's Waterloo district as part of the Vaults festival. The first play, by Jenkins, took place in total darkness, with audiences making use of the GAE to manipulate the parameters of a sound design commissioned by RIFT from Tom Wilson. Audiences then walked to the second container for Kerridge's play. Although thematically connected with Jenkins' play, this second work sought to present a more 'daily' environment, albeit one that was unsettled by the previous immersion in Wilson's soundscape and the connecting walk down Lower Marsh. Exploring themes of sexual consent and misogyny, the performance and its constituent plays and design sought to unsettle observer positions. Barrett and Nawras worked closely with the research team in exploring the formal and aesthetic possibilities of linking ambisonic playback with either captured or tracked gesture and movement in the first shipping container, where Jenkins' play was staged. They also directed the acting performances recorded for, or staged in, each container; in the case of Kerridge's play, a series of repeated performances by the actor Joe Brown. Barrett and Nawras negotiated the contract for the work with the Vault Festival, the licensing for and installation of the shipping containers, the fitting of set and technical equipment within them, and the schedule and programming of its stage management. This involved up to 15 overlapping performances on each evening of the run between the 14th February and the 18th March 2018, with the stage management greeting and guiding audiences at and between each element of the work.
Impact Performance of VOID at 2018 Vaults Festival
Start Year 2017
 
Title Sound and movement data syntheses 
Description The project developed a Gesture-Driven Ambisonic Environment (GAE) which drew together a number of existing hardware devices and software packages in an original combination. Audience members accessed the GAE by taking an HTC Vive controller into the performance space, the position of which was tracked using a Windows computer running Unreal Engine, which allowed us to create a VR space that mapped the physical performance environment in high definition. Unreal hosted a UE4-OSC plugin that translates the movement data into OSC (Open Sound Control) messages. OSC data was then transmitted over a local network to a MacBook Pro and received by OSCulator software. This translates OSC data into MIDI, which in turn triggers QLab software, and sends live automation data to another piece of sound-design software, Reaper. QLab was then used to trigger audio playback in Reaper, a digital audio production application, which allowed for regular and reliable looping of the audio. Reaper hosted pre-recorded and edited Ambisonic audio. This was encoded into B-Format and then decoded to nine loudspeakers in the performance space by a series of plugins produced by Blue Ripple Sound. Movement data generated by the Vive controller (after it had been processed by Unreal and OSCulator) was used to control various parameters of the plugins hosted in Reaper. This allowed for control of volume, spatial positioning of audio, and the frequency of oscillators on simple synthesisers from the movement of an audience member inside the performance environment. A simplified illustrative sketch of this gesture-to-automation chain is given at the end of this entry. 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2017 
Impact The development of the GAE allowed a responsive sound design to be installed in RIFT's performance of VOID in the 2018 Vaults festival. Since then it has allowed the research team to begin discussions with other creative producers including Battersea Arts Centre for its further use and development in the production of interactive immersive theatre performance.
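By way of illustration, the following minimal sketch stands in for the gesture-to-automation leg of the chain described above. It is not the project's implementation, which used Unreal's UE4-OSC plugin and OSCulator: the OSC addresses, port and parameter mappings here are invented for the example, and the sketch assumes the third-party python-osc library to send controller position data as OSC messages of the kind the downstream software received.

# Illustrative sketch only: converts a tracked controller position into
# OSC automation messages, standing in for the Unreal -> OSCulator ->
# Reaper leg of the GAE pipeline. Addresses, port and parameter ranges
# are hypothetical; the actual system used UE4-OSC and OSCulator.
import math
import time

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

OSC_HOST = "127.0.0.1"   # assumed address of the machine receiving OSC
OSC_PORT = 8000          # assumed listening port

client = SimpleUDPClient(OSC_HOST, OSC_PORT)


def position_to_parameters(x, y, room_radius=4.0):
    """Convert a controller position (metres, room-centred) into normalised
    sound parameters: gain falls off towards the walls, azimuth follows the
    listener's angle from the room centre."""
    distance = min(math.hypot(x, y), room_radius)
    gain = 1.0 - distance / room_radius        # 1.0 at centre, 0.0 at wall
    azimuth = math.degrees(math.atan2(y, x))   # -180 to 180 degrees
    return gain, azimuth


def send_frame(x, y):
    gain, azimuth = position_to_parameters(x, y)
    # Hypothetical OSC addresses; a real setup would use whatever the
    # receiving software (e.g. OSCulator or a Reaper plugin) expects.
    client.send_message("/gae/gain", gain)
    client.send_message("/gae/azimuth", azimuth)


if __name__ == "__main__":
    # Stand-in for live tracking data: a slow circuit of the room.
    for step in range(200):
        angle = step * 0.05
        send_frame(2.0 * math.cos(angle), 2.0 * math.sin(angle))
        time.sleep(0.05)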