SADIE: Spatial Audio for Domestic Interactive Entertainment

Lead Research Organisation: University of York
Department Name: Theatre Film and TV

Abstract

Interactive media systems such as games consoles have become commonplace in UK domestic environments, offering increased connectivity and media integration. Whilst many systems support 3-D visuals and Ultra-High Definition video, most do not address the importance of 3-D sound for immersive experiences. 5.1 surround is the most common format supported by such systems, but its immersive capability is limited to a two-dimensional plane. New formats incorporating height channels have been targeted at cinema but could also naturally complement the virtual reality experience of gaming by extending the soundfield to three dimensions. The benefits include not only new audio features for enhanced gameplay, but also the potential to design immersive sound-centric games with significant social, cultural, educational and healthcare gains. However, sound immersion is difficult to achieve in the home, as it is impractical to place dozens of transducers around the living room. SADIE pioneers new approaches for rendering interactive spatial sound in the home, including sound source rendering with height.

For gaming, there is also the complex issue of listener position. For example, in games based on kinetic motion tracking the listener moves in reaction to the gameplay, and the formation of a stable soundfield becomes difficult as the listener is no longer located at the acoustic sweet-spot. Real-time soundfield compensation through tracking of the listener can counteract this, but listener movement also makes digital equalisation of the room acoustics a considerable challenge. The result is a flawed virtual environment in which visual and auditory cues are not spatially coincident.

SADIE will address the improvement of spatial audio quality for immersive interactive media experiences in the home. It will undertake novel science to improve soundfield immersion and the formation of 3-D sound sources beyond the horizontal plane, lifting the constraints of loudspeaker placement and dynamic source-listener movement whilst conserving good sound reproduction quality. The research will pioneer new methods for soundfield rendering, formed through characterisation of the cues required for perception of sources with height under dynamic listening conditions. The work is poised to have significant transformative impact on sound reproduction in UK homes, whilst also addressing wider questions on auditory perception and acoustic signal processing that serve research disciplines outside of interactive media.

Planned Impact

The SADIE programme of research will impact upon:

The Interactive Media Industry:
The scientific outcomes have the potential for commercialisation and knowledge transfer with (i) hardware and software audio systems producers (e.g. headphone/loudspeaker manufacturers, Virtual Studio Technology producers) and games console manufacturers, (ii) middleware game audio producers, and (iii) games studios. Adoption of the research outcomes by the gaming industry will transform immersive sound capability in games. Whilst this project focuses on sound reproduction in the home, it also facilitates the development of new games which utilise full 3-D surround sound for tracked players. This enables new sound features which can be exploited for market potential, such as improved situational awareness, use of 3-D audio game cues to reduce user interface 'clutter' and improved inter-player dialogue via spatial positioning in networked games. The financial cost of creating a start-up company or of transferring the research into games companies is low in comparison to the revenue generated by successful games enterprises.

The Broadcast Industry:
As media platforms converge, games consoles offer increased media integration, allowing viewers not only to play interactive games with local and remote players, but also to watch movies and television and use internet services. Consequently, the broadcast industry is undergoing a significant change from video-based content delivery to multi-platform content. This means that immersive technology which benefits the games industry can also be used to enhance broadcast facilities in the home. SADIE will enable new 3-D sound formats aimed at cinematic reproduction which include height channels, such as 22.2 surround sound or Dolby Atmos, to be presented to domestic viewers. The broadcast and cinematic industries are actively researching new methodologies for creating immersive media content for both theatres and the living room. Consequently, the British Broadcasting Corporation is a partner on this project, in an effort to advance spatial sound reproduction to support broadcast of immersive content to UK homes.

Game Players and Society:
Improved soundfield immersion and the discrimination of 3-D sound source locations beyond the horizontal plane will lead to enriched gaming experiences and create the potential for new sound reproduction cues to enhance gameplay, not just for entertainment value, but also to support serious games with learning outcomes. By the age of 21, the average child has spent over 10,000 hours playing games, creating a 'cognitive surplus' which can be tapped to increase the learning potential of digital games. Indeed, digital games have the capability to go beyond mere entertainment value and touch on deeper societal issues. Immersive 3-D audio can therefore help in the development of games aimed at providing deep cultural (e.g. interactive concert experiences), social (e.g. e-Learning games between local and remote classrooms) and even therapeutic learning outcomes (e.g. 3-D auditory stimulation for listening and learning disabilities). Immersive 3-D sound also enables the creation of enhanced audio-only games designed for the vision impaired.

Academics in other disciplines:
As outlined in the Academic Impact section, the proposed psychoacoustic and signal processing research will also have significant impact in other academic disciplines that utilise interactive audio technologies, including psychology, computer science, music and biology, amongst many others. The psychoacoustic research fits into the EPSRC area of Vision, Hearing and Other Senses, and the scientific outcomes are well placed to make a larger impact on the wider ICT research portfolio.
 
Description The main findings from the SADIE project are:
- Fully immersive motion tracked 3-D audio is possible using a 4-channel Ambisonic/transaural hybrid system.
- Effective height reproduction for binaural processing in transaural systems can be achieved using low-order IIR approximations of HRTFs (for real-time rendering), with filter orders as low as 11.
- It is possible to achieve motion tracked transaural reproduction with the same level of localisation accuracy as found in motion tracked binaural reproduction when individualised head-related transfer functions are used.
- There is no change in the perception of auditory depth with real sound source presentations at different angles to the head.
- 1st Order Ambisonics is sufficient for perception of auditory depth at all angles to the head.
- There is no significant difference in the perception of moving sources for binaural presentations between 1st, 3rd and 5th order Ambisonics.
- There are significant timbral differences in Ambisonic mixes presented over different loudspeaker configurations.
- The perception of auditory height is poor in 1st and 3rd order Ambisonics in comparison to 5th order.
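The order comparisons in these findings refer to the number of spherical-harmonic channels carried by the mix: an order-N Ambisonic signal uses (N+1)^2 channels, so 4 at 1st order and 36 at 5th. As a minimal illustrative sketch (not the project's own code), a mono source can be encoded into the four 1st-order channels in the ACN/SN3D convention as follows:

```python
import numpy as np

def ambi_encode_1st(azimuth_deg, elevation_deg):
    """Encode a mono source direction into 1st-order Ambisonics
    (ACN channel order, SN3D normalisation: W, Y, Z, X)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = 1.0                      # omnidirectional component
    y = np.sin(az) * np.cos(el)  # left-right
    z = np.sin(el)               # up-down: the height channel
    x = np.cos(az) * np.cos(el)  # front-back
    return np.array([w, y, z, x])

# A source straight ahead excites only W and X...
print(ambi_encode_1st(0, 0))   # [1. 0. 0. 1.]
# ...while raising it to 90 degrees elevation moves its energy into Z
print(ambi_encode_1st(0, 90))
```

Higher orders simply add further spherical-harmonic channels with finer directional selectivity, which is consistent with the finding above that the 36-channel 5th-order condition resolves height cues better than the 4-channel 1st-order condition.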
Exploitation Route Enhanced immersive audio experiences will have a transformative impact on gameplay, not only improving naturalness and integration with UHD displays but also enabling new features, including improved situational awareness, use of 3-D audio game cues to reduce visual interface 'clutter', improved inter-player dialogue via spatial positioning in networked games, and enriched games design for the visually impaired and for cognitive therapy applications. The findings from this project can be utilised in all such areas by technology manufacturers, as well as by sound designers of VR games, serious games and gamification services that can exploit the scientific outputs. The research findings are also of interest to others working in the wider field of spatial audio.
Sectors Aerospace, Defence and Marine,Communities and Social Services/Policy,Creative Economy,Digital/Communication/Information Technologies (including Software),Education,Electronics,Environment,Healthcare,Leisure Activities, including Sports, Recreation and Tourism,Culture, Heritage, Museums and Collections

URL http://www.sadie-project.co.uk
 
Description The datasets of binaural filters and Ambisonic decoders measured for this research are made publicly available under an Apache license on the project website (http://www.sadie-project.co.uk). HRTFs from 18 human subjects and two binaural heads were measured using a custom-built, motion-tracked, laser-guided measurement rig. The database was further updated in 2018 with a higher-resolution set of measurements and new Ambisonic decoders. As a result, these datasets have been adopted in a wide range of fields that utilise VR technologies. Most notably, they have been adopted by Google as part of their immersive VR pipeline, including YouTube 360 as well as applications developed for Google Cardboard and the Google Daydream VR headset. Millions of users worldwide who utilise Google's VR platform are therefore using the York SADIE binaural filters, making them the new standard for spatial audio quality in VR productions. Consequently, the work has had a major impact on improved audio quality in VR productions worldwide, and on any resulting VR application with cultural, societal or economic impact.
First Year Of Impact 2016
Sector Creative Economy,Digital/Communication/Information Technologies (including Software),Education,Healthcare,Leisure Activities, including Sports, Recreation and Tourism,Culture, Heritage, Museums and Collections
Impact Types Cultural,Societal,Economic

 
Description Abbey Road Spatial Audio Forum
Geographic Reach Multiple continents/international 
Policy Influence Type Membership of a guideline committee
 
Description Audio for New Realities Initiative - Audio Engineering Society
Geographic Reach Multiple continents/international 
Policy Influence Type Membership of a guideline committee
 
Description Arts and Humanities Research Council Grant, Research Grants (Early Career)
Amount £121,136 (GBP)
Funding ID AH/N003713/1 
Organisation Arts & Humanities Research Council (AHRC) 
Sector Public
Country United Kingdom
Start 02/2016 
End 09/2017
 
Description EPSRC Impact Acceleration Account Innovation Voucher
Amount £7,132 (GBP)
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Academic/University
Country United Kingdom
Start 02/2018 
End 03/2018
 
Description Google Faculty Research Award
Amount $29,550 (USD)
Organisation Google 
Sector Private
Country United States
Start 04/2016 
End 04/2017
 
Description Google VR Research Award
Amount £84,291 (GBP)
Organisation Google 
Sector Private
Country United States
Start 05/2018 
End 05/2019
 
Description Huawei Innovation Research Partnership
Amount £94,500 (GBP)
Funding ID H02016050002B2 
Organisation Huawei Technologies 
Sector Private
Country China
Start 01/2017 
End 12/2017
 
Title Ambisonic Decoders for multiple loudspeaker systems 
Description The SADIE project uses an Ambisonic framework for its signal processing. As part of this, we have compiled a set of decoders for various spherical loudspeaker configurations that will be useful to the spatial audio research community and can also be used with our binaural database. Ambisonic decoders up to 5th order are given for different loudspeaker configurations. The config files can be used with compatible VST plugins that use the AMBIX format. The decode matrices are presented in ACN/SN3D format. 
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
Impact The decode config files can be used with compatible VST plugins for audio production. This makes immersive 3-D binaural sound more accessible from a content creation perspective for games developers, filmmakers, VR developers, musicians and artists. 
URL http://www.york.ac.uk/sadie-project/ambidec.html
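As a hypothetical sketch of the mechanics behind such config files (the actual SADIE downloads supply pre-computed matrices for AMBIX-format plugins, and may use other decoder designs), a basic mode-matching decoder for an assumed square loudspeaker layout can be derived as the pseudoinverse of the layout's re-encoding matrix:

```python
import numpy as np

def encode_sn3d(az_deg, el_deg=0.0):
    # 1st-order ACN/SN3D encoding vector [W, Y, Z, X]
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([1.0,
                     np.sin(az) * np.cos(el),
                     np.sin(el),
                     np.cos(az) * np.cos(el)])

# Hypothetical square loudspeaker layout (azimuths in degrees)
speaker_az = [45.0, 135.0, 225.0, 315.0]

# Re-encoding matrix: 4 Ambisonic channels x 4 speakers
Y = np.column_stack([encode_sn3d(a) for a in speaker_az])

# Mode-matching decode matrix: least-squares inverse of Y
# (4 speakers x 4 channels), applied as speaker_feeds = D @ b
D = np.linalg.pinv(Y)

# Decoding a source panned to 45 degrees puts the most gain
# on the speaker at 45 degrees
gains = D @ encode_sn3d(45.0)
print(np.argmax(gains))  # 0
```

The pseudoinverse gives a least-squares ('mode-matching') decode; it is shown here only to illustrate how a decode matrix maps Ambisonic channels to loudspeaker feeds.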
 
Title SADIE Binaural database 
Description High-resolution head-related transfer function (HRTF) databases have become more prevalent of late, with new datasets offering increased angular accuracy and spatial resolution of measurements. However, for optimal virtual loudspeaker decoding in Ambisonic binaural research, interpolation of HRTF measurements is often required to obtain loudspeaker positions on the sphere that are optimal for decoder design, which is undesirable. We present a new online database of HRTFs derived for 1st to 5th order full-sphere Ambisonic reproduction. We also implement factorisation of the datasets such that source material intended for binaural reproduction can be rendered using directionally independent pre-filtering, leaving shorter run-time filters for computational ease. 
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
Impact The binaural database can be used with compatible VST plugins for audio production. This makes immersive 3-D binaural sound more accessible from a content creation perspective for games developers, filmmakers, VR developers, musicians and artists. The database can also be used by the academic community to study spherical acoustics for multiple listeners. 
URL http://www.york.ac.uk/sadie-project/binaural.html
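The factorisation described above can be illustrated in the magnitude domain. In this sketch (with synthetic data standing in for the measured SADIE responses), the directionally independent component is taken as the RMS average of the magnitude responses across directions; dividing it out leaves direction-dependent residuals, with the common part applied once as a pre-filter at run time:

```python
import numpy as np

# Synthetic magnitude spectra standing in for measured HRTFs
# (hypothetical data: 8 directions x 256 frequency bins)
rng = np.random.default_rng(0)
H = 1.0 + rng.random((8, 256))

# Directionally independent pre-filter: RMS average over directions
common = np.sqrt(np.mean(np.abs(H) ** 2, axis=0))

# Direction-dependent residuals: the common part divided out
residual = H / common

# The split is exact by construction: pre-filter x residual recovers H
reconstructed = common * residual
print(np.allclose(reconstructed, H))  # True
```

In practice the common component would be realised as a single (e.g. minimum-phase) filter shared by all directions, so that the per-direction run-time filters are shorter, which is the computational saving the database description refers to.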
 
Description Abbey Road Spatial Audio Forum 
Organisation Abbey Road Studios Ltd
PI Contribution Knowledge of audio recording and production for Virtual Reality; Creation of VR recordings at Abbey Road that showcase immersive audio research; Collaboration on AES paper on 'Multichannel Microphone Array Recording for Popular Music Production in Virtual Reality' at AES Convention New York, October 2017; Contribution to the Abbey Road Spatial Audio forum network.
Collaborator Contribution Knowledge of audio recording and production for popular music production; Use of recording facilities and studios; Creation of VR recordings at Abbey Road that showcase immersive audio research; Collaboration on AES paper on 'Multichannel Microphone Array Recording for Popular Music Production in Virtual Reality' at AES Convention New York, October 2017; Set up of the Abbey Road Spatial Audio forum network.
Impact All collaborations listed are multidisciplinary, with expertise from music production, recording, physics, and signal processing. Outcomes are: Conference paper: Multichannel Microphone Array Recording for Popular Music Production in Virtual Reality, Riaz, Hashim; Stiles, Mirek; Armstrong, Cal; Chadwick, Andrew; Lee, Hyunkook; Kearney, Gavin, AES Convention 143, October 2017. Online Immersive Audio Demonstrations: http://www.sadie-project.co.uk/AbbeyRoadRecordings.html; Convention workshop: 'Immersive Audio for VR', AES Convention 143, October 2017.
Start Year 2017
 
Description Improved Immersive Audio for VR and AR 
Organisation Google
Country United States 
Sector Private 
PI Contribution In support of research into Ambisonic based binaural surround sound for Virtual Reality, datasets of binaural filters, measured as part of the EPSRC funded Spatial Audio for Domestic Interactive Entertainment (SADIE) project (EP/M001210/1), have been adopted by Google as part of their immersive VR pipeline. This includes YouTube 360 as well as applications developed for Google Cardboard and the Google Daydream VR headset. Millions of users worldwide who utilise Google's VR platform are therefore using the York SADIE binaural filters, making them the new standard for spatial audio quality in VR productions.
Collaborator Contribution Google have investigated the overall perceived quality of SADIE filters in motion-tracked binaural reproduction for virtual reality (VR) applications. 50 subjects were asked to assess binaural renders of music, nature and synthetic sounds for 20 diverse HRTF sets within Ambisonic reproduction frameworks. The SADIE filters were found to be unanimously favoured above the current Google filters. The SADIE binaural filters have since been directly integrated into the VR production pipeline affecting the audio rendering at all stages of VR content creation.
Impact Integration of SADIE filters into the Google spatial audio pipeline; qualitative assessment of the spatial audio quality of the SADIE filters conducted and published online; use of the SADIE filters by millions of users worldwide through Google integration into YouTube, the Chrome browser and other Google services, as well as the audio SDK.
Start Year 2015
 
Title Spatial Audio for Domestic Interactive Entertainment (SADIE) website 
Description A website has been created for the SADIE project to document research outcomes, house databases arising from the project and to advertise public engagement events, workshops and seminars. 
Type Of Technology Webtool/Application 
Year Produced 2015 
Impact The website has enabled industry and academics to engage with the outputs of the project. 
URL http://www.sadie-project.co.uk
 
Description 3D Auralisation And Virtual Reality Technology in Digital Heritage, Moesgaard Museum and Aarhus University, Denmark, 11th June 2015 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Third sector organisations
Results and Impact '3D Auralisation And Virtual Reality Technology in Digital Heritage'
Damian Murphy, Jude Brereton, Andrew Chadwick, Helena Daffern, Gavin Kearney, William Smith & Alex Southern
Digital Heritage 2015, Moesgaard Museum and Aarhus University, Denmark, May 2015
Year(s) Of Engagement Activity 2015
URL http://conferences.au.dk/fileadmin/conferences/2015/Digital_Heritage/Program_2015_-_FINAL.pdf
 
Description Binaural Processing of Ambisonics, University of Huddersfield, February 3rd, 2016. 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact Binaural Processing for Ambisonics: Analysis, Measurement and Reproduction
Dr. Gavin Kearney, February 3rd, 2016, University of Huddersfield.

This was an invited talk by the Applied Psychoacoustics Group, led by Dr. Hyunkook Lee at the University of Huddersfield. The seminar showcased techniques for measuring and rendering binaural sound as well as the current work on the SADIE project.
Year(s) Of Engagement Activity 2016
 
Description Digital Creativity Labs Launch Event, University of York, 6th April, 2016. 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Wednesday 6th April saw the launch of the Digital Creativity Labs (DC Labs), a major (£18 million) investment by three UK research councils, four universities, and over 80 collaborative partner organisations to create a world centre of excellence for impact-driven research, focusing on digital games, interactive media and the rich space where they converge. The main DC Labs site is York, with "spoke" sites at the Cass Business School, Goldsmiths (University of London) and Falmouth University. Participants at the launch event had a chance to learn more about the exciting activities of the Digital Creativity Labs and its partner institutions. Amongst the demonstrations and 'buzz talks' at the event was the SADIE project, demonstrating immersive motion tracked transaural sound for interactive gaming. Attendees were very interested in the developed technology and how it could be used in their business, research and creative practice.
Year(s) Of Engagement Activity 2016
URL https://www.york.ac.uk/news-and-events/news/2016/campus/digital-creativity-labs/
 
Description Immersed in the UK, Digital Catapult, London, Sept. 20, 2016. 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Immersed in the UK
September 20, 2016 @ 5:00 pm - 9:00 pm, Digital Catapult London.

Abstract:
By February 2016, investment in virtual reality and other immersive technologies had already exceeded $1.1bn, and this is an industry that looks unlikely to slow down anytime soon. A fair share of investment in immersive technologies has come from entertainment and marketing, but recent advances, such as Dr Shafi Ahmed, surgeon and co-founder of Medical Realities, broadcasting the first ever VR surgery, prove that the technology is broadening its horizons quickly.

SMEs, academics and investors working in immersive technologies are welcome to attend this event. It's a fantastic opportunity to find out what the UK's leading academic and research organisations and businesses are doing in this space. By attending, you'll also gain exclusive insights into market opportunities, challenges and exciting advances.

David Swapp, Immersive VR Lab Manager, University College London
Mark Sandler, Director of Centre for Digital Music, Queen Mary University of London
Gavin Kearney, Lecturer in Audio & Music Technology, University of York
Eammon O'Neill, Co-Director of CAMERA and Head of the Department of Computer Science at the University of Bath
Mark Skilton, Professor of Practice, Warwick Business School
Dave Haynes, Investor, Seedcamp
Humphrey Hardwicke, Creative Director, Luminous Group
Steve Dann, Co-founder of Medical Realities
Phil Channock, Marketing Manager, Draw and Code

In collaboration with Digital Catapult Centre Brighton, Digital Catapult Centre Northern Ireland and VRTGO Labs

This event was part of ColLab Fest (19-23 September), a week of events sourced from Digital Catapult's network of five centres, nine associates and numerous partners. It was an exciting programme of events exploring the strengths and nuances of the UK's digital economy.
Year(s) Of Engagement Activity 2016
URL https://www.digitalcatapultcentre.org.uk/event/immersed-uk/
 
Description Immersive Audio for Virtual Reality, AES 140th Convention, Paris, 4th June 2016. 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other audiences
Results and Impact Audio Engineering Society Convention, Paris 2016
Workshop W2

Saturday, June 4, 10:30 - 12:00 (Room 352A)
W2 - Immersive Audio for Virtual Reality
Chair:
Gavin Kearney, University of York - York, UK
Panelists:
Jamieson Brettle, Google - Mountain View, California
Pedro Corvo, Sony Computer Entertainment - London, UK
Marcin Gorzel, Google - Dublin, Ireland
Jelle van Mourik, Sony Computer Entertainment - London, UK; University of York - York, UK

Abstract:
In recent years, major advances in gaming technologies, such as cost-effective head-tracking and immersive visual headsets have paved the way for commercially viable virtual reality to be delivered to the individual. Now the consumer finally has the opportunity to experience new gaming, cinematic and social media experiences with truly immersive and interactive 3-D audio and video content.

For many sound designers, rendering a truly dynamic and spatially coherent mix for VR presents a new learning curve in soundtrack production. What spatial audio techniques should we be using to create engaging and interactive 3-D mixes? What audio workflows should we employ for similar immersive experiences on headphones, 5.1 loudspeakers and beyond? Are new VR production methods backwards compatible with existing game audio pipelines? Can binaural reproduction work for everyone?

In this workshop our panel of experts will present practical workflows for mixing and rendering 3-D sound for VR. The workshop will explore different production techniques for creating immersive mixes such as Ambisonics processing and Head-Related Transfer Function rendering. It will also explore the importance of environmental rendering for VR as well as outlining workflow challenges and pipelines for dynamic spatial audio over a variety of VR technologies and applications.


Approximately 200 people attended the event, which showcased live binaural demos from Google, Sony, Facebook (Two Big Ears) and the SADIE project. There were short presentations and demos from each of the panel members, followed by a Q&A session.
Year(s) Of Engagement Activity 2016
URL http://www.aes.org/events/140/workshops/?ID=4887
 
Description Immersive VR: When will audio catch up?, Article in IDGCONNECT, September 13, 2016 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Article in IDGconnect looking at spatial audio in virtual reality.
Kearney was interviewed on the basis of his expertise in spatial audio and his involvement in the SADIE project.
The article resulted in numerous queries about the research and made the public aware of the research challenges facing the SADIE project.
Year(s) Of Engagement Activity 2016
URL http://www.idgconnect.com/abstract/20024/immersive-vr-when-audio-catch
 
Description Interactive Audio Systems Symposium, September 23rd, 2016, University of York, United Kingdom. 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Interactive Audio Systems Symposium

September 23rd, 2016, University of York, United Kingdom.

Symposium chair: Gavin Kearney

This was a one-day symposium dedicated to the topic of interactive audio systems. The symposium explored the perceptual, signal processing and creative challenges and opportunities arising from audio systems enabled by enhanced human-computer interaction.

The symposium consisted of keynote talks, papers, posters and demonstrations and was sold out to an international audience. As a major output of the SADIE project, it demonstrated key technological developments from the project to an international audience. The proceedings are also online to contribute to the wider field of research in interactive audio.
Year(s) Of Engagement Activity 2016
URL https://www.york.ac.uk/sadie-project/IASS2016.html
 
Description Interactive Auralisation for Live Theatrical Performance, BBC Sound: Now and Next, 19th/20th May 2015. 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? Yes
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Interactive Auralisation for Live Theatrical Performance,
Helena Daffern, Gavin Kearney, Alex Southern, Will Smith and Damian Murphy
BBC Sound: Now and Next, 19-20 May, 2015.

On 19th and 20th May 2015, the BBC's Broadcasting House was home to a two-day event on the future of broadcast audio technology. 'Sound: Now & Next' featured talks from audio producers, artists and engineers, as well as a technology fair demonstrating the latest developments in audio technology. The University of York's AudioLab were present as part of the BBC Audio Research Partnership. The team demonstrated recent work on a virtual version of York Theatre Royal. The project allows viewers to experience 360-degree visuals and auralisations based on acoustic measurements and multicamera recordings of performances in the theatre, and includes interactive binaural rendering developed for the SADIE project.

Created new contacts for project partners for the AHRC 'Enhancing Audio Description' project.
Year(s) Of Engagement Activity 2015
URL http://www.bbc.co.uk/rd/events/sound2015
 
Description Multi-modal experience of a virtual concert hall: A tool for exploring the influence of the visual modality in interactive real-time room acoustic simulation, ICMEM, Sheffield 2015 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? Yes
Geographic Reach International
Primary Audience Other audiences
Results and Impact Multi-modal experience of a virtual concert hall: A tool for exploring the influence of the visual modality in interactive real-time room acoustic simulation,
Jude Brereton, Alex Southern and Gavin Kearney.
3rd International Conference on the Multimodal Experience of Music (ICMEM), 23rd-25th March.

ICMEM brought together researchers from various disciplines who investigate the multimodality of musical experiences from different perspectives. Dr. Jude Brereton and Dr. Gavin Kearney from the University of York presented a demonstration of the Virtual Singing Studio which is a loudspeaker based room acoustics simulation for real-time musical performance. Participants at the conference were able to transport themselves to an immersive interactive simulation of the National Centre for Early Music (York) and were able to experience the same visual and acoustic cues as they would in the real venue. Users could also acoustically interact with the virtual space, demonstrating the ways in which immersive VR can be used as a rehearsal tool for musicians in preparation for performances in real (or remote) spaces. The demonstrated version of the Virtual Singing Studio utilised an Ambisonic engine developed for the SADIE project.
Main outcomes:
- Encouraged discussion around the technology
- Made practitioners and academics aware of the technology's capabilities.
- Practitioners stated that they found the technology exciting and inspiring and could see many applications for it in their own work.

- Invited to submit a paper to a special issue of Psychomusicology on multi-modal experiences in music.
Year(s) Of Engagement Activity 2015
URL https://www.sheffield.ac.uk/polopoly_fs/1.448256!/file/ICMEMFinalProgram.pdf
 
Description Spatial Audio for Domestic Interactive Entertainment: Possibilities and Limitations. January 4th 2015, Sonic Arts Research Centre, Queen's University Belfast 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Spatial Audio for Domestic Interactive Entertainment: Possibilities and Limitations.
Dr. Gavin Kearney, January 4th 2015, Sonic Arts Research Centre, Queen's University Belfast.

Recent advances in immersive video technology, such as ultra-high definition video or 3-D visual headsets like the Oculus Rift, are paving the way for immersive 360-degree virtual reality media systems in the home. Surround sound systems will naturally accompany such video technologies, but reproduction in domestic media 'caves' or on headsets is not without significant challenges. This talk discussed the role spatial audio will play in the development of personal and domestic VR systems and looked at the technological and content-creation possibilities and limitations for immersive audio-visual media.
Year(s) Of Engagement Activity 2015
 
Description WISHED: Well-being - Investigating Singing Health in Ensembles through Digital Technologies, Research showcase, AHRC Commons York 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Daffern, H. and Kearney, G., Priming New Research in Creativity, WISHED: Well-being - Investigating Singing Health in Ensembles through Digital Technologies, AHRC Commons, York, UK, June 2016.

Demonstration of a prototype interactive audio system aimed at improving health and well-being through singing in a VR environment. The demo was housed at a stand in the main exhibition area of the AHRC Commons event. Participants put on a VR headset, as well as a mic and headphones, and were virtually transported to a performance space with which they could fully acoustically interact. The demo utilised binaural technology developed as part of the SADIE project. Participants were able to discuss the technical details as well as the potential of the technology with the investigators. This opened up avenues of collaboration that the team are looking to capitalise on.
Year(s) Of Engagement Activity 2016
URL http://www.ahrc.ac.uk/about/ahrc-commons/
 
Description Workshop on Virtual Reality Spatial Audio, AES Audio for Games Conference, London, 11th February 2016. 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Spatial Audio for VR
Gavin Kearney (Chair), AudioLab, Department of Electronics, University of York
Marcin Gorzel and Alper Gungormusler, Google Inc.
Pedro Corvo, Playstation VR, Sony Computer Entertainment Europe Ltd.
Jelle Van Mourik, Playstation VR, Sony Computer Entertainment Europe Ltd.
Varun Nair, Two Big Ears Ltd.

This workshop focused on rendering truly dynamic and spatially coherent mixes for virtual reality. The panel presented practical workflows for mixing and rendering 3-D sound for VR and explored the challenges in delivering dynamic spatial audio over a variety of VR technologies and applications. The AES Audio for Games conference took place from the 10th to the 12th of February 2016 at the Royal Society of Chemistry in central London. The workshop was followed by an open panel discussion that sparked much debate and conversation amongst attendees about the future of audio for virtual reality.
Year(s) Of Engagement Activity 2016
URL http://www.audioforgames.net/2016/timetable/#VRSpatial
 
Description You are surrounded: Surround Sound, Past, Present and Future. York Festival of Ideas, June 16th, 2016 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact Abstract:
You are Surrounded! - Surround Sound Past, Present and Future, June 16th 2016, University of York

Surround sound is a way to create immersive audio experiences by placing multiple loudspeakers around an audience, in the cinema, theatre or at home. The development of surround sound as we know it has had a rich and varied history, but has largely focused on increasing the number of loudspeakers around the audience for a more immersive experience. But can this strategy realistically be employed in the home? Join Gavin Kearney of the University of York to discuss - and listen to - how surround sound experiences have evolved from 16th-century Venetian choral performances to the large-scale cinematic installations we hear today. Find out what recent research in surround sound technology means for the future of surround sound in the home.

Date: Thursday 16 June 2016, 6.30pm to 7.30pm
Venue: Holbeck Cinema, Department of Theatre, Film and Television, University of York
Admission is free, but booking is required.

Impact summary: Attendees were mostly from the general public. The talk raised awareness of recent developments in surround sound technology and of how these could impact everyday life, through both binaural listening on headphones and loudspeaker-based surround sound.
Year(s) Of Engagement Activity 2016
URL http://yorkfestivalofideas.com/2016/talks/surround-sound-past-present-future/
 
Description YouTube live-streams in virtual reality and adds 3D sound - BBC News online, 18th April 2016 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Online article describing the development of spatial audio in the Youtube pipeline.
Kearney interviewed based on expertise on Spatial Audio and involvement in SADIE project.
Resulted in strong international interest in SADIE project and interest from other potential funders.
Year(s) Of Engagement Activity 2016
URL http://www.bbc.co.uk/news/technology-36073009