Acoustic Holography for Multimodal 3D Display and Fabrication

Lead Research Organisation: University College London
Department Name: Computer Science

Abstract

The aim of the project is to use recent advances in acoustic holography and high-performance computational techniques to create multimodal interactive applications that dynamically combine computational fabrication with visual, tactile, auditory, olfactory and gustatory experiences, all using the same holography principles. The ambition of this project is to create systems that empower the design community to embrace the power of acoustic holography in building applications that create and manipulate both digital and physical artefacts.
Acoustic holography has demonstrated its capability as a Mixed-Reality (MR) display providing five modalities, and its potential as a new computational fabrication technique allowing multi-material and multi-resolution 3D printing. However, due to computational limitations, such endeavours have been limited to one modality at a time or to one-off, carefully orchestrated combinations, and no holographic 3D fabrication has been demonstrated. A serious limitation of current approaches is the real-time computation of sound fields that account for sound scattering. This limitation hinders our ability to exploit the full power of acoustic holography and to create the plethora of applications that are ripe for exploitation.
In this proposal, we will develop real-time, form-factor-agnostic sound field computation and explore the optimum form factor to maximise users' multimodal experiences by prototyping several interactive applications. The prototype applications will not only demonstrate the ability of acoustic holography to create magical experiences but will also provide easy-to-use tools for the community to integrate such systems into real-world applications. We will create an interactive application that blurs the boundary between displays and 3D printing, where users can instantly 3D print physical prototypes and integrate them into an MR environment while moving back and forth between 3D printing and MR exploration.
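To illustrate the kind of sound field computation involved, the sketch below focuses an ultrasonic phased array on a single point using the simple conjugate-phase (time-reversal) method, the free-field building block that scattering-aware solvers extend. All parameters (a 16x16 array, 40 kHz, 10.5 mm pitch, a focal point 10 cm above the array) are illustrative assumptions and are not taken from the project; real systems solve much larger optimisation problems and account for reflections off nearby objects.

```python
import numpy as np

# Illustrative assumptions, not project parameters:
SPEED_OF_SOUND = 343.0                  # m/s, air at ~20 C
FREQ = 40_000.0                         # Hz, typical airborne ultrasound
K = 2 * np.pi * FREQ / SPEED_OF_SOUND   # wavenumber

# Transducer positions: 16x16 grid, 10.5 mm pitch, in the z = 0 plane.
pitch = 0.0105
coords = (np.arange(16) - 7.5) * pitch
gx, gy = np.meshgrid(coords, coords)
positions = np.stack([gx.ravel(), gy.ravel(), np.zeros(256)], axis=1)

focus = np.array([0.0, 0.0, 0.10])      # focal point 10 cm above the array

# Conjugate-phase focusing: delay each transducer so that all waves
# arrive in phase at the focal point.
dists = np.linalg.norm(positions - focus, axis=1)
phases = -K * dists

def pressure_magnitude(point):
    """Free-field monopole sum: |sum_i exp(j(k*d_i + phi_i)) / d_i|."""
    d = np.linalg.norm(positions - point, axis=1)
    return np.abs(np.sum(np.exp(1j * (K * d + phases)) / d))

p_focus = pressure_magnitude(focus)                           # coherent sum
p_off = pressure_magnitude(focus + np.array([0.02, 0.0, 0.0]))  # 2 cm off-axis
print(p_focus > p_off)
```

At the focus the phase terms cancel exactly, so the 256 contributions add coherently, while 2 cm off-axis (several wavelengths at 40 kHz) they largely cancel; the gap between the two magnitudes is what a focused "hologram" exploits. The real-time challenge described above comes from replacing this free-field model with one that also accounts for sound scattered by objects in the workspace.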
