User Interaction with self-supporting free-form physical objects

Lead Research Organisation: University of Sussex
Department Name: Sch of Engineering and Informatics

Abstract

The primary goal of this project is to enhance human-computer interaction by dynamically creating and manipulating physical shapes, levitating and moving large collections of lightweight 3D objects using the principles of acoustic levitation. The proposed idea is illustrated in Figure 1, where ultrasound transducers are placed as a floor mat and a large collection of polystyrene beads creates the shape of a dog that can wag its tail.
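The core mechanism behind such a transducer mat can be sketched in a few lines. The following is a minimal illustration, not the project's actual control software: it assumes a flat array of 40 kHz transducers (a frequency commonly used in airborne acoustic levitation) and computes the phase each element must emit so that all waves arrive in phase at a chosen point, creating an acoustic focus. Levitation traps are typically derived from such focusing solutions (for example, by adding a fixed phase "signature" to part of the array). The function name `focus_phases` and the array geometry are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C
FREQUENCY = 40_000.0     # Hz; common choice for airborne ultrasonic levitators
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY
K = 2 * math.pi / WAVELENGTH  # wavenumber

def focus_phases(transducers, focal_point):
    """Return the emission phase (radians, in [0, 2*pi)) for each transducer
    so that all waves arrive at focal_point in phase, forming a focus.

    transducers: list of (x, y, z) positions in metres.
    focal_point: (x, y, z) target position in metres.
    """
    fx, fy, fz = focal_point
    phases = []
    for (x, y, z) in transducers:
        # Distance from this element to the focus.
        d = math.sqrt((x - fx) ** 2 + (y - fy) ** 2 + (z - fz) ** 2)
        # Delay the emission by the propagation phase, so arrivals align.
        phases.append((-K * d) % (2 * math.pi))
    return phases

# Example: an 8x8 flat array with 10 mm pitch, focusing 50 mm above its centre.
pitch = 0.01
array = [(ix * pitch, iy * pitch, 0.0) for ix in range(8) for iy in range(8)]
phases = focus_phases(array, (3.5 * pitch, 3.5 * pitch, 0.05))
```

Updating the focal point (and hence the phases) in real time is what allows a levitated bead to be moved; doing this for many beads at once, as the project proposes, requires solving for phases that create many traps simultaneously rather than a single focus.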

This will enable us to represent complex datasets in physical form and allow users to manipulate them dynamically. We are moving away from traditional human-computer interaction techniques such as buttons, keyboards and mice towards touch (e.g., smartphones and multi-touch gestures) and touchless interaction (as with the Kinect and Leap Motion controllers). The limiting factor in these new forms of interaction is, ironically, the lack of physicality and of co-located feedback: one has no controller or interface element to physically touch, and any visual feedback that is available is disconnected from the location of the gesture.

In our vision, the computer will control the existence, form, and appearance of complex levitating objects composed of "levitating atoms". Users can reach into the levitating matter, feel it, manipulate it, and hear how they deform it with all feedback originating from the levitating object's position in mid-air, as it would with objects in real life. This will change how people use technology as they can interact with technology in the same way they would with real objects in their natural environment.

We see many possible benefits of physicalisations for the individual and society: they make data more accessible by leveraging our perceptual exploration skills via active perception, depth perception, non-visual senses, and intermodal perception; give novel exploration possibilities to the visually impaired; support learning via the cognitive benefits of directly manipulating physical artefacts; bring data to the real world for communication and exhibition; and finally act as tools for engaging audiences with information.

Planned Impact

One of the aims of this project is to explore the roles levitation and physicalisations can play in public engagement with science, including public engagement with areas of policy that have technical content such as energy and climate change. This project has practical roles to play in supporting and fostering dialogue as well as in provoking interest in technical topics. The key will be to integrate the principles learned from WP1-WP3 to engage the public with good storytelling.

Through a collaboration with Carbon Visuals, an installation artist will be responsible for creating novel interactive art installations that exploit the research tools and systems developed by the project. Our system will enable new approaches to exploring, manipulating and understanding complex 3D representations of different types of data and interactive objects. By providing direct manipulation of physical objects in 3D space combined with rich tactile feedback, users will be able to feel, rather than simply look at, their interactions. Consequently, researchers and interaction designers will be able to take advantage of this increased engagement to offer powerful new active-exploration techniques with physical representations that have the potential to stimulate reasoning and analytical skills in new ways. This new interaction paradigm will also reduce the transition cost between novice and expert, allowing users to reach high performance for a small learning investment. Since direct 3D manipulation relies on our natural human skills for interacting with physical objects, the new interaction techniques will be easier to learn and master than their desktop counterparts using mice or keyboards. This will extend users' ability to resolve complex 3D spatial problems quickly.

Our vision enables complex datasets to be represented in physical form, allowing users to analyse and perceive complex time-varying 3D structures through physicalisations. The idea of representing data in physical form is not new. Physicalisations in the form of stone or pebble tokens were used thousands of years ago: more than 7,000 years ago, for example, the Sumerians used clay tokens to represent quantitative data. Even today, physicalisations in the form of data sculptures are created and used by designers and artists as alternative methods for conveying datasets. They have proven communicative impact, quickly and easily helping users to understand, for example, how crime rates vary across a city or how an urban development will change the environment. Similarly, scientists have long used physicalisations to help analyse and perceive complex time-varying 3D structures. For example, James Clerk Maxwell famously created physicalisations of thermodynamic surfaces to help perceive isopiestics and isothermals. This helped Maxwell visualize Gibbs' thermodynamic surface, which expressed the relationship between the volume, entropy, and energy of a substance at different temperatures and pressures.

However, physicalisations in the past have lacked dynamicity, automation, and computation, all of which can be enabled by this project, with the benefits for individuals and society outlined in the abstract above.

Publications

Bourland A (2017) Project Telepathy

Marzo A (2018) The development of dynamic holographic acoustic tweezers in The Journal of the Acoustical Society of America

Marzo A (2017) Realization of compact tractor beams using acoustic delay-lines in Applied Physics Letters

Marzo A (2016) Taming tornadoes: Controlling orbits inside acoustic vortex traps in The Journal of the Acoustical Society of America

Marzo A (2017) TinyLev: A multi-emitter single-axis acoustic levitator. in The Review of scientific instruments

Marzo A (2018) Ultraino: An Open Phased-Array System for Narrowband Airborne Ultrasound Transmission. in IEEE transactions on ultrasonics, ferroelectrics, and frequency control

 
Title A portable acoustic tractor beam 
Description Tractor beams are mysterious rays that can grab and attract objects. The concept has been shown in science-fiction films such as Star Wars and Star Trek, and scientists have developed the theory using lasers. Recently, sound was used to create a working tractor beam that can move heavier objects made of different materials and that operates in both air and water without damaging the trapped objects. Our team, led by Dr. Asier Marzo, created a simplified tractor beam using readily available parts, at a total cost of less than £70, and used it to engage children and the general public and capture their imagination about the possibilities of acoustic waves.
Type Of Art Artefact (including digital) 
Year Produced 2017 
Impact This was demonstrated as part of the Star Trek 50th anniversary event, held in the planetarium of the @Bristol science museum in September 2016. Prof. Drinkwater was invited to discuss the science of Star Trek at a live recording of the Cosmic Shed podcast, preceded by a screening of The Wrath of Khan organised by Sunset Cinema. At the end of the recording, Prof. Drinkwater gave the first ever live demo of a sonic tractor beam.
URL https://www.youtube.com/watch?v=0nh2IftOcI0
 
Description To date the project has mostly been exploring new ways in which we can control an acoustic field to enable different sorts of levitation.
a) The team from Bristol (led by Prof. Drinkwater) published multiple articles in Applied Physics Letters, PRL, PNAS and other journals, all of which have been widely reported in the media.
b) The collaborative work with taste and smell researchers (Dr. Marianna Obrist) on food levitation has led to interests from many chefs and beverage companies to explore the use of levitation in novel dining experiences.
Exploitation Route Controlling acoustic fields is crucial in diverse applications such as loudspeaker design, ultrasound imaging and therapy or acoustic particle manipulation. The current approaches use fixed lenses or expensive phased arrays. Our work on controlling complex sound fields using metamaterials paves the way for a range of applications that do not rely on expensive phased arrays.

Our ability to create real-time, dynamically updatable acoustic holograms allows us to manipulate levitated particles. This allows us to create mid-air displays, levitate fabric for animation, and more.

The work will be beneficial to a range of communities from consumer electronics industry to nondestructive testing and medical imaging communities.
Sectors Creative Economy,Digital/Communication/Information Technologies (including Software),Electronics,Manufacturing, including Industrial Biotechnology

 
Description The PI, Co-I and research staff have regularly appeared in national and international media to promote acoustic levitation and explain its relevance to the general public. We appeared on BBC Click live, which was filmed in front of a live audience in early December 2017 and broadcast (across the globe through the BBC World Service) in January 2018, and on Sky News in February 2018. We are currently engaged with arts councils to explore creating an art exhibition based on the outcomes of this project. Finally, we have spun out a company called Metasonics that aims to commercialize the manipulation of sound using acoustic metamaterials.
First Year Of Impact 2017
Sector Creative Economy,Digital/Communication/Information Technologies (including Software),Electronics
Impact Types Societal,Economic

 
Description Royal Academy of Engineering Chair in Emerging Technologies
Amount £1,300,000 (GBP)
Funding ID CIET1718\14 
Organisation Royal Academy of Engineering 
Sector Learned Society
Country United Kingdom
Start 03/2018 
End 03/2028
 
Title Acoustic Levitation - Instructables 
Description The software, hardware and instructions allow any user to build their own acoustic levitator and use acoustic waves to hold samples such as water, ants or tiny electronic components in mid-air. This technology was previously restricted to a handful of research labs (such as ours), but now anyone can build it at home.
Type Of Technology Physical Model/Kit 
Year Produced 2017 
Impact An independent company has started a Kickstarter campaign to exploit our open-access platform. Many companies (such as diamond-mining and beverage companies) are also talking to us about using the system for their own purposes.
URL http://www.instructables.com/id/Acoustic-Levitator/
 
Company Name METASONICS LIMITED 
Description Metasonics technology offers ultra-high fidelity control over sound, giving you the capability to shape, direct and focus soundwaves in real time. Existing directional audio devices are static and cumbersome, making them unsuitable for personalised use. Metasonics employs acoustic metamaterials that can seamlessly adapt the direction and power of sound waves, enabling miniaturisation of the technology and offering the flexibility to do more with sound. 
Year Established 2018 
Impact Too early to report anything yet, but we have many potential leads that we are exploring.
Website https://www.metasonics.co.uk/
 
Description Appeared on National TV 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Sriram Subramanian and Stephen Beckett (the Artist in Residence on this project) were both on BBC Click live that was broadcast internationally on the 7th of January 2018.
Year(s) Of Engagement Activity 2018
 
Description CES 2018 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact We demonstrated acoustic levitation along with acoustic metamaterials at CES 2018.
Year(s) Of Engagement Activity 2018
URL https://techspark.co/metasonics-to-launch-sound-shifting-tech-at-global-tech-conference/
 
Description Interview for National News 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Media (as a channel to the public)
Results and Impact Bruce Drinkwater and Asier Marzo did a piece for Sky News on Acoustic Levitation. Subsequently, Sriram Subramanian went to the Sky News office in London to appear on their 8:45pm news on Feb 20.
Year(s) Of Engagement Activity 2018
 
Description Part of multiple popular TV programs 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Our food levitation system, developed in collaboration with other Sussex researchers, was featured in two international broadcast programmes: A) Have I Got News for You on the BBC (November 2017) and B) Wait Wait... Don't Tell Me on NPR, a US radio show (February 2018).
Year(s) Of Engagement Activity 2018
 
Description Public Engagement as part of New Scientist Live 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact New Scientist invited us to be part of the Royal Academy of Engineering & Disney booth at their New Scientist Live event in September 2017. This was a three-day event in London with about 600 people.
Year(s) Of Engagement Activity 2017
URL https://live.newscientist.com/new-scientist-live-2017/8k2a5452