GAIME: Gestural and Audio Interactions for Mobile Environments

Lead Research Organisation: University of Glasgow
Department Name: School of Computing Science

Abstract

Most PDAs and smartphones have sophisticated graphical interfaces and commonly use small keyboards or styli for input. The range of applications and services for such devices is growing all the time. However, several problems make interaction difficult when a user is on the move. Many applications demand visual attention that may not be available in mobile contexts. Oulasvirta et al. [29] showed that attention can become very fragmented for users on the move, as it must shift between navigating the environment and the device, making interaction hard. Our own research has shown that performance may drop by more than 20% when users are mobile [4]. Another important issue is that most devices require the hands to operate many of the applications, and these may not be available if the user is carrying bags, holding on to children or operating machinery, for example. The novel aspect of this proposal is to reduce the reliance on graphical displays and hands by investigating gesture input from other locations on the body combined with three-dimensional sound for output.

Little work has gone into making input and control hands-free for mobile users. Speech recognition is still problematic in such settings due to its high processing requirements and the dynamic audio environments in which devices are used. Much of the research on gesture input still uses the hands for making the gestures. There is some work on head-based input, often for users with disabilities [26], but little of this has been used in mobile settings. Our own previous work has begun to examine head pointing and showed that it might be a useful way to point and select on the move [3].

Many other body locations could be useful for subtle and discreet input whilst mobile (e.g., users walking or sitting on a bumpy train). For example, wrist rotation has potential for controlling a radial menu, as the wrist can be rotated to move a pointer across the menu; it is unobtrusive and could be tracked using the same sensor used for hand pointing gestures (in a watch, for example). Small changes in gait are also a possibility for interaction. In previous work [12] we extracted gait information from an accelerometer on a PDA to look at usability errors. We can adapt this technique so that users could slightly change the timing of a step to make input. There has been no systematic study of the different input possibilities across the body. We will develop a novel testing methodology using a Fitts' law analysis along with more subjective measures to find out which body locations are most useful for input on the move (a sketch of one such wrist-to-menu mapping is given below).
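As an illustration of how the wrist-rotation idea could drive a radial menu, the following sketch maps a roll angle from a hypothetical wrist-worn inertial sensor onto menu sectors. The sensor interface, the comfortable rotation range and the dwell-based selection are assumptions made for illustration, not details from the proposal.

```python
import math

# Illustrative sketch only. Assumptions (not specified in the proposal):
# the wrist sensor reports roll in degrees, a comfortable rotation range of
# +/-60 degrees is used, and an item is selected by dwelling on it for 500 ms.

ROLL_RANGE_DEG = (-60.0, 60.0)   # assumed comfortable wrist rotation range
DWELL_TIME_S = 0.5               # assumed dwell time for selection


def roll_to_item(roll_deg, n_items):
    """Map a wrist roll angle onto one of n_items radial menu sectors."""
    lo, hi = ROLL_RANGE_DEG
    clamped = max(lo, min(hi, roll_deg))
    fraction = (clamped - lo) / (hi - lo)          # 0..1 across the range
    return min(int(fraction * n_items), n_items - 1)


class DwellSelector:
    """Select a menu item once the pointer has rested on it long enough."""

    def __init__(self, n_items, dwell_s=DWELL_TIME_S):
        self.n_items = n_items
        self.dwell_s = dwell_s
        self.current = None
        self.entered_at = None

    def update(self, roll_deg, t):
        item = roll_to_item(roll_deg, self.n_items)
        if item != self.current:
            self.current, self.entered_at = item, t
            return None
        if t - self.entered_at >= self.dwell_s:
            self.entered_at = t                    # avoid repeated firing
            return item                            # item selected
        return None


# Usage with made-up sensor samples (roll in degrees, timestamp in seconds):
if __name__ == "__main__":
    selector = DwellSelector(n_items=4)
    for roll, t in [(-50, 0.0), (-48, 0.2), (-47, 0.6), (30, 0.8), (31, 1.4)]:
        chosen = selector.update(roll, t)
        if chosen is not None:
            print(f"selected item {chosen} at t={t}s")
```

The same structure would apply to other body locations (e.g., a deliberately delayed step detected from gait), with only the mapping from sensor signal to pointer position changing.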
Output is also a problem because of the load on visual attention when users are mobile. We and others have begun to look at the use of spatialised audio cues for output when mobile, as an alternative or complement to graphics [1, 6, 19, 32]. Many of these use very simple 3D audio displays but, with careful design, spatial audio could provide a much richer display space. Our AudioClouds project built some foundations for 3D audio interactions, investigating basic pointing behaviour, target size and separation [1, 3]. We now need to take this work forward and develop more sophisticated interactions. Key aspects here are to develop the use of egocentric (fixed to the user) and exocentric (fixed to the world) displays, and how they can be combined to create a rich 3D display space for interaction (a sketch of this exocentric-to-egocentric mapping is given after the abstract).

The final key part of this project is to create compelling applications which combine the best of the audio and gestures. We can then test these with users in more realistic settings over longer time periods to fine-tune how these interactions work in the real world.
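To make the egocentric/exocentric distinction concrete, the sketch below converts a world-fixed (exocentric) sound source into a head-relative (egocentric) direction of the kind a spatial audio renderer could pan. The flat 2D world, the listener pose fields and the crude pan law are simplifying assumptions for illustration rather than the project's implementation.

```python
import math

# Illustrative only: a world-fixed (exocentric) source is converted into a
# head-relative (egocentric) azimuth and distance, which is what a binaural
# or stereo-panning renderer typically needs. A flat 2D world and a listener
# pose of (x, y, heading) are simplifying assumptions for this sketch.


def exocentric_to_egocentric(source_xy, listener_xy, listener_heading_deg):
    """Return (azimuth_deg, distance) of a world-fixed source relative to the
    listener's head. Azimuth 0 = straight ahead, positive = to the right."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = math.hypot(dx, dy)
    bearing_deg = math.degrees(math.atan2(dx, dy))        # world bearing, 0 = north
    azimuth = (bearing_deg - listener_heading_deg + 180) % 360 - 180
    return azimuth, distance


def simple_pan(azimuth_deg):
    """Crude stereo pan law for illustration: -1 (hard left) .. +1 (hard right)."""
    return max(-1.0, min(1.0, math.sin(math.radians(azimuth_deg))))


if __name__ == "__main__":
    # A landmark fixed in the world stays put while the user turns their head,
    # so its egocentric azimuth (and hence its panning) changes with heading.
    landmark = (10.0, 10.0)
    listener = (0.0, 0.0)
    for heading in (0.0, 45.0, 90.0):
        az, dist = exocentric_to_egocentric(landmark, listener, heading)
        print(f"heading {heading:5.1f} deg -> azimuth {az:6.1f} deg, pan {simple_pan(az):+.2f}")
```

An egocentric display would skip the transformation and place sources directly in head-relative coordinates, so the two display types differ only in which frame the source positions are specified in.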

Publications


Hoggan E (2009) Audio or tactile feedback

Vazquez-Alvarez Y (2011) Auditory display design for exploration in mobile audio-augmented reality in Personal and Ubiquitous Computing

Vazquez-Alvarez Y (2015) Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality in ACM Transactions on Computer-Human Interaction

Vazquez-Alvarez Y (2011) Eyes-free multitasking

Williamson J (2010) Social gravity

 
Description We developed new methods that allow mobile users to interact without the use of their eyes or hands.
Exploitation Route We provided a set of 3D audio and gesture-based interaction techniques that others could use to enrich mobile interactions.
Sectors Digital/Communication/Information Technologies (including Software)

URL http://www.gaime-project.org/
 
Description We collaborated with Nokia on the project. They provided some funding and hosted a student for an internship. We developed a range of new interaction techniques for many different mobile settings.
First Year Of Impact 2010
Sector Digital/Communication/Information Technologies (including Software)
Impact Types Economic

 
Description HP Labs
Amount £7,000 (GBP)
Funding ID HP project 
Organisation HP Laboratories 
Sector Private
Country United States
Start 01/2010 
End 12/2011
 
Description Nokia
Amount £40,007 (GBP)
Funding ID Nokia university donations 
Organisation Nokia 
Sector Private
Country Global
Start  
 
Description Nokia
Amount £70,000 (GBP)
Funding ID 1/2 funded PhD studentship 
Organisation Nokia 
Sector Private
Country Global
Start 03/2010 
End 12/2013
 