Multimodal, Negotiated Interaction in Mobile Scenarios

Lead Research Organisation: University of Glasgow
Department Name: School of Computing Science

Abstract

Our proposal is to investigate an alternative means of allowing users to interact with content and services in their environment such that the actions they make, movements, gestures, etc., and feedback they receive are continuous, with the user and system negotiating their interactions in a fluid, dynamic way. We believe the appropriate comparison would be dancing, rather than the current command & control metaphor. When someone dances with a partner there is a soft ebb and flow of control; sometimes one person leads, sometimes the other, this changing fluidly as they dance. We are proposing a similar interaction between a user and computer, where sometimes the user leads and at other times the computer, according to the context of the interaction. This contrasts with most current approaches, where one agent, be it the human or the computer, pre-empts the other and where most interaction is driven by events and proceeds to varying degrees in rigid, over-specified ways.

Example scenario: Exploring a Digitally Enriched Environment

An example of how this approach could be used is that of location-aware information acquisition while walking in a town centre. You might feel a 'tick' on your phone's vibration motor, making you aware that there is information available about something in your environment. Your rich context-understanding abilities would tell you how likely this 'tick' was to be of interest; if you ignore the cue and walk on, the negotiation would end there and then. If you are curious, you might gesture with the phone at likely targets in your surroundings, and get a response from several of them. If you are further intrigued, you may continue to interact with these potential targets, possibly moving from the vibro-tactile to an audio display, gaining information by an active exploration of the environment, something we have evolved to do naturally.
The user explores the possibilities in the situation by directly engaging (probing or playing) with it, being able to move at will through the space of possibilities, gaining more and more insight during the interaction. The multimodal feedback provided encodes both the system's current interpretation of the user's intention (e.g. moving towards a target) and the probability of the target meeting the user's needs. After working through combinations of vibration and audio, if the joint dynamics of information source and user continue to intertwine, the display of the mobile device might be used for full details. This example shows a 'schedule' of modalities, and illustrates the negotiation process in a practical and commercially interesting example.
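The modality 'schedule' described above can be sketched as a simple escalation loop: feedback begins with a vibrotactile tick, moves through audio, and reaches the full visual display only while the user keeps engaging, with either party free to end the negotiation at any step. This is an illustrative sketch only; the function and modality names are assumptions, not taken from the project's software.

```python
# Illustrative sketch of the negotiated modality schedule: feedback
# escalates (vibration -> audio -> visual) one step per positive
# engagement signal, and the negotiation ends on the first refusal.

MODALITIES = ["vibration", "audio", "visual"]

def negotiate(engagement_signals):
    """Walk the modality schedule for a stream of user engagement
    signals (True = user probes further, False = user walks on).
    Returns the list of modalities actually presented."""
    used = []
    level = 0
    for engaged in engagement_signals:
        used.append(MODALITIES[level])   # present feedback at the current level
        if not engaged:                  # user disengages: negotiation ends
            break
        # user probes further: escalate, capped at the richest modality
        level = min(level + 1, len(MODALITIES) - 1)
    return used
```

For example, a user who ignores the initial tick sees only `["vibration"]`, while one who keeps probing is carried through the full schedule, mirroring the ebb and flow of control described in the scenario.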

Publications

Murray-Smith R (2008) Fun and Games

Murray-Smith R (2009) Empowering People Rather Than Connecting Them in International Journal of Mobile Human Computer Interaction

Robinson S (2009) Exploring casual point-and-tilt interactions for mobile geo-blogging in Personal and Ubiquitous Computing

Robinson S (2009) Evaluating haptics for information discovery while walking in Proceedings of the British HCI Group Annual Conference on People and Computers

Strachan S (2009) Location and Context Awareness

Strachan S (2008) Bearing-based selection in mobile spatial interaction in Personal and Ubiquitous Computing

 
Description We investigated alternative means of allowing users to interact with content and services in their environment such that the actions they make, movements, gestures, etc., and feedback they receive are continuous, with the user and system negotiating their interactions in a fluid, dynamic way. We believe the appropriate comparison would be dancing, rather than the current command & control metaphor. When someone dances with a partner there is a soft ebb and flow of control; sometimes one person leads, sometimes the other, this changing fluidly as they dance. We proposed a similar interaction between a user and computer, where sometimes the user leads and at other times the computer, according to the context of the interaction. This contrasts with most current approaches, where one agent, be it the human or the computer, pre-empts the other and where most interaction is driven by events and proceeds to varying degrees in rigid, over-specified ways.

We examined a range of scenarios focussed around digital-physical navigation and collaboration, which are described in several published papers (e.g. 'I Did It My Way': Moving Away from the Tyranny of Turn-by-Turn Pedestrian Navigation, and Social Gravity: A Virtual Elastic Tether for Casual, Privacy-Preserving Pedestrian Rendezvous). We also developed novel approaches to touch interaction which use mathematical models to allow richer interaction from the whole finger pose, rather than just the tip of the finger (see AnglePose: robust, precise capacitive touch tracking via 3D orientation estimation). This allows you to use the style of finger-touch to control the interface, effectively allowing your finger to act as a joystick anywhere on the screen. We also developed a library of software that allows richer audio synthesis on mobile phones such as the iPhone, in order to allow richer interaction in future systems.
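To give a flavour of how whole-finger pose can be recovered from a touch sensor: a contact patch on a capacitive screen is an elongated blob whose principal axis reflects the finger's orientation. The sketch below estimates only the yaw angle from the second moments of the blob's sample points; this is a much-simplified stand-in for the full 3D orientation estimation in AnglePose, and the function name and approach are illustrative assumptions, not the published method.

```python
import math

def finger_yaw(points):
    """Estimate a finger's yaw angle (radians) from the 2D sample
    points of a capacitive touch blob, as the principal-axis angle
    of the points' covariance matrix. Simplified illustration only."""
    n = len(points)
    mx = sum(x for x, _ in points) / n           # blob centroid
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n   # second moments
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    # Orientation of the dominant eigenvector of [[sxx, sxy], [sxy, syy]]
    return 0.5 * math.atan2(2 * sxy, sxx - syy)
```

A blob elongated along the x-axis yields a yaw near zero, and one elongated along the y-axis yields a yaw near pi/2; tracking this angle over time is what lets the finger act like a joystick anywhere on the screen.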

Our work has been disseminated through: top-quality, peer-reviewed journal and conference publications; formal collaboration and secondments with leading industrial research and development organisations; licensing of software; print and broadcast media; and, a large number of invited talks and demonstrations.
Sectors Digital/Communication/Information Technologies (including Software); Leisure Activities, including Sports, Recreation and Tourism; Culture, Heritage, Museums and Collections; Security and Diplomacy; Transport

URL http://www.negotiatedinteraction.com
 
Description The work on 'Stane' audio input technology fed into a new research project by Prof Jones, which used the idea as a tap input option on entry-level mobile phones in developing countries. It also fed into new audio/touch input approaches in ubiquitous computing settings. The work on pedestrian navigation had an impact on a major mobile phone manufacturer's way of thinking about pedestrian navigation with mobile phones, and later led to alternative approaches for mobile devices without GPS being used in developing countries. The multimodal approach to interaction made it into a mobile phone product of one of the world's largest mobile phone manufacturers, supported by IP licensing from the University of Glasgow.
First Year Of Impact 2010
Sector Digital/Communication/Information Technologies (including Software)
Impact Types Societal

 
Description Nokia Devices
Amount £142,000 (GBP)
Organisation Nokia 
Sector Private
Country Global
Start  
 
Description Nokia Devices R&D
Amount £13,000 (GBP)
Organisation Nokia 
Sector Private
Country Global
Start  
 
Description Orange/FT Research labs
Amount £59,000 (GBP)
Organisation Orange France Telecom 
Department Orange France Telecom Research labs
Sector Private
Country France
Start  
 