Eye Catching: Supporting tele-communicational eye-gaze in Collaborative Virtual Environments

Lead Research Organisation: University of Roehampton
Department Name: Psychology

Abstract

Driven by the potential and demands of an increasingly global market, and fed by advances in information and communication technology, one trend in the modern workplace is towards more distributed team working. Distributed working has long been a major topic in computer science, but despite excellent work and the development of highly sophisticated computer-supported cooperative work (CSCW) systems, in many situations there is no substitute for a face-to-face meeting. The consequent demands for travel to meetings have immediate short-term impacts on individuals' quality of life and productivity, and our current reliance on long-distance travel may have more far-reaching and profound implications. It therefore remains very relevant to determine why some CSCW fails and to research technologies that expand the range of situations in which face-to-face meetings can be avoided. Numerous common collaborative scenarios require a more natural way of interacting across a distance. Specifically, we have identified conscious and subconscious communication of attention and emotion as critical elements that make many such scenarios hard to support without eye-gaze. Eye-gaze is a key interactional resource in collaboration, but it is not well supported by today's communication technology; indeed, many have claimed that the inability to faithfully represent eye-gaze is the key failing of current CSCW systems. In today's video-based systems eye-gaze can be maintained in a limited way if the user is willing to look directly at a camera, but this is unnecessarily constraining in a social situation, especially during object- or environment-focussed collaboration.

We propose to evaluate the role of eye-gaze in tele-communication so as to better design future communication technologies. To do this we will build the world's first tele-collaboration system that supports two- and three-way communicational eye-gaze without restricting the gaze direction of participants. We will integrate eye-tracking technologies into Immersive Projection Technology (IPT) displays and develop the software necessary to build a consistent collaborative virtual environment in which each participant can see the others and accurately track their eye-gaze. To prove the utility of this system we will compare it to AccessGrid technology, which provides state-of-the-art video conferencing on large wall displays. Although unable to support communicational eye-gaze between moving participants, AccessGrid does offer advantages in terms of placement within working environments and realism of representation. Comparison between the two approaches will provide valuable insight into the future development of each. Through a series of experiments we will: establish what conditions are necessary and sufficient to support communicational eye-gaze in a tele-communication system; validate the support of eye-gaze in tele-communication by measuring its impact on collaboration; measure the impact of technology approaches and variables; establish when eye-gaze is important; and establish situations where eye-gaze is critical for successful collaboration at a distance.
 
Description Despite the development of highly sophisticated computer-supported cooperative work (CSCW) systems, which aim to allow physically distant participants to interact via telecommunication systems, in many situations there is no substitute for meeting face-to-face. However, the demands of travelling to meetings have a profound impact on people's quality of life and productivity, and our current reliance on long-distance travel may have serious environmental implications. Consequently, it is important to determine why some CSCW systems fail and to research technologies that can reduce the need for face-to-face meetings.

A key issue is that seeing other people's eye-gaze is an important interactional resource: it is involved in communicating focus of attention and emotional states, but it is not well supported by current telecommunication technology. Current video-based systems can support eye-gaze if users are positioned so that they look directly at a camera; this is highly restrictive, especially in interactions that involve moving around an object.



This project evaluated the potential of tele-communication systems using Immersive Projection Technology (IPT) displays in which, rather than seeing video images of each other, participants meet avatars (graphical representations) of one another in a shared virtual space, or Immersive Collaborative Virtual Environment (iCVE). Working closely with computer scientists, we contributed psychological and social scientific expertise to the building and evaluation of an IPT system that integrated eye-tracking technologies to capture each participant's eye-gaze behaviour and reproduce it on an avatar of that person. This is the world's first tele-collaboration system that supports two- and three-way eye-gaze without restricting participants' gaze direction.
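To give a rough sense of the data flow such a system involves, the following is a minimal Python sketch under assumed conventions: the head-relative gaze vector, the 4x4 head-pose matrix and the two-metre fallback fixation distance are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def gaze_to_world(gaze_dir_local, head_pose_world):
    """Rotate a head-relative unit gaze vector into the shared world frame.

    gaze_dir_local  : length-3 unit vector reported by the eye-tracker,
                      expressed relative to the wearer's head.
    head_pose_world : 4x4 homogeneous matrix giving the tracked head pose
                      of that participant in the shared virtual space.
    """
    rotation = np.asarray(head_pose_world)[:3, :3]       # orientation part only
    gaze_world = rotation @ np.asarray(gaze_dir_local, dtype=float)
    return gaze_world / np.linalg.norm(gaze_world)

def fixation_point(eye_origin_world, gaze_world, fallback_distance=2.0):
    """Approximate the fixation point by projecting the gaze ray forward.

    In a full system the ray would be intersected with the shared scene;
    a fixed fallback distance (in metres) is assumed here for illustration.
    """
    return np.asarray(eye_origin_world) + fallback_distance * gaze_world

# Example: a participant looking slightly to their left while their head
# is rotated 90 degrees about the vertical axis in the shared space.
head_pose = np.eye(4)
head_pose[:3, :3] = [[0, 0, 1], [0, 1, 0], [-1, 0, 0]]   # 90-degree yaw
gaze = gaze_to_world([0.2, 0.0, 1.0], head_pose)
print(fixation_point([0.0, 1.6, 0.0], gaze))             # point the avatar's eyes here
```

The resulting fixation point would then be used to orient the eyes of that participant's avatar as seen by the remote parties.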

Initial tests established that observers of an avatar with fixed eyes found it very hard to judge which of a number of objects the person represented by the avatar was looking at; when eye-tracking was added, however, the person's gaze could be judged very accurately. We used another task, in which viewers made judgments about the gaze-direction of a remote participant, to evaluate our IPT-based approach against state-of-the-art video conferencing (with screens and high-definition cameras arranged to provide optimal alignment across the remote spaces). We further tested the performance of the iCVE in an object-manipulation task (in which participants had to use eye-gaze) by comparing different avatar gaze conditions: eye-tracking; eye-gaze modelled on previously collected parameters; and eyes fixed straight ahead.
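For the modelled-gaze condition, one plausible way to drive an avatar statistically is sketched below; it is a hedged illustration only, and the two gaze targets, dwell-time values and sampling scheme are assumptions for demonstration, not the parameters actually collected in the study.

```python
import random

# Illustrative parameters only: mean fixation durations (seconds) and the
# probability of looking at the partner rather than the task object would,
# in a real system, come from previously collected gaze recordings.
MEAN_DWELL = {"partner": 1.2, "object": 2.0}
P_LOOK_AT_PARTNER = 0.4

def sample_gaze_schedule(total_time):
    """Generate (target, duration) fixations for an avatar whose gaze is
    modelled statistically rather than driven by a live eye-tracker."""
    schedule, t = [], 0.0
    while t < total_time:
        target = "partner" if random.random() < P_LOOK_AT_PARTNER else "object"
        duration = random.expovariate(1.0 / MEAN_DWELL[target])
        schedule.append((target, round(duration, 2)))
        t += duration
    return schedule

# Example: a 30-second stretch of modelled gaze behaviour.
print(sample_gaze_schedule(30.0))
```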

In addition to analyzing the data arising from these studies, in particular examining the organization of interactions in these experiments and comparing it to previously identified practices, we also collected further samples of face-to-face interactional behaviour from a variety of settings in order to better understand how eye-gaze is used.



Key findings were:

- Video conferencing allows gaze-direction to be judged accurately when the position of participants is constrained; the iCVE system enables this without that constraint.

- Video technology and iCVE systems have complementary strengths and weaknesses: the former faithfully communicates a person's appearance, the latter faithfully represents where they are looking.

- Eye-tracking in an iCVE improves people's ability to judge gaze-direction in some but not all circumstances; in some situations head-orientation is sufficient.

- Avatar appearance, including accurate representation of eye movement, is an important determinant of the success of these systems.

- In the iCVE, when difficulties in judging others' gaze-direction arise, parties use change of position as a resource, exploiting the fact that they share a virtual space.

- In the iCVE, certain interactional gazing practices cannot be simulated but need to be captured and reproduced.
Exploitation Route The idea of using graphical representations of remote participants' appearance that show each participant's eye-gaze has a number of applications (telecommunication, training) and could be developed commercially. This research contributes to our understanding of human interaction in face-to-face settings and in tele-mediated settings. The main implications are for the design of telecommunication systems. The research demonstrates the value of immersive projection technologies in telecommunication settings. It also suggests that giving participants at remote sites visible access to how participants at other sites are moving with respect to objects within their site can be important.
Sectors Digital/Communication/Information Technologies (including Software)

 
Description Avanti Communications Limited 
Organisation Avanti Communications
Country United Kingdom 
Sector Private 
Start Year 2006
 
Description Electrosonic Ltd 
Organisation Electrosonic
Country United Kingdom 
Sector Private 
Start Year 2006
 
Description Silicon Graphics Inc 
Organisation Silicon Graphics Inc
Country United States 
Sector Private 
Start Year 2006
 
Description VISUAL ACUITY LIMITED 
Organisation Visual Acuity Ltd
Country United Kingdom 
Sector Private 
Start Year 2006
 
Description "Interacting with things" 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Primary Audience Postgraduate students
Results and Impact Research seminar presentation, Department of Education, University of Roehampton.
Year(s) Of Engagement Activity 2009
 
Description "Passing stuff: How do humans accomplish manual object transfers?" 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Primary Audience Postgraduate students
Results and Impact Departmental Research Presentation, Department of Psychology, University of Staffordshire.
Year(s) Of Engagement Activity 2010
 
Description Data Analysis Workshop 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Data Analysis Workshop.
Year(s) Of Engagement Activity 2011
 
Description Hands, objects and courses of action: How handshape in handling objects can be interactionally relevant 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience
Results and Impact Abstract. In road traffic, a vehicle's position and orientation are major cues to the driver's intention (or lack of intention). As such, what someone is doing with an object can inform others about how they might participate in the unfolding course of action that that person is engaged in. Unlike the control of a vehicle, in the case of objects that are literally handled (that is, objects that are manipulated manually, by the hands), properties of the hands themselves, such as finger positions, can become relevant (Streeck, 2009). In such settings, it is not just that hands reflect key aspects of what a speaker is doing (McNeill, 1992) but that the properties of an object (e.g. its shape or position) can become a field within which the hand(s) operate (Goodwin, 2000). Whilst previous research has examined how participants' talk, gaze and gestures can all be relevant for co-participants (e.g. Bolden, 2003), the focus of the present analysis is on manual configurations specifically.

The present paper aims to contribute to our understanding of how the position of the hands, in handling objects, can make available to others what an agent is doing, more specifically, how features of the course of action that an agent is engaged in can become available to others. We use Conversation Analysis to examine the social organization of manual object handling in video-recordings of several hours of naturally-occurring domestic interactions, in particular involving adult human participants engaged in the assembly of flat-pack furniture. We show how configurations of the hand, such as finger positions, are oriented to by co-participants, in particular how they can make available to them the course of action that an agent is engaged in. We discuss the relevance of the visibility of hand-object configurations in telecommunication and virtual reality applications.

This is a presentation at an academic conference.
Year(s) Of Engagement Activity 2010
 
Description Passing stuff: How do humans accomplish manual object transfers? 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Presentation to the Social Psychology and Microsociology (SPAM) group, University of Wisconsin-Madison.
Year(s) Of Engagement Activity 2011
 
Description Passing stuff: How do humans accomplish manual object transfers? 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience
Results and Impact The transfer of objects from one party to another is a routine human activity, and an understanding of core aspects of it appears early in life (Lerner and Zimmerman, 2003; Wootton, 1994). In addition to the perceptual and kinaesthetic skills involved, the passing of objects presents a coordination problem: parties' bodies must be brought into appropriate positions with respect to each other, and objects must be released and secured so that they are not dropped or their contents spilled.



This paper uses Conversation Analysis to examine the social organization of manual object transfers in video-recordings of several hours of naturally-occurring domestic interactions. These mainly involve adult participants engaged in the assembly of flat-pack furniture and include some food-preparation and mealtime interactions.



We show that manual object transfers have a distinctive phase-structure that participants orient to (Sacks and Schegloff, 2002; Lerner and Raymond, 2008), involving preparation for transfer, the transfer itself and a post-transfer phase. Whilst the transfer itself involves the non-verbal actions of releasing and taking the object being passed, it routinely occurs with concurrent talk. We examine how talk from the recipient of the object can play a crucial role in the accomplishment of the transfer.

This is a presentation at an academic conference.
Year(s) Of Engagement Activity 2010
 
Description Showing how the work is done in art studio instructional interactions 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience
Results and Impact Instruction in craft practice makes extensive use of gesture in order to demonstrate the practical and aesthetic possibilities of hands and minds, tools and materials. The present report focuses on instruction concerning a specific feature of manual work, namely the progressive transformation of materials. Ingold has referred to "the processional character of tool use" (Ingold, 2006) and, in an account that reflects on his own practice, articulates how the conduct of a basic action (sawing a plank) involves and requires different sawing actions at different phases of the job as the cut progresses. The sites where craft skills are developed characteristically involve learning through observing or co-participating with expert practitioners (Marchand, 2008; Sennett, 2008) or through forms of guided participation in activities (Ekström, Lindwall, & Säljö, 2009). Drawing on Conversation Analysis to examine the sequential organization of actions in video-recordings of 16 classes in a printmaking studio and a metalworking studio, this paper shows how instructors use gesture in coordination with other resources (such as talk and the manipulation of tools) to display the progressive character of different jobs. In particular, the analysis shows how bodily conduct and facial expressions are used to show effort and care during different phases of procedures. Instructors reveal a concern with getting students to appreciate what should be done, the manner in which it should be done (e.g. what degree of force or care is required), what should be looked for (or otherwise sensed), and what might be found. The paper concludes by discussing how instruction in this configuration of mindful action, which can accommodate and control progressively unfolding work, is fundamentally dependent on the coordination of gestural resources.

This is a research presentation at an academic conference.
Year(s) Of Engagement Activity 2012
 
Description Tele-mediated social interaction in Videoconferencing and in an Immersive Collaborative Virtual Environment. 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience
Results and Impact The very first words reputed to have been heard over a telephone system spoke to the participants' physical separation. (Bell to his assistant: "Mr. Watson, come here, I want you.") The development of telecommunication systems that enable participants to see, as well as hear, each other seeks to advance on the telephone. However, the well-known problems of spatial alignment in videoconferencing can have the effect of making participants' spatial separation apparent. Whilst telephone talk can constitute and take place within a shared virtual aural space, in videoconferencing participants usually look out from their own - and gaze into each other's - separate spaces. An alternative means of supporting tele-mediated interaction is the use of virtual reality to create a shared virtual space in which participants are represented by avatars. This paper reports on the Eye-Catching project, in which participants interact in an Immersive Collaborative Virtual Environment (ICVE) where, for the first time, their eye-gaze is captured by eye-trackers mounted in the stereo glasses worn in the ICVE and displayed by their avatars. We compare three-way interaction within this environment with interactions in an AccessGrid videoconferencing environment. We examine video-recordings from head-mounted cameras that show each participant's view, with eye-tracking used to show their point of gaze within that view. Using conversation analysis we examine the management of turn-taking and gaze distribution in these environments. Firstly, we discuss the analytical impact of working with eye-tracking data. Secondly, we assess the possible limitations and/or advantages of these different technologies for supporting tele-mediated social interaction. Thirdly, we discuss how participants' understanding of space becomes evident in their conduct.

This is a presentation at an academic conference.
Year(s) Of Engagement Activity 2008
 
Description Transformations of gazing practices in the trajectory of humorous talk in a co-present, object-focused, multiparty task. 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience
Results and Impact Abbreviated Abstract: The organization of talk with respect to concurrent activities undertaken by co-present participants is little understood. Whilst the business of some settings is fundamentally accomplished through talk, in other settings it is the manipulation of objects that comprises the business at hand. Although talk may become relevant - indeed necessary - on occasion in such settings (e.g. in order to secure the coordination of actions), non-task-related talk is also a possibility. Such talk appears to be highly important for the character of the setting and for the management of the relationship between participants, and it is often of a humorous nature. In this paper we examine the conduct of three people engaged in the assembly of an item of flat-pack furniture.

We show that a recurrent type of humorous talk involves:

(a) the occurrence of a task-related event

(b) humorous talk that retrospectively targets that event

(c) responses to the humorous talk that display an understanding of the humorous talk.



Consequently, this type of talk, though not directly task-related, is connected with the concurrent task in two ways: firstly, it is rooted in task-related events; secondly, it thematizes features of the current situation.

Such talk allows for transformations of the local participation framework, for example providing opportunities for participants who may be engaged in individual work on sub-tasks to engage or interact with other participants. Producers of humorous remarks regularly monitor their recipients by gazing at them. We discuss the importance of humorous talk within such settings and how it creates opportunities for affiliation through nonverbal displays of engagement and shared understanding.

This is a presentation at an academic conference.
Year(s) Of Engagement Activity 2009
 
Description Vocal and visible displays of stance in object-centred interactions. 
Form Of Engagement Activity Scientific meeting (conference/symposium etc.)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience
Results and Impact (Extracted from the Abstract:)

One fundamental locus for language use is in the context of handling objects. But the manipulation of objects is rarely value-free; rather, it is saturated with matters of taste. Consequently, displays of stance and affect often occur in the course of projects that involve making or moving things, and participants can be seen to use a range of linguistic resources to express such things as approval/disapproval and enthusiasm/reticence (Ochs & Schieffelin, 1989). The present study aims to contribute to such work by further examining both family and peer interactions in which participants are concerned with tasks that involve the handling of objects. The aim is to answer the following question: how, and in what ways, do people working together on the transformation of materials display stance and affect as part of organizing themselves to get that work done? This paper uses Conversation Analysis (CA) to examine video-recordings of interactions in which participants are concerned with doing things with material objects. The data involve interactions with objects that are being made by the participants (e.g. cooking or assembling furniture). The analysis shows how participants' values and asymmetries in participants' knowledge come to the surface, focusing on (1) how advice- and instruction-giving can occur as situated practices and how their delivery is interactionally organized, and (2) how stance and affect are manifested in these activities. The analysis shows how multimodal resources are implicated in these activities, in particular how the positioning of objects and persons can be such that participants' analysis of something can be literally a matter of their stance and other visible conduct. In conclusion, we discuss the relevance of Conversation Analysis for understanding human sociality, in particular how sensitivities, values and methods are passed on, shared or managed.

This is a presentation at an academic conference.
Year(s) Of Engagement Activity 2011