
Eye tracking for avatar eye gaze control during object-focused
multiparty interaction in immersive collaborative virtual environments

Steptoe, W; Oyekoya, O; Murgia, A; Wolff, R; Rae, J; Guimaraes, E; Roberts, D; Steed, A


Abstract

In face-to-face collaboration, eye gaze serves both as a bidirectional signal to monitor and indicate focus of attention and action, and as a resource for managing the interaction. In remote interaction supported by Immersive Collaborative Virtual Environments (ICVEs), embodied avatars representing and controlled by each participant share a virtual space. We report on a study designed to evaluate methods of avatar eye gaze control during an object-focused puzzle scenario performed between three networked CAVE™-like systems. We compare tracked gaze, in which avatars' eyes are controlled by head-mounted mobile eye trackers worn by participants, with a gaze model informed by head orientation for saccade generation, and with static gaze featuring non-moving eyes. We analyse task performance, subjective user experience, and interactional behaviour. While not providing a statistically significant benefit over static gaze, tracked gaze was observed to be the highest-performing condition. However, the gaze model resulted in significantly lower task performance and an increased error rate.

Citation

Steptoe, W., Oyekoya, O., Murgia, A., Wolff, R., Rae, J., Guimaraes, E., Roberts, D., & Steed, A. (2009). Eye tracking for avatar eye gaze control during object-focused multiparty interaction in immersive collaborative virtual environments. Presented at IEEE Virtual Reality, Lafayette, Louisiana, USA.

Presentation Conference Type: Other
Conference Name: IEEE Virtual Reality
Conference Location: Lafayette, Louisiana, USA
Start Date: Mar 14, 2009
End Date: Mar 15, 2009
Publication Date: Jan 1, 2009
Deposit Date: Dec 21, 2011
Additional Information: Event Type: Conference


