Eye tracking for avatar eye gaze control during object-focused multiparty interaction in immersive collaborative virtual environments
Steptoe, W., Oyekoya, O., Murgia, A., Wolff, R., Rae, J., Guimaraes, E., Roberts, D. and Steed, A. (2009) Eye tracking for avatar eye gaze control during object-focused multiparty interaction in immersive collaborative virtual environments, in: IEEE Virtual Reality, 14-15 March 2009, Lafayette, Louisiana, USA.
In face-to-face collaboration, eye gaze serves both as a bidirectional signal to monitor and indicate focus of attention and action, and as a resource to manage the interaction. In remote interaction supported by Immersive Collaborative Virtual Environments (ICVEs), embodied avatars representing and controlled by each participant share a virtual space. We report on a study designed to evaluate methods of avatar eye gaze control during an object-focused puzzle scenario performed between three networked CAVE™-like systems. We compare tracked gaze, in which avatars' eyes are controlled by head-mounted mobile eye trackers worn by participants; a gaze model that uses head orientation to inform saccade generation; and static gaze, featuring non-moving eyes. We analyse task performance, subjective user experience, and interactional behaviour. While not providing a statistically significant benefit over static gaze, tracked gaze is observed to be the highest-performing condition. The gaze model, however, resulted in significantly lower task performance and an increased error rate.
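A head-orientation-informed gaze model of the kind the abstract compares against could, in a minimal form, sample saccade targets within a cone centred on the avatar's current head direction. The sketch below is purely illustrative: the function name, cone width, and sampling distribution are assumptions, not the model evaluated in the paper.

```python
import math
import random


def saccade_gaze(head_yaw, head_pitch, cone_deg=15.0, rng=None):
    """Sample a gaze direction (yaw, pitch, in degrees) as a saccade
    landing within a cone centred on the head orientation.

    Illustrative sketch only; the paper's actual saccade-generation
    model is not reproduced here.
    """
    rng = rng or random.Random()
    # Bias saccade amplitude toward small offsets (squaring a
    # uniform sample concentrates mass near zero).
    amplitude = cone_deg * rng.random() ** 2
    # Pick a uniformly random direction for the offset.
    angle = rng.uniform(0.0, 2.0 * math.pi)
    return (head_yaw + amplitude * math.cos(angle),
            head_pitch + amplitude * math.sin(angle))
```

Under this sketch, eyes at rest track the head, and periodic calls to `saccade_gaze` would produce small, head-anchored eye movements, in contrast to the static-gaze condition's non-moving eyes.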
Item Type: Conference or Workshop Item (Paper)
Themes: Subjects outside of the University Themes
Schools: Schools > School of Nursing, Midwifery, Social Work & Social Sciences > Centre for Nursing, Midwifery, Social Work & Social Sciences Research
Date Deposited: 21 Dec 2011 12:07
Last Modified: 29 Oct 2015 00:34