Steptoe, W, Oyekoya, O, Murgia, A, Wolff, R, Rae, J, Guimaraes, E, Roberts, DJ and Steed, A 2009, Tracking for avatar eye-gaze control during object-focused multiparty interaction in immersive collaborative virtual environments, in: IEEE Virtual Reality, 14-15 March 2009, Lafayette, Louisiana, USA.
In face-to-face collaboration, eye gaze serves both as a bidirectional signal to monitor and indicate focus of attention and action, and as a resource for managing the interaction. In remote interaction supported by immersive collaborative virtual environments (ICVEs), embodied avatars representing, and controlled by, each participant share a virtual space. We report on a study designed to evaluate methods of avatar eye-gaze control during an object-focused puzzle scenario performed between three networked CAVE™-like systems. We compare tracked gaze, in which avatars' eyes are controlled by head-mounted mobile eye trackers worn by participants; a gaze model informed by head orientation for saccade generation; and static gaze featuring non-moving eyes. We analyse task performance, subjective user experience, and interactional behaviour. While not providing a statistically significant benefit over static gaze, tracked gaze is observed to be the highest-performing condition. The gaze model, however, resulted in significantly lower task performance and an increased error rate.