Aligning audio and visual cues when presenting fast moving sound sources within a multisensory virtual environment

Drumm, IA and O'Hare, J 2016, Aligning audio and visual cues when presenting fast moving sound sources within a multisensory virtual environment, in: ICSV 2016, July 2016, Athens.

PDF - Published Version (restricted to repository staff only)

Abstract

This paper addresses the challenges of aligning audio and visual cues when rendering fast-moving objects within a high-end multisensory virtual environment facility that employs 3D stereo visual projection and wave field synthesis. The visual and audio systems are linked via a network connection, and updates from the visual system arrive at discrete time intervals. This paper demonstrates and assesses the use of motion prediction strategies for optimally updating dynamic audio scenes independently of the constraints imposed by the visual rendering system and network communication. This work has proven particularly useful for ecologically valid simulations of road traffic, rail and urban soundscapes.
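The abstract does not detail the specific prediction strategies used, but a minimal dead-reckoning sketch illustrates the general idea of extrapolating a source position between discrete network updates (all function names and values here are hypothetical, not taken from the paper):

```python
def predict_position(last_pos, last_vel, last_update_time, now):
    """Linearly extrapolate (dead-reckon) a moving sound source's
    position from its last known state, so the audio renderer can
    update the scene between discrete visual-system updates."""
    dt = now - last_update_time
    return tuple(p + v * dt for p, v in zip(last_pos, last_vel))

# Hypothetical usage: a visual update arrived at t = 0.0 s placing the
# source at x = 10 m with velocity 30 m/s along x; the audio engine
# renders a frame at t = 0.05 s.
pos = predict_position((10.0, 0.0, 0.0), (30.0, 0.0, 0.0), 0.0, 0.05)
# pos ≈ (11.5, 0.0, 0.0)
```

In practice such a predictor would be corrected (or smoothed) each time a fresh position arrives over the network, trading a small positional error for continuous, low-latency motion in the wave field synthesis renderer.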

Item Type: Conference or Workshop Item (Paper)
Schools: Schools > School of Computing, Science and Engineering > Salford Innovation Research Centre (SIRC)
Depositing User: IA Drumm
Date Deposited: 10 Mar 2017 10:33
Last Modified: 08 Aug 2017 14:50
URI: http://usir.salford.ac.uk/id/eprint/41566
