Drumm, IA (ORCID: https://orcid.org/0000-0002-8894-1475) and O'Hare, JJ (ORCID: https://orcid.org/0000-0001-5209-7754), 2016, Aligning audio and visual cues when presenting fast moving sound sources within a multisensory virtual environment, in: ICSV 2016, July 2016, Athens.
PDF - Published Version (436kB): Restricted to repository staff only.
Abstract
This paper will address challenges in aligning audio and visual cues when rendering fast-moving objects within a high-end multi-sensory virtual environment facility which employs 3D stereo visual projection and wave field synthesis. The visual and audio systems are linked via a network connection, and updates from the visual system occur at discrete time intervals. This paper will demonstrate and assess the use of motion prediction strategies for optimally updating dynamic audio scenes independently of the constraints imposed by the visual rendering system and network communication. This work has proven particularly useful for ecologically valid simulations of road traffic, rail and urban soundscapes.
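The abstract does not specify which motion prediction strategies were assessed. A common approach to this problem is linear dead reckoning: the audio renderer estimates each source's velocity from consecutive network updates and extrapolates its position at the (typically faster) audio update rate. The sketch below illustrates that idea only; the class name, the linear prediction model, and the coordinate conventions are illustrative assumptions, not taken from the paper.

```python
import time


class PredictedSource:
    """Dead-reckoning sketch: extrapolate a sound source's position
    between discrete network updates from the visual system.
    This is an assumed, simplified model, not the authors' method."""

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)   # metres, last reported position
        self.velocity = (0.0, 0.0, 0.0)   # metres/second, estimated
        self.last_update = time.monotonic()

    def on_network_update(self, new_position):
        """Called at the visual system's discrete update rate."""
        now = time.monotonic()
        dt = now - self.last_update
        if dt > 0.0:
            # Estimate velocity from consecutive reported positions.
            self.velocity = tuple(
                (new - old) / dt
                for new, old in zip(new_position, self.position)
            )
        self.position = new_position
        self.last_update = now

    def predicted_position(self):
        """Called at the faster audio update rate: linearly
        extrapolate from the last report to the current time."""
        dt = time.monotonic() - self.last_update
        return tuple(
            p + v * dt for p, v in zip(self.position, self.velocity)
        )
```

Under this model the wave field synthesis renderer can query `predicted_position()` on every audio frame, decoupling the audio scene update rate from the visual system's network update interval, which is the decoupling the abstract describes.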
| Item Type | Conference or Workshop Item (Paper) |
|---|---|
| Schools | Schools > School of Computing, Science and Engineering > Salford Innovation Research Centre |
| Depositing User | Dr I Drumm |
| Date Deposited | 10 Mar 2017 10:33 |
| Last Modified | 15 Feb 2022 21:49 |
| URI | https://usir.salford.ac.uk/id/eprint/41566 |