
Aligning audio and visual cues when presenting fast moving sound sources within a multisensory virtual environment

Drumm, IA; O'Hare, JJ

Authors

IA Drumm

JJ O'Hare



Abstract

This paper addresses challenges in aligning audio and visual cues when rendering fast-moving objects within a high-end multisensory virtual environment facility that employs 3D stereo visual projection and wave field synthesis. The visual and audio systems are linked via a network connection, and updates from the visual system arrive at discrete time intervals. The paper demonstrates and assesses motion prediction strategies that allow dynamic audio scenes to be updated independently of the constraints imposed by the visual rendering system and the network communication. This work has proven particularly useful for ecologically valid simulations of road traffic, rail and urban soundscapes.
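The record does not include code, but the approach described in the abstract resembles dead reckoning: the audio renderer keeps the last received source position and an estimated velocity, and extrapolates between discrete network updates so it can refresh the scene at its own rate. The sketch below is a minimal illustration of that kind of motion-prediction strategy, not the authors' implementation; all names (e.g. SourcePredictor) are hypothetical.

    class SourcePredictor:
        """Dead-reckoning predictor for one moving sound source.

        Stores the most recent position update received over the network,
        estimates velocity from consecutive updates, and linearly
        extrapolates the position at any later query time, so the audio
        renderer can update the scene independently of the network rate.
        """

        def __init__(self):
            self._last_pos = None               # (x, y, z) from most recent update
            self._velocity = (0.0, 0.0, 0.0)    # estimated m/s per axis
            self._last_time = None              # timestamp of most recent update

        def on_network_update(self, pos, timestamp):
            """Call whenever the visual system sends a new source position."""
            if self._last_pos is not None and timestamp > self._last_time:
                dt = timestamp - self._last_time
                self._velocity = tuple(
                    (p - q) / dt for p, q in zip(pos, self._last_pos))
            self._last_pos = pos
            self._last_time = timestamp

        def predict(self, query_time):
            """Extrapolated position at query_time (e.g. the audio frame time)."""
            if self._last_pos is None:
                return None
            dt = query_time - self._last_time
            return tuple(p + v * dt
                         for p, v in zip(self._last_pos, self._velocity))


    # Example: visual updates arrive every 100 ms, audio scene updates every 10 ms.
    predictor = SourcePredictor()
    predictor.on_network_update((0.0, 0.0, 0.0), timestamp=0.0)
    predictor.on_network_update((3.0, 0.0, 0.0), timestamp=0.1)   # ~30 m/s source
    print(predictor.predict(0.15))  # -> (4.5, 0.0, 0.0), between visual updates

A real system would also need to handle out-of-order or dropped network packets and smooth the correction when a new update contradicts the prediction; this sketch only shows the basic extrapolation step.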

Citation

Drumm, I., & O'Hare, J. (2016, July). Aligning audio and visual cues when presenting fast moving sound sources within a multisensory virtual environment. Presented at ICSV 2016, Anthens

Presentation Conference Type Other
Conference Name ICSV 2016
Conference Location Athens
Start Date Jul 1, 2016
Publication Date Jul 1, 2016
Deposit Date Mar 10, 2017
Publisher URL http://www.iiav.org/index.php?va=congresses
Additional Information Event Type: Conference
