
User-guided rendering of audio objects using an interactive genetic algorithm

Wilson, AD; Fazenda, BM

Authors

AD Wilson



Abstract

Object-based audio allows for personalisation of content, perhaps to improve accessibility
or to increase quality of experience more generally. This paper describes the design and evaluation
of an interactive audio renderer, which is used to optimise an audio mix based on the
feedback of the listener. A panel of 14 trained participants was recruited to trial the system.
The range of audio mixes produced using the proposed system was comparable to the range
of mixes achieved using a traditional fader-based mixing interface. Evaluation using the System
Usability Scale showed a low level of physical and mental burden, making this a suitable
interface for users with visual and/or mobility impairments.
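
For illustration only, the sketch below shows the general shape of an interactive genetic algorithm applied to mixing: a candidate mix is a vector of per-object gains, and the rating function stands in for the listener's feedback. This is not the authors' implementation; the object count, population size, mutation step, and the simulated rating are all illustrative assumptions. In a genuinely interactive system, rate_mix() would be replaced by ratings collected from the listener in each generation.

    import random

    # Sketch of an interactive genetic algorithm (IGA) for mixing audio objects.
    # All parameters below are assumptions for illustration, not the paper's values.
    N_OBJECTS = 4        # e.g. dialogue, music, effects, ambience (assumed)
    POP_SIZE = 8         # candidate mixes auditioned per generation (assumed)
    MUTATION_STD = 0.1   # gain perturbation in normalised units (assumed)

    def random_mix():
        """A candidate mix: one gain per audio object, normalised to [0, 1]."""
        return [random.random() for _ in range(N_OBJECTS)]

    def rate_mix(mix):
        """Stand-in for the listener's rating: preference for a fixed hypothetical balance."""
        target = [0.9, 0.5, 0.4, 0.3]  # hypothetical preferred balance (assumed)
        return -sum((g - t) ** 2 for g, t in zip(mix, target))

    def crossover(a, b):
        """Uniform crossover: each gain is inherited from one of the two parent mixes."""
        return [random.choice(pair) for pair in zip(a, b)]

    def mutate(mix):
        """Small Gaussian perturbation of each gain, clipped to [0, 1]."""
        return [min(1.0, max(0.0, g + random.gauss(0, MUTATION_STD))) for g in mix]

    def evolve(generations=10):
        population = [random_mix() for _ in range(POP_SIZE)]
        for _ in range(generations):
            # The listener (here simulated) rates each candidate mix.
            scored = sorted(population, key=rate_mix, reverse=True)
            parents = scored[: POP_SIZE // 2]  # keep the preferred half
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP_SIZE - len(parents))]
            population = parents + children
        return max(population, key=rate_mix)

    if __name__ == "__main__":
        best = evolve()
        print("Preferred mix (per-object gains):", [round(g, 2) for g in best])
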

Citation

Wilson, A., & Fazenda, B. (2019). User-guided rendering of audio objects using an interactive genetic algorithm. Journal of the Audio Engineering Society, 67(7/8), 522-530. https://doi.org/10.17743/jaes.2019.0035

Journal Article Type: Article
Acceptance Date: Jul 17, 2019
Publication Date: Aug 14, 2019
Deposit Date: Aug 7, 2019
Publicly Available Date: Oct 10, 2019
Journal: Journal of the Audio Engineering Society
Print ISSN: 1549-4950
Publisher: Audio Engineering Society
Volume: 67
Issue: 7/8
Pages: 522-530
DOI: https://doi.org/10.17743/jaes.2019.0035
Publisher URL: https://doi.org/10.17743/jaes.2019.0035
Related Public URLs: http://www.aes.org/journal/
