User-guided rendering of audio objects using an interactive genetic algorithm
Wilson, AD; Fazenda, BM
Abstract
Object-based audio allows for personalisation of content, perhaps to improve accessibility
or to increase quality of experience more generally. This paper describes the design and evaluation
of an interactive audio renderer that is used to optimise an audio mix based on
listener feedback. A panel of 14 trained participants was recruited to trial the system.
The range of audio mixes produced with the proposed system was comparable to the range
of mixes achieved with a traditional fader-based mixing interface. Evaluation using the System
Usability Scale showed a low level of physical and mental burden, making this a suitable
interface for users with visual and/or mobility impairments.
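For illustration only, the sketch below shows the general shape of an interactive genetic algorithm applied to object-based mixing. It is not the authors' implementation: the number of audio objects, the gain-vector genome, the population size, and the use of a single listener rating as the fitness value are all assumptions made for this example.

```python
import random

# Minimal sketch of an interactive genetic algorithm for object-based mixing.
# Assumptions (not from the paper): each candidate mix is a vector of linear
# gains, one per audio object, and the listener's rating acts as the fitness.

NUM_OBJECTS = 4        # hypothetical number of audio objects in the scene
POPULATION_SIZE = 8
MUTATION_STD = 0.1

def random_mix():
    """Create a random candidate mix: one gain in [0, 1] per object."""
    return [random.random() for _ in range(NUM_OBJECTS)]

def crossover(parent_a, parent_b):
    """Uniform crossover: each gain is taken from either parent."""
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

def mutate(mix):
    """Perturb each gain with Gaussian noise, clamped to [0, 1]."""
    return [min(1.0, max(0.0, g + random.gauss(0.0, MUTATION_STD))) for g in mix]

def evolve(population, ratings):
    """Produce the next generation from listener ratings (higher = better)."""
    ranked = [mix for _, mix in sorted(zip(ratings, population),
                                       key=lambda pair: pair[0], reverse=True)]
    parents = ranked[:max(2, len(ranked) // 2)]   # keep the better-rated half
    return [mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(len(population))]

# Example loop: in a real system the ratings would come from the listener
# auditioning each rendered mix; here they are random placeholders.
population = [random_mix() for _ in range(POPULATION_SIZE)]
for generation in range(3):
    ratings = [random.random() for _ in population]  # stand-in for user feedback
    population = evolve(population, ratings)
```

In such a loop the listener only has to rate or pick preferred mixes each generation, rather than set individual faders, which is consistent with the low physical and mental burden reported in the abstract.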
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Jul 17, 2019 |
| Publication Date | Aug 14, 2019 |
| Deposit Date | Aug 7, 2019 |
| Publicly Available Date | Oct 10, 2019 |
| Journal | Journal of the Audio Engineering Society |
| Print ISSN | 1549-4950 |
| Publisher | Audio Engineering Society |
| Volume | 67 |
| Issue | 7/8 |
| Pages | 522-530 |
| DOI | https://doi.org/10.17743/jaes.2019.0035 |
| Publisher URL | https://doi.org/10.17743/jaes.2019.0035 |
| Related Public URLs | http://www.aes.org/journal/ |
Files
20490.pdf (478 KB, PDF)