User-guided rendering of audio objects using an interactive genetic algorithm

Wilson, AD and Fazenda, BM 2019, 'User-guided rendering of audio objects using an interactive genetic algorithm', Journal of the Audio Engineering Society, 67 (7/8), pp. 522-530.



Object-based audio allows for personalisation of content, for example to improve accessibility or, more generally, to increase quality of experience. This paper describes the design and evaluation of an interactive audio renderer that optimises an audio mix based on listener feedback. A panel of 14 trained participants was recruited to trial the system. The range of audio mixes produced with the proposed system was comparable to the range achieved with a traditional fader-based mixing interface. Evaluation using the System Usability Scale showed a low level of physical and mental burden, making the interface suitable for users with visual and/or mobility impairments.
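The abstract's core idea — evolving a mix from listener ratings rather than direct fader control — can be illustrated with a minimal interactive genetic algorithm sketch. This is not the paper's implementation; it assumes a genome of per-object gains in [0, 1] and a `rate_fn` callback standing in for the listener's rating of each candidate mix (all names here are hypothetical):

```python
import random

def evolve_mix(n_objects, rate_fn, pop_size=8, generations=30,
               mutation_sd=0.1, seed=0):
    """Evolve per-object gain vectors, scored by a listener rating callback."""
    rng = random.Random(seed)
    # Initial population: random gain vectors, one gain per audio object.
    pop = [[rng.random() for _ in range(n_objects)] for _ in range(pop_size)]
    for _ in range(generations):
        # In a real interactive GA, rate_fn would be the user's rating.
        scored = sorted(pop, key=rate_fn, reverse=True)
        elite = scored[: pop_size // 2]  # keep the higher-rated mixes
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            # Uniform crossover plus Gaussian mutation, clipped to [0, 1].
            child = [min(1.0, max(0.0, rng.choice(pair) + rng.gauss(0, mutation_sd)))
                     for pair in zip(a, b)]
            children.append(child)
        pop = elite + children
    return max(pop, key=rate_fn)

# Usage: a simulated listener who prefers gains close to a target mix.
target = [0.8, 0.3, 0.5]
rating = lambda gains: -sum((g - t) ** 2 for g, t in zip(gains, target))
best = evolve_mix(3, rating)
```

With a human in the loop, `rate_fn` would present the rendered mix and collect a preference score, which is why the study's low physical and mental burden matters: the user only judges, never adjusts faders directly.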

Item Type: Article
Schools: School of Computing, Science and Engineering > Salford Innovation Research Centre
Journal or Publication Title: Journal of the Audio Engineering Society
Publisher: Audio Engineering Society (AES)
ISSN: 1549-4950
Depositing User: USIR Admin
Date Deposited: 07 Aug 2019 14:29
Last Modified: 29 Nov 2019 15:00

