Personalised, multi-modal, affective state detection for hybrid brain-computer music interfacing

Daly, I., Williams, D.A.H., Malik, A., Weaver, J., Kirke, A., Hwang, F., Miranda, E. and Nasuto, S.J. 2020, 'Personalised, multi-modal, affective state detection for hybrid brain-computer music interfacing', IEEE Transactions on Affective Computing, 11 (1), pp. 111-124.

Full text not available from this repository.


Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work, they must be able to reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMI. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly (p < 0.01) more accurate, with average improvements in accuracy of 10.2% for valence and 9.3% for arousal. We also compare a hybrid BCMI (a BCMI that combines physiological signals with neurological signals) to a conventional BCMI design (one based on EEG features alone) and demonstrate that the hybrid design results in a significant (p < 0.01) 6.2% improvement in performance for arousal classification and a significant (p < 0.01) 5.9% improvement for valence classification.
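The abstract does not specify the authors' classification pipeline, so the following is only a minimal illustrative sketch of the two design ideas it describes: concatenating EEG features with physiological features into one hybrid feature vector, and training a classifier per user (personalised) rather than across users. The feature names and the nearest-centroid classifier are assumptions for illustration, not the paper's actual method.

```python
# Hypothetical sketch of a personalised, hybrid-feature affect classifier.
# EEG band powers and physiological measures (e.g. heart rate, skin
# conductance) are illustrative assumptions, not the authors' features.

def hybrid_features(eeg_band_powers, physio):
    """Concatenate EEG features with physiological features."""
    return list(eeg_band_powers) + list(physio)

def train_centroids(samples, labels):
    """Per-user training: mean feature vector for each affect class."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        if y not in sums:
            sums[y] = [0.0] * len(x)
            counts[y] = 0
        sums[y] = [s + v for s, v in zip(sums[y], x)]
        counts[y] += 1
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def classify(centroids, x):
    """Assign the label of the nearest class centroid."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist2(centroids[y], x))

# Toy single-user session with two arousal classes.
train = [hybrid_features([0.9, 0.2], [72, 0.8]),   # "high" arousal trial
         hybrid_features([0.1, 0.8], [60, 0.2])]   # "low" arousal trial
cents = train_centroids(train, ["high", "low"])
print(classify(cents, hybrid_features([0.8, 0.3], [70, 0.7])))  # → high
```

A personalised detector retrains `train_centroids` on each user's own calibration trials; a population detector would pool trials from all users into one training set, which is the baseline the paper reports as significantly less accurate.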

Item Type: Article
Schools: Schools > School of Computing, Science and Engineering
Journal or Publication Title: IEEE Transactions on Affective Computing
Publisher: IEEE
ISSN: 2371-9850
Depositing User: USIR Admin
Date Deposited: 12 Dec 2019 14:30
Last Modified: 27 Aug 2021 21:33
