2020
DOI: 10.1109/taffc.2018.2801811

Personalised, Multi-Modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing

Abstract: Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work, they need to be able to reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMI. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly (p < 0.01) more accurat…
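The abstract contrasts a personalised affective-state detector with one trained across a population of users. As a rough illustration only, and not the authors' pipeline, the sketch below compares a within-user classifier against a leave-one-subject-out population classifier and applies a paired test across users; the synthetic features, binary affect labels, and the SVM/Wilcoxon choices are all assumptions made for the example.

```python
# Hypothetical sketch (not the paper's code): personalised vs population-based
# affective state detection, compared with a paired test across users.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
n_users, n_trials, n_features = 17, 40, 10  # 17 users, as in the abstract

# Simulated EEG-derived features and binary affect labels per user (assumed).
X = rng.normal(size=(n_users, n_trials, n_features))
y = rng.integers(0, 2, size=(n_users, n_trials))

personalised_acc, population_acc = [], []
for u in range(n_users):
    # Personalised: train and test within the same user's own data.
    personalised_acc.append(cross_val_score(SVC(), X[u], y[u], cv=5).mean())

    # Population-based: train on all other users, test on the held-out user.
    others = [v for v in range(n_users) if v != u]
    X_pop = X[others].reshape(-1, n_features)
    y_pop = y[others].reshape(-1)
    clf = SVC().fit(X_pop, y_pop)
    population_acc.append(clf.score(X[u], y[u]))

# Paired comparison across users (the paper reports p < 0.01 in its analysis).
stat, p = wilcoxon(personalised_acc, population_acc)
print(f"personalised={np.mean(personalised_acc):.3f} "
      f"population={np.mean(population_acc):.3f} p={p:.3f}")
```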

Cited by 19 publications (9 citation statements)
References 59 publications
“… Daly et al (2016) developed an affective brain-computer music interface that can regulate users’ emotions, which has the potential to be used in music therapy and entertainment. Daly et al (2020) further proposed a personalized method that could be used for BCMI emotional state detection. Compared with the population-based detection method, the accuracy of personalized emotion detection was significantly improved.…”
Section: Results
Mentioning, confidence: 99%
“…In the future we hope to build on our research exploring how sound may be used to modify the affective state of an individual [18], as well as our research which showed how state-of-the-art affective computing technology may be used to dynamically modulate sound and music over time to optimise an "affective trajectory" (a change in felt affect in an audience over time) [19], [20]. However, one of the most exciting aspects of this work is the set of questions it has raised for future investigation.…”
Section: Discussion
Mentioning, confidence: 99%
“…Most of the works of this group propose the development of interfaces for monitoring and modulating mental and affective states through music (LingHu & Shu, 2018; Mideska et al, 2016; Shan et al, 2018; Daly et al, 2016; S. K. Ehrlich et al, 2019; Kobayashi & Fujishiro, 2016; Daly et al, 2020). In their work, Desai et al (2018) present an approach in the opposite direction.…”
Section: Non-invasive Assistive Solutions Based On Human-machine Musical Interfaces
Mentioning, confidence: 99%