2020
DOI: 10.1109/taffc.2018.2840973

A Mutual Information Based Adaptive Windowing of Informative EEG for Emotion Recognition

Abstract: The version presented here may differ from the published version or version of record. If you wish to cite this item, you are advised to consult the publisher's version. Please see the 'permanent WRAP url' above for details on accessing the published version, and note that access may require a subscription.

Cited by 84 publications (34 citation statements). References 53 publications.
“…This, in turn, indicates that our proposed method plays a positive role in emotion recognition applications. In another research [15], the highest percentage of accuracy among the studies conducted on emotion recognition was achieved. In other words, accuracy levels of 89.61% and 89.84% were reached in a two-level classification for the valence and arousal dimensions, respectively, while accuracy levels of 75.02% and 75.70% were achieved in a three-level classification for the arousal and valence dimensions, respectively.…”
Section: Comparison With Previous Studies (mentioning)
Confidence: 93%
“…In [14], a new technique is defined for classification of emotions in 4 binary classes in 2D arousal-valence space. In [15], the authors used 2400 different features for emotion recognition in both arousal and valence dimensions. They used many feature extraction techniques, and finally with 30 to 40 features they reported 89.84% and 89.61% accuracies in bilevel and 75.02% and 75.70% accuracies in multilevel arousal and valence classifications, respectively.…”
(mentioning)
Confidence: 99%
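
To make the feature-reduction step described in the excerpt above concrete, here is a minimal sketch of mutual-information-based feature selection, where a large pool of candidate EEG features is narrowed to a few dozen before classification. The array shapes, feature counts, and the SVM classifier are assumptions chosen for illustration, not the pipeline reported in [15].

```python
# Illustrative sketch: reduce a large pool of candidate EEG features to a
# small subset via mutual information with the labels, then classify.
# Shapes, labels, and the classifier are hypothetical placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((320, 2400))   # 320 trials x 2400 candidate features (synthetic)
y = rng.integers(0, 2, size=320)       # binary valence labels (synthetic)

# Keep the 40 features that share the most mutual information with the labels.
pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=mutual_info_classif, k=40),
    SVC(kernel="rbf"),
)
scores = cross_val_score(pipeline, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

With real features, the selected subset (here 40 columns) would feed whichever classifier is under study; the point of the sketch is only the selection step.
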
“…Partitioning training data and test data from the same subject and adapting the classifier to a specific subject makes the task subject-dependent, which has improved results. For instance, in [24] classification of statistical features extracted from the DEAP dataset achieves 82.76% and 82.77% for 2 valence and arousal levels, respectively, with k-Nearest Neighbor (k-NN). But here, the lack of generalization is the cost to pay.…”
Section: Related Work on DEAP Dataset (mentioning)
Confidence: 99%
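
The subject-dependent setup quoted above pairs simple statistical features with a k-NN classifier. The following is a minimal sketch of that kind of pipeline, assuming synthetic DEAP-like trial shapes and placeholder parameters rather than the exact features or settings used in [24].

```python
# Minimal sketch: per-channel statistical features from EEG trials
# followed by k-NN classification. Data and parameters are synthetic.
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def statistical_features(trials):
    """trials: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * 6)."""
    feats = [
        trials.mean(axis=2),
        trials.std(axis=2),
        np.abs(np.diff(trials, axis=2)).mean(axis=2),        # mean abs. first difference
        np.abs(np.diff(trials, n=2, axis=2)).mean(axis=2),    # mean abs. second difference
        skew(trials, axis=2),
        kurtosis(trials, axis=2),
    ]
    return np.concatenate(feats, axis=1)

rng = np.random.default_rng(1)
eeg = rng.standard_normal((160, 32, 1280))   # 160 trials, 32 channels, 10 s at 128 Hz (synthetic)
valence = rng.integers(0, 2, size=160)       # binary valence labels (synthetic)

X = statistical_features(eeg)
knn = KNeighborsClassifier(n_neighbors=5)
print(f"mean CV accuracy: {cross_val_score(knn, X, valence, cv=5).mean():.3f}")
```

A subject-dependent evaluation would restrict the cross-validation folds to trials from a single subject, which is exactly where the generalization concern raised in the quote comes from.
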
“…Since the DEAP dataset is publicly available, it allows us to have a better perspective on the differences between the methods and on the variability in their performances. Piho and Tjahjadi [18] investigated reduced EEG data of emotions using mutual information-…”
Section: Related Work (mentioning)
Confidence: 99%
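
The excerpt above refers to the reviewed paper's use of mutual information to reduce EEG data to its informative portions. As an illustration only, the sketch below scores candidate time windows by the mutual information between simple per-window band-power features and the emotion labels, keeping the most informative window; the window lengths, hop size, and feature choice are assumptions, not the adaptive windowing algorithm of Piho and Tjahjadi.

```python
# Rough sketch of mutual-information-driven window selection: slide a window
# over each trial, score it by the MI between per-window features and the
# labels across trials, and keep the highest-scoring window. Illustrative only.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def window_mi_scores(trials, labels, win_len, step):
    """trials: (n_trials, n_channels, n_samples) -> list of (start, mean MI)."""
    n_trials, n_channels, n_samples = trials.shape
    scores = []
    for start in range(0, n_samples - win_len + 1, step):
        segment = trials[:, :, start:start + win_len]
        # Simple per-window feature: log band power per channel.
        feats = np.log(np.mean(segment ** 2, axis=2) + 1e-12)
        mi = mutual_info_classif(feats, labels, random_state=0)
        scores.append((start, mi.mean()))
    return scores

rng = np.random.default_rng(2)
eeg = rng.standard_normal((120, 32, 7680))   # 120 trials, 32 channels, 60 s at 128 Hz (synthetic)
labels = rng.integers(0, 2, size=120)        # binary emotion labels (synthetic)

scores = window_mi_scores(eeg, labels, win_len=1280, step=640)   # 10 s windows, 5 s hop
best_start, best_score = max(scores, key=lambda s: s[1])
print(f"most informative window starts at sample {best_start} (mean MI {best_score:.4f})")
```

On real recordings, only the selected window (or the top few windows) would be passed on to feature extraction and classification, which is how this kind of reduction shortens the EEG data used for emotion recognition.
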