2022
DOI: 10.1109/taffc.2020.2982143

Unsupervised Learning in Reservoir Computing for EEG-Based Emotion Recognition

Abstract: In real-world applications such as emotion recognition from recorded brain activity, data are captured from electrodes over time. These signals constitute a multidimensional time series. In this paper, the Echo State Network (ESN), a recurrent neural network with great success in time series prediction and classification, is optimized with different neural plasticity rules for the classification of emotions from electroencephalogram (EEG) time series. The neural plasticity rules are a kind of unsupervi…
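The truncated abstract describes the general ESN approach but not the authors' exact architecture or plasticity rules. The following is a minimal sketch of the generic ESN pipeline the abstract alludes to, assuming a fixed random reservoir and a ridge-regression readout; channel count, reservoir size, spectral radius, and the readout are assumptions, and the unsupervised plasticity rules from the paper are not reproduced.

# Minimal Echo State Network sketch for multichannel EEG classification.
# Generic illustration only, NOT the authors' exact model; the plasticity
# rules mentioned in the abstract are omitted.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 32        # e.g. EEG channels (assumption)
n_reservoir = 500    # reservoir size (assumption)
spectral_radius = 0.9

# Random input and recurrent weights; rescale W to the target spectral radius.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def reservoir_states(u):
    """Run one EEG trial u (T x n_inputs) through the reservoir and
    return the final state as a fixed-length feature vector."""
    x = np.zeros(n_reservoir)
    for t in range(u.shape[0]):
        x = np.tanh(W_in @ u[t] + W @ x)
    return x

def train_readout(trials, labels, n_classes, ridge=1e-2):
    """Closed-form ridge-regression readout on one-hot targets."""
    X = np.stack([reservoir_states(u) for u in trials])   # (N, n_reservoir)
    Y = np.eye(n_classes)[labels]                         # (N, n_classes)
    return np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y)

def predict(W_out, trial):
    return int(np.argmax(reservoir_states(trial) @ W_out))

Only the linear readout is trained here, which is what gives ESNs their short training time; the paper's contribution of tuning the reservoir with unsupervised plasticity rules would sit on top of this skeleton.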

Citations: Cited by 43 publications (10 citation statements)
References: 59 publications
“…Over the last years, deep learning networks have been used for EEG classification tasks, including seizure detection [19], emotion recognition [20]- [22], and classification of motor (imagery) tasks. Various studies have shown that LSTMs outperform other models such as decision trees, support vector machines (used in our previous work [23]), logistic regression, random forest classifiers, naïve Bayes, feedforward neural networks, deep belief networks, and even CNNs for some tasks [24].…”
Section: A. LSTM Model
confidence: 99%
“…#5, we consider two multi-class emotion recognition tasks based on the valence, arousal, and dominance levels. Each emotion coordinate's high/low level in the valence-arousal-dominance model can be mapped onto the Plutchik Wheel emotion model [31], as mentioned above. Here we consider two such tasks from the involved datasets: the 4-class classification in DEAP and the 8-class classification in DREAMER.…”
Section: G. Exp #5: Emotion Recognition With Multi-Class Assessments
confidence: 99%
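The statement above describes building discrete classes by binarizing each affective dimension into a high/low level. The sketch below illustrates that mapping; vad_to_class is a hypothetical helper, and the threshold of 5 on a 1-9 rating scale is a common convention assumed here, not a value taken from the cited work.

# Map per-trial valence/arousal(/dominance) ratings to discrete classes,
# yielding 4 classes for valence-arousal (DEAP-style) and 8 classes for
# valence-arousal-dominance (DREAMER-style). Threshold is an assumption.
def vad_to_class(valence, arousal, dominance=None, threshold=5.0):
    """Binarize each rating at `threshold` and pack the high/low bits
    into a class index."""
    bits = [valence >= threshold, arousal >= threshold]
    if dominance is not None:
        bits.append(dominance >= threshold)
    label = 0
    for b in bits:
        label = (label << 1) | int(b)
    return label

# Example usage:
print(vad_to_class(7.2, 3.1))        # 2: valence-high / arousal-low (4-class task)
print(vad_to_class(7.2, 3.1, 6.5))   # 5: one of 8 classes in the VAD task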
“…Recurrent models such as reservoir computing [31] and the attention-based convolutional recurrent neural network [32] have also been applied to EEG-based affective computing.…”
Section: Introduction
confidence: 99%
“…Some conventional methods extract important features from signals by using ESNs for the purpose of anomaly detection [17]- [21]. Some of these extract and classify features simultaneously [22], [23]. However, most such systems need an additional classifier, which undermines the ESNs' advantage of short training time [24].…”
Section: Introduction
confidence: 99%
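The statement above contrasts keeping the ESN's own linear readout with bolting on a separate classifier. The sketch below makes that contrast concrete under stated assumptions: X is a matrix of reservoir features (e.g. from a feature extractor like reservoir_states in the ESN sketch earlier), scikit-learn is assumed available, and both helpers are illustrative, not the cited systems' implementations.

# Option 1: the ESN's closed-form ridge readout -- one matrix solve, which is
# what preserves the short-training-time advantage the quoted paper mentions.
# Option 2: an external classifier on the same reservoir features, which adds
# its own (iterative) training stage.
import numpy as np
from sklearn.svm import SVC

def fit_linear_readout(X, y, n_classes, ridge=1e-2):
    """Closed-form readout on one-hot targets; X is (N, n_features)."""
    Y = np.eye(n_classes)[y]
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

def fit_external_classifier(X, y):
    """Separate classifier on reservoir features: the extra stage the quoted
    statement says erodes the ESN's speed benefit."""
    return SVC(kernel="rbf").fit(X, y)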