Emotion recognition has become a research focus in brain–computer interfaces and cognitive neuroscience. Electroencephalography (EEG) is widely employed because it is accurate, objective, and noninvasive. However, many existing studies focus only on extracting time- and frequency-domain features of the EEG signals, failing to exploit the dynamic temporal changes and the positional relationships among electrode channels. To fill this gap, we develop DDELGCN, an EEG emotion recognition method based on dynamic differential entropy and brain connectivity features with a linear graph convolutional network. First, the dynamic differential entropy feature, which captures both frequency-domain and time-domain information, is extracted based on the traditional differential entropy feature. Second, brain connectivity matrices are constructed by computing the Pearson correlation coefficient, phase-locking value, and transfer entropy, and are used to denote the connectivity features of all electrode combinations. Finally, a customized linear graph convolutional network aggregates the features from all electrode combinations and classifies the emotional states; it consists of five layers, namely an input layer, two linear graph convolutional layers, a fully connected layer, and a softmax layer. Extensive experiments show that on the DEAP dataset the accuracies in the valence and arousal dimensions reach 90.88% and 91.13%, and the precisions reach 96.66% and 97.02%, respectively. On the SEED dataset, the accuracy and precision reach 91.56% and 97.38%, respectively.
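The differential entropy and connectivity features described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the input is already band-filtered EEG shaped `(channels, samples)`, computes the Gaussian-form differential entropy per channel, and builds Pearson-correlation and phase-locking-value (PLV) connectivity matrices; the transfer-entropy matrix mentioned in the abstract is omitted for brevity.

```python
import numpy as np
from scipy.signal import hilbert


def differential_entropy(band_signal):
    # For an (approximately) Gaussian band-limited signal, the
    # differential entropy reduces to 0.5 * ln(2 * pi * e * sigma^2).
    var = np.var(band_signal)
    return 0.5 * np.log(2 * np.pi * np.e * var)


def pearson_connectivity(eeg):
    # eeg: (channels, samples) -> (channels, channels) correlation matrix.
    return np.corrcoef(eeg)


def plv_connectivity(eeg):
    # Phase-locking value for every channel pair, using instantaneous
    # phases obtained from the analytic (Hilbert-transformed) signal.
    phases = np.angle(hilbert(eeg, axis=1))
    n = eeg.shape[0]
    plv = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            plv[i, j] = np.abs(np.mean(np.exp(1j * (phases[i] - phases[j]))))
    return plv
```

Both connectivity functions return symmetric `(channels, channels)` matrices with ones on the diagonal, which can then serve as (weighted) adjacency inputs to a graph convolutional network.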
Objective: Emotion recognition based on electroencephalography (EEG) signals has received significant attention in cognitive science and human-computer interaction (HCI). However, most existing studies either work on one-dimensional EEG data, ignoring the relationships between channels, or extract only time-frequency features without involving spatial features. Approach: We develop ERGL, a spatial-temporal-feature-based EEG emotion recognition method using a graph convolutional network (GCN) and long short-term memory (LSTM). First, the one-dimensional EEG vector is converted into a two-dimensional mesh matrix whose layout corresponds to the distribution of EEG electrode locations over the brain regions, thereby better representing the spatial correlation among adjacent channels. Second, the GCN and LSTM are employed together to extract spatial-temporal features: the GCN extracts spatial features, while LSTM units extract temporal features. Finally, a softmax layer performs emotion classification. Main results: Extensive experiments are conducted on the DEAP and SEED datasets. On DEAP, the accuracy, precision, and F-score for the valence and arousal dimensions reach 90.67% and 90.33%, 92.38% and 91.72%, and 91.34% and 90.86%, respectively. On SEED, the accuracy, precision, and F-score for positive, neutral, and negative classification reach 94.92%, 95.34%, and 94.17%, respectively. Significance: These results demonstrate that ERGL is competitive with state-of-the-art recognition methods.
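The 1-D-to-2-D mesh conversion described above can be sketched as below. This is a hypothetical illustration, not the paper's exact mapping: it uses a made-up 3×3 grid for a 9-channel montage (real work would use the full 10-20 layout, e.g. a 9×9 mesh for 32 or 62 channels), and maps a per-channel feature vector onto grid cells that mirror the electrodes' scalp positions, leaving empty positions at zero.

```python
import numpy as np

# Hypothetical 3x3 placement for a 9-channel montage; row/column
# coordinates roughly follow front-to-back, left-to-right scalp order.
GRID = {
    "Fp1": (0, 0), "Fpz": (0, 1), "Fp2": (0, 2),
    "C3":  (1, 0), "Cz":  (1, 1), "C4":  (1, 2),
    "O1":  (2, 0), "Oz":  (2, 1), "O2":  (2, 2),
}


def to_mesh(values, grid=GRID, shape=(3, 3)):
    # Map a 1-D per-channel feature vector onto a 2-D scalp mesh.
    # values[i] belongs to the i-th electrode in the grid's insertion
    # order; positions without an electrode remain zero.
    mesh = np.zeros(shape)
    for idx, name in enumerate(grid):
        r, c = grid[name]
        mesh[r, c] = values[idx]
    return mesh
```

The resulting mesh preserves which channels are physically adjacent, so a convolution or graph operation over it can exploit spatial correlation that a flat channel vector discards.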