Human emotions are an important part of daily life. In this paper, a novel multilayer network-based convolutional neural network (CNN) model is proposed for emotion recognition from multi-channel nonlinear EEG signals. First, in response to the multi-rhythm properties of the brain, a multilayer brain network with five rhythm-based layers is derived, where each layer specifically describes one frequency band. Subsequently, a novel CNN model is carefully designed that takes the multilayer brain network as input and learns classifiable nonlinear features from the channel and frequency views. Moreover, a DenseNet model is developed as another branch to learn time-domain nonlinear features from the EEG signals. All the learned features are finally concatenated for emotion recognition. The publicly available SEED dataset is used to test the proposed method, which performs well on all 15 subjects, with an average accuracy of 91.31%. Our method builds a bridge between multilayer networks and deep learning, suggesting an effective approach for analyzing multivariate nonlinear time series, especially multi-channel EEG signals.
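To make the two-branch idea concrete, the following is a minimal PyTorch sketch of such an architecture: one convolutional branch over a stack of five rhythm-specific brain-network adjacency matrices and a second branch over the raw time-domain EEG, with the two feature vectors concatenated for classification. All layer sizes, the 62-channel count, the window length, and the simplified time-domain branch are illustrative assumptions, not the paper's exact configuration.

```python
# A minimal sketch of a two-branch network for EEG emotion recognition.
# Shapes and layer sizes are assumptions for illustration only.
import torch
import torch.nn as nn

class TwoBranchEmotionNet(nn.Module):
    def __init__(self, n_channels=62, n_rhythms=5, n_classes=3):
        super().__init__()
        # Branch 1: CNN over the multilayer brain network,
        # shaped (batch, n_rhythms, n_channels, n_channels) --
        # one connectivity matrix per frequency band (delta .. gamma).
        self.graph_branch = nn.Sequential(
            nn.Conv2d(n_rhythms, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),                      # -> 32 * 4 * 4 features
        )
        # Branch 2: a simplified 1-D CNN over the raw time-domain EEG,
        # shaped (batch, n_channels, time_len); the paper uses a DenseNet here.
        self.time_branch = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
            nn.Flatten(),                      # -> 32 * 8 features
        )
        # Concatenate both feature vectors and classify the emotion.
        self.classifier = nn.Linear(32 * 4 * 4 + 32 * 8, n_classes)

    def forward(self, brain_net, eeg):
        feats = torch.cat([self.graph_branch(brain_net),
                           self.time_branch(eeg)], dim=1)
        return self.classifier(feats)

# Example forward pass with random tensors of the assumed shapes.
model = TwoBranchEmotionNet()
brain_net = torch.randn(8, 5, 62, 62)   # batch of multilayer brain networks
eeg = torch.randn(8, 62, 200)           # batch of raw EEG windows
logits = model(brain_net, eeg)          # -> shape (8, 3)
```

The key design choice mirrored here is late feature fusion: each branch learns its own representation (frequency-domain connectivity vs. time-domain dynamics) and only the learned feature vectors are concatenated before the classifier.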