Affective computing is concerned with modeling people's psychological and cognitive processes, of which emotion classification is an important part. The electroencephalogram (EEG), an electrophysiological signal that records brain activity, is portable and non-invasive, and it has become an essential measurement in emotion classification research. EEG signals are typically divided into different frequency bands according to their rhythmic characteristics. Most machine learning methods concatenate the features of multiple frequency bands into a single feature vector, a strategy that cannot effectively exploit the complementary and consistent information of each band and does not always achieve satisfactory results. To obtain a sparse and consistent representation of multi-frequency band EEG signals for emotion classification, this paper proposes a multi-frequency band collaborative classification method based on optimal projection and shared dictionary learning (MBCC). The method introduces a joint learning model that combines dictionary learning and subspace learning. MBCC maps the data of each frequency band into subspaces of the same dimension using projection matrices, each composed of a common shared component and a band-specific component. This projection scheme not only makes full use of the relevant information across frequency bands but also maintains consistency among them. On top of dictionary learning, the subspace captures the correlation between frequency bands through a Fisher criterion and a principal component analysis (PCA)-like regularization term, yielding a strongly discriminative model. The objective function of MBCC is solved by an iterative optimization algorithm. Experimental results on the public SEED and DEAP datasets verify the effectiveness of the proposed method.
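
To make the structure described above more concrete, the following is a schematic sketch of an objective of the kind the abstract outlines, not the paper's actual formulation: the symbols Y_m (band-m EEG features), P_m = P_0 + Q_m (shared plus band-specific projection), D (shared dictionary), X_m (sparse codes), and the weights lambda_i are illustrative assumptions.

    % Schematic objective (all notation assumed for illustration):
    %   Y_m : features of frequency band m,   P_m = P_0 + Q_m : projection matrix
    %   D   : shared dictionary,              X_m : sparse codes of band m
    \begin{aligned}
    \min_{P_0,\{Q_m\},D,\{X_m\}} \;
      & \sum_{m=1}^{M} \Big( \|P_m Y_m - D X_m\|_F^2 + \lambda_1 \|X_m\|_1 \Big) \\
      & + \lambda_2 \big( \operatorname{tr}(S_W(X)) - \operatorname{tr}(S_B(X)) \big)
        \quad \text{(Fisher-type discrimination on the codes)} \\
      & + \lambda_3 \sum_{m=1}^{M} \|Y_m - P_m^{\top} P_m Y_m\|_F^2
        \quad \text{(PCA-like energy-preserving regularization)} \\
    \text{s.t.}\;\;
      & P_m = P_0 + Q_m, \qquad \|d_k\|_2 \le 1 .
    \end{aligned}

Under such a formulation, an alternating scheme of the kind the abstract mentions would update the sparse codes X_m, the shared dictionary D, and the projection components P_0 and Q_m in turn until convergence.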