Abstract. In this paper, we present a novel approach to multi-view gender classification considering both shape and texture information to represent facial images. The face area is divided into small regions, from which local binary pattern (LBP) histograms are extracted and concatenated into a single vector efficiently representing the facial image. The classification is performed by support vector machines (SVMs), which have been shown to be superior to traditional pattern classifiers for the gender classification problem. The experiments clearly show the superiority of the proposed method over support gray faces on the CAS-PEAL face database, and the highest correct classification rate obtained is 96.75%. In addition, the simplicity of the proposed method leads to very fast feature extraction, and the regional histograms and global description of the face allow for multi-view gender classification.
In this paper, we present a novel method for multi-view gender classification considering both shape and texture information to represent facial images. The face area is divided into small regions from which local binary pattern (LBP) histograms are extracted and concatenated into a single vector efficiently representing a facial image. Following the idea of local binary patterns, we propose a new feature extraction approach called multi-resolution LBP, which retains both fine and coarse local micro-patterns as well as the spatial information of facial images. The classification tasks in this work are performed by support vector machines (SVMs). The experiments clearly show the superiority of the proposed method over both support gray faces and support Gabor faces on the CAS-PEAL face database, with a correct classification rate of 96.56% and a cross-validation average accuracy of 95.78%. In addition, the simplicity of the proposed method leads to very fast feature extraction, and the regional histograms and fine-to-coarse description of facial images allow for multi-view gender classification.
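A minimal sketch of the regional, multi-resolution LBP feature extraction and SVM classification described above, assuming scikit-image and scikit-learn. The grid size, the (points, radius) settings, and the SVM parameters are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def multires_lbp_features(face, grid=(7, 7), settings=((8, 1), (8, 2), (8, 3))):
    """Divide a grayscale face image into a grid of regions and concatenate
    the uniform-LBP histogram of each region at several radii, giving a
    fine-to-coarse local description that also keeps spatial information."""
    feats = []
    for n_points, radius in settings:
        lbp = local_binary_pattern(face, n_points, radius, method="uniform")
        n_bins = n_points + 2                      # uniform patterns + "non-uniform" bin
        h, w = lbp.shape
        rh, rw = h // grid[0], w // grid[1]
        for i in range(grid[0]):
            for j in range(grid[1]):
                region = lbp[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
                hist, _ = np.histogram(region, bins=n_bins, range=(0, n_bins))
                feats.append(hist / max(hist.sum(), 1))   # normalise per region
    return np.concatenate(feats)

# faces: list of aligned grayscale face images; labels: 0 = female, 1 = male
# X = np.array([multires_lbp_features(f) for f in faces])
# clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X, labels)
```

Concatenating per-region histograms rather than pooling one global histogram is what preserves the spatial layout of the face; the multiple radii supply the fine and coarse micro-patterns mentioned in the abstract.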
Abstract. Considering the fast response and high generalization accuracy of the min-max modular support vector machine (M³-SVM), we apply M³-SVM to the gender recognition problem and propose a novel task decomposition method in this paper. Firstly, we extract features from the face images by using facial point detection and the Gabor wavelet transform. Then we divide the training data set into several subsets with the 'part-versus-part' task decomposition method. The most important advantage of the proposed task decomposition method over the existing random method is that explicit prior knowledge about the ages contained in the face images is used in task decomposition. We perform simulations on a real-world gender data set and compare the performance of traditional SVMs with that of M³-SVM using the proposed task decomposition method. The experimental results indicate that M³-SVM with our new method performs better than both traditional SVMs and M³-SVM with the random task decomposition method.
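A rough sketch of the 'part-versus-part' decomposition with min-max combination (the M³-SVM idea), assuming scikit-learn SVMs and that each training sample carries an age-group label used to split the two gender classes. The kernel choice and the min-then-max fusion order shown here are assumptions for illustration, not necessarily the exact scheme of the paper.

```python
import numpy as np
from sklearn.svm import SVC

def train_m3_svm(X, y, groups, **svm_kw):
    """Split the positive (+1) and negative (-1) classes by age group and
    train one SVM module for every (positive subset, negative subset) pair."""
    pos_groups = sorted(set(groups[y == 1]))
    neg_groups = sorted(set(groups[y == -1]))
    modules = {}
    for gp in pos_groups:
        for gn in neg_groups:
            mask = ((y == 1) & (groups == gp)) | ((y == -1) & (groups == gn))
            modules[(gp, gn)] = SVC(kernel="rbf", **svm_kw).fit(X[mask], y[mask])
    return modules, pos_groups, neg_groups

def predict_m3_svm(modules, pos_groups, neg_groups, X):
    """Fuse module outputs with MIN over negative subsets and MAX over
    positive subsets, then threshold at zero."""
    scores = np.array([[modules[(gp, gn)].decision_function(X)
                        for gn in neg_groups] for gp in pos_groups])
    fused = scores.min(axis=1).max(axis=0)       # min over gn, then max over gp
    return np.where(fused > 0, 1, -1)

# X: Gabor-based feature vectors; y: +1/-1 gender labels;
# groups: np.array of age-group ids, one per sample (the prior knowledge).
```

Each module only sees a small, more homogeneous subproblem, which is what gives the modular machine its fast training and good generalization relative to one monolithic SVM.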
A novel approach for no-reference video quality measurement is proposed in this paper. Firstly, various feature extraction methods are used to quantify the quality of videos. Then, a support vector regression model is trained and used to predict the quality of unseen samples. Six different regression models are compared with the support vector regression model. The experimental results indicate that combining different video quality features with a support vector regression model significantly outperforms other methods for no-reference video quality measurement.
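An illustrative sketch of the regression stage, assuming scikit-learn: several no-reference quality features per video are stacked into one vector and regressed against subjective scores with a support vector regressor. The feature names and SVR parameters below are assumptions, not the paper's reported configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per video, e.g. [blockiness, blur, noise, temporal_activity, ...]
# y: subjective quality scores (e.g. MOS) used as regression targets
def train_quality_model(X, y):
    """Scale the heterogeneous quality features and fit an RBF-kernel SVR."""
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10, epsilon=0.1))
    return model.fit(X, y)

# predicted_scores = train_quality_model(X_train, y_train).predict(X_test)
```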