Abstract-In this paper, we propose a novel algorithm for dimensionality reduction that uses as its criterion the mutual information (MI) between the transformed data and their corresponding class labels. MI is a powerful criterion that can serve as a proxy for the Bayes error rate. Furthermore, recent quadratic nonparametric implementations of MI are computationally efficient and do not require any prior assumptions about the class densities. We show that quadratic nonparametric MI can be formulated as a kernel objective in the graph embedding framework, and we propose its linear equivalent as a novel linear dimensionality reduction algorithm. The derived methods are compared against state-of-the-art dimensionality reduction algorithms, with various classifiers, on several benchmark and real-life datasets. The experimental results show that nonparametric MI as an optimization objective for dimensionality reduction yields comparable and, in most cases, better results than other dimensionality reduction methods.
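The quadratic nonparametric MI mentioned above admits a closed form under Gaussian Parzen windows, since the integral of a product of two Gaussians reduces to a single Gaussian of the pairwise distance. The following is a minimal sketch of that estimate for 1-D projected samples; the function name, the kernel width `sigma`, and the 1-D restriction are our illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def quadratic_mi(y, labels, sigma=1.0):
    """Nonparametric quadratic MI between 1-D projected samples y and labels.

    Estimates sum_c integral (p(y, c) - P(c) p(y))^2 dy with Gaussian
    Parzen windows; the Gaussian product integral has the closed form
    G(y_i - y_j, 2*sigma^2), so no numerical integration is needed.
    """
    y = np.asarray(y, dtype=float).ravel()
    labels = np.asarray(labels)
    n = y.size
    # Pairwise Gaussian interactions with doubled variance.
    d2 = (y[:, None] - y[None, :]) ** 2
    G = np.exp(-d2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
    classes, counts = np.unique(labels, return_counts=True)
    priors = counts / n
    v_in = 0.0   # within-class (joint) term
    v_btw = 0.0  # cross term between joint and marginal
    for c, p_c in zip(classes, priors):
        mask = labels == c
        v_in += G[np.ix_(mask, mask)].sum()
        v_btw += p_c * G[mask, :].sum()
    v_in /= n**2
    v_btw /= n**2
    v_all = (priors**2).sum() * G.sum() / n**2  # marginal-product term
    return v_in + v_all - 2.0 * v_btw
```

As an integrated squared difference of densities, the estimate is nonnegative and grows when the class-conditional distributions of the projection separate, which is what makes it usable as a dimensionality reduction objective.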
Error-Correcting Output Codes (ECOC) with subclasses represent a common way to solve multi-class classification problems. According to this approach, a multi-class problem is decomposed into several binary ones based on the maximization of the mutual information (MI) between the classes and their respective labels. The MI is modelled through the fast quadratic mutual information (FQMI) procedure. However, FQMI is not applicable to large datasets due to its high algorithmic complexity. In this paper we propose Fisher's Linear Discriminant Ratio (FLDR) as an alternative decomposition criterion, which has much lower computational complexity and achieves better classification performance in most of the experiments conducted. Furthermore, we compare FLDR against FQMI on the Cohn-Kanade facial expression recognition dataset.
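As a rough illustration of why FLDR is cheap to evaluate as a decomposition criterion, the sketch below scores the linear separability of two groups of samples via the standard Fisher criterion; the function name and the regularisation constant are our assumptions, and the paper's exact criterion may differ in detail.

```python
import numpy as np

def fisher_ratio(X1, X2, eps=1e-8):
    """Fisher's Linear Discriminant Ratio between two groups of samples.

    Evaluates (m1 - m2)^T Sw^{-1} (m1 - m2), the Fisher criterion value
    attained by the optimal direction w = Sw^{-1}(m1 - m2). Higher values
    mean the two groups are easier to separate linearly, so a decomposition
    step can greedily pick the class split with the largest ratio.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    d = m1 - m2
    # Pooled within-class scatter matrix, regularised for invertibility.
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    Sw = S1 + S2 + eps * np.eye(X1.shape[1])
    return float(d @ np.linalg.solve(Sw, d))
```

Unlike FQMI, which sums pairwise kernel interactions over all samples (quadratic in the dataset size), this costs one scatter-matrix accumulation and one linear solve in the feature dimension.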
Abstract. Error-Correcting Output Codes constitute an efficient strategy for dealing with multi-class classification problems. According to this technique, a multi-class problem is decomposed into several binary ones. Binary classifiers are applied to these sub-problems, and by combining the acquired solutions we can solve the initial multi-class problem. In this paper we consider the optimization of the Linear Discriminant Error-Correcting Output Codes framework using Particle Swarm Optimization. In particular, we apply the Particle Swarm Optimization algorithm to optimally select the free parameters that control the split of the initial problem's classes into subclasses. Moreover, by using the Support Vector Machine as the classifier, we can additionally apply Particle Swarm Optimization to tune its free parameters. Our experimental results show that applying Particle Swarm Optimization to the Sub-class Linear Discriminant Error-Correcting Output Codes framework yields a significant improvement in classification performance.
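A minimal sketch of the Particle Swarm Optimization loop used for this kind of parameter tuning is shown below. Since the sub-ECOC pipeline itself is not reproduced here, the objective is a hypothetical stand-in for the SVM's cross-validation error as a function of (log10 C, log10 σ); the bounds, the surrogate's minimum, and all names are our illustrative assumptions.

```python
import numpy as np

def pso(objective, bounds, n_particles=20, iters=60,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `objective` over the box `bounds` with a basic particle swarm."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))  # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g_idx = pbest_f.argmin()
    gbest, gbest_f = pbest[g_idx].copy(), pbest_f[g_idx]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + attraction to personal best + attraction to global best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        if f.min() < gbest_f:
            gbest, gbest_f = x[f.argmin()].copy(), f.min()
    return gbest, gbest_f

# Hypothetical stand-in for the cross-validation error of the SVM as a
# function of (log10 C, log10 sigma); a real run would train and validate
# the sub-ECOC classifier here instead.
def surrogate_cv_error(p):
    return (p[0] - 1.0) ** 2 + (p[1] + 0.3) ** 2

best, err = pso(surrogate_cv_error, bounds=[(-2, 4), (-4, 2)])
```

Searching in log space is the usual choice for C and σ because their useful values span several orders of magnitude; the swarm converges to the surrogate's minimum at log10 C = 1, log10 σ = -0.3.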
Abstract-In this paper we propose a novel dimensionality reduction method based on successive Laplacian SVM projections in orthogonal deflated subspaces. The proposed method, called Laplacian Support Vector Analysis, produces projection vectors that capture the discriminant information lying in the subspace orthogonal to that of the standard Laplacian SVMs. We show that the optimal vectors in these deflated subspaces can be computed by successively training a standard SVM with specially designed deflation kernels. The resulting normal vectors contain discriminative information that can be used for feature extraction. In our analysis, we derive an explicit form for the deflation matrix of the mapped features in both the input space and the Hilbert space by using the kernel trick; thus, we can handle both linear and non-linear deflation transformations. Experimental results on several benchmark datasets illustrate the strength of the proposed algorithm.
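For the linear case, the deflation step the abstract describes can be sketched as the standard orthogonal projection that removes the component along the current SVM normal vector; the paper's exact deflation matrix and kernel construction may differ, so treat this as an illustrative sketch only.

```python
import numpy as np

def deflate(X, w):
    """Project samples onto the subspace orthogonal to w.

    Applies X (I - w w^T / ||w||^2): after deflation every sample has zero
    component along w, so an SVM retrained on the deflated data is forced
    to find a new normal vector orthogonal to w.
    """
    w = np.asarray(w, dtype=float)
    w = w / np.linalg.norm(w)
    return X - np.outer(X @ w, w)
```

Repeating train-then-deflate yields a sequence of mutually orthogonal projection vectors, which is what allows the normal vectors to be stacked into a feature-extraction transform.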
Abstract-Error-Correcting Output Codes (ECOC) represent a common way to model multi-class classification problems. According to this state-of-the-art technique, a multi-class problem is decomposed into several binary ones. Additionally, within the ECOC framework we can apply the subclass technique (sub-ECOC), where, by splitting the initial classes of the problem, we create larger but easier-to-solve ECOC configurations. The multi-class problem's decomposition is achieved via a discriminant tree creation procedure, which is controlled by a triplet of thresholds that define a set of user-defined splitting criteria. The selection of these thresholds plays a major role in classification performance. In our work we show that optimizing these thresholds via particle swarm optimization significantly improves classification performance. Moreover, using Support Vector Machines (SVMs) as classifiers, we can optimize at the same time both the thresholds of sub-ECOC and the parameters C and σ of the SVMs, resulting in even better classification performance. Extensive experiments on both real and artificial data illustrate the superiority of the proposed approach in terms of performance.