2001
DOI: 10.1109/72.950134

Weighted least squares training of support vector classifiers leading to compact and adaptive schemes

Abstract: An iterative block training method for support vector classifiers (SVCs) based on weighted least squares (WLS) optimization is presented. The algorithm, which minimizes structural risk in the primal space, is applicable to both linear and nonlinear machines. In some nonlinear cases, it is necessary to first project the data onto an intermediate-dimensional space by means of either principal component analysis or clustering techniques. The proposed approach yields very compact machines, the comple…
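To make the abstract's idea concrete, below is a minimal sketch of an IRWLS-style training loop for a linear SVC: each pass converts the hinge-loss margin errors into per-sample weights and solves the resulting weighted least-squares problem in the primal space. The weighting rule, the weight cap, and all parameter names are illustrative assumptions, not the exact block algorithm of the paper.

```python
import numpy as np

def irwls_linear_svc(X, y, C=1.0, n_iter=50, tol=1e-6, max_weight=1e6):
    """IRWLS-style training of a linear SVC in the primal space (sketch).

    X: (n, d) data matrix; y: (n,) labels in {-1, +1}.
    Each iteration approximates the hinge loss by a weighted quadratic
    and solves the weighted least-squares problem exactly.
    """
    n, d = X.shape
    theta = np.zeros(d + 1)                    # stacked [w; b]
    Xb = np.hstack([X, np.ones((n, 1))])       # append a bias column
    for _ in range(n_iter):
        e = 1.0 - y * (Xb @ theta)             # margin errors
        # a_i = 2C/e_i makes the quadratic match the hinge loss at the
        # current e_i; non-violators (e_i <= 0) get zero weight, and a
        # cap keeps the weights from blowing up as e_i -> 0.
        a = np.where(e > 0,
                     np.minimum(2.0 * C / np.maximum(e, 1e-12), max_weight),
                     0.0)
        if not np.any(a):                      # all margins satisfied
            break
        H = Xb.T @ (a[:, None] * Xb)           # weighted normal equations
        H[:d, :d] += np.eye(d)                 # ridge term from (1/2)||w||^2
        theta_new = np.linalg.solve(H + 1e-10 * np.eye(d + 1), Xb.T @ (a * y))
        if np.linalg.norm(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta[:d], theta[d]                 # weights w, bias b
```

A nonlinear machine can be obtained, as the abstract suggests, by first projecting the data onto an intermediate-dimensional representation (e.g., PCA scores or cluster-based features) and running the same loop on that representation.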

Cited by 75 publications (49 citation statements)
References 20 publications
“…An important difference with pruning methods in classical neural networks (Bishop, 1995; Hassibi & Stork, 1993; Le Cun, Denker, & Solla, 1990), e.g., optimal brain damage and optimal brain surgeon, is that in the LS-SVM pruning procedure no inverse of a Hessian matrix has to be computed. The LS-SVM pruning procedure can also be related to Interior Point and IRWLS methods for SVMs (Navia-Vázquez et al., 2001; Smola, 1999), where a linear system of the same form as (10) is solved in each iteration step until the conditions for optimality and the resulting sparseness property of the SVM are obtained. In each step of the IRWLS solution the whole training set is still taken into account, and the sparse SVM solution is obtained after convergence.…”
Section: Sparse Approximation Using LS-SVMs
confidence: 99%
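For context, the pruning procedure this excerpt refers to amounts to repeatedly solving one symmetric linear system and discarding the samples with the smallest support values. A minimal sketch follows, assuming one common form of the LS-SVM system; the kernel helper, drop fraction, and round count are illustrative assumptions.

```python
import numpy as np

def lssvm_solve(K, y, gamma=1.0):
    """Solve an LS-SVM linear system of the form
        [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    with a single symmetric solve; no Hessian inverse is required."""
    n = len(y)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(M, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]                    # bias b, support values alpha

def prune_lssvm(X, y, kernel, gamma=1.0, drop_frac=0.05, rounds=5):
    """Sparsify an LS-SVM by dropping the least influential samples
    (smallest |alpha|) and re-solving, round by round."""
    idx = np.arange(len(y))
    for _ in range(rounds):
        b, alpha = lssvm_solve(kernel(X[idx], X[idx]), y[idx], gamma)
        n_drop = int(drop_frac * len(idx))
        idx = idx[np.argsort(np.abs(alpha))[n_drop:]]   # keep largest |alpha|
    b, alpha = lssvm_solve(kernel(X[idx], X[idx]), y[idx], gamma)
    return idx, b, alpha
```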
“…The QP problem of the corresponding SVM formulation is typically solved by Interior Point (IP) methods (Cristianini & Shawe-Taylor, 2000; Smola, 1999), Sequential Minimal Optimization (SMO) (Platt, 1998), and iteratively reweighted least squares (IRWLS) approaches (Navia-Vázquez et al., 2001), while LS-SVMs (Suykens & Vandewalle, 1999b; Suykens et al., 2002; Van Gestel et al., 2001; Viaene et al., 2001) result in a set of linear equations. Efficient iterative methods for solving large-scale linear systems are available in numerical linear algebra (Golub & Van Loan, 1989).…”
Section: Introduction
confidence: 99%
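As an illustration of the "efficient iterative methods" this excerpt points to, a textbook conjugate-gradient solver needs only matrix-vector products, so a large kernel matrix never has to be factorized. This is a generic sketch, not a method from the cited works; in practice the LS-SVM system is first transformed into a symmetric positive-definite one before such a solver is applied.

```python
import numpy as np

def conjugate_gradient(matvec, b, tol=1e-8, max_iter=1000):
    """Textbook conjugate gradient for a symmetric positive-definite
    system A x = b, given only the matrix-vector product A @ v."""
    x = np.zeros_like(b)
    r = b - matvec(x)                 # initial residual
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        step = rs / (p @ Ap)
        x = x + step * p
        r = r - step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p     # new conjugate search direction
        rs = rs_new
    return x

# Example: solve the regularized kernel system (K + I/gamma) alpha = y
# without ever factorizing K:
#   alpha = conjugate_gradient(lambda v: K @ v + v / gamma, y)
```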
“…This is one of the main computational problems of these algorithms that prevent their application in very large speech databases. However, some solutions are already being developed [36-38].…”
Section: Is a Nonlinear Function Which Maps Vector
confidence: 99%
“…The idea can also be extended easily to trained fusion schemes based on other classifiers. It is worth noting that sequential algorithms to solve the SVM optimization problem in (2), (3) have already been proposed (Navia-Vázquez et al., 2001), and can be used to extend the proposed idea, first constructing the user-independent solution and then refining it by incorporating the local data.…”
Section: Global, Local and Adapted Fusion Schemes
confidence: 99%
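To illustrate the adaptation idea in this last excerpt, here is a hypothetical sketch in which a user-independent linear SVC (w, b) is refined with user-local data via warm-started subgradient descent on the regularized hinge loss. This is a generic stand-in for the sequential algorithm cited above, and every name and hyperparameter here is an assumption.

```python
import numpy as np

def refine_with_local_data(w, b, X_local, y_local, C=1.0, lr=0.01, epochs=20):
    """Adapt a pre-trained global linear SVC (w, b) to user-local data
    by a few warm-started passes of hinge-loss subgradient descent.
    A hypothetical illustration, not the algorithm from the cited paper."""
    n = len(y_local)
    for _ in range(epochs):
        for i in np.random.permutation(n):
            margin = y_local[i] * (X_local[i] @ w + b)
            grad_w = w / n                     # from the (1/2)||w||^2 term
            grad_b = 0.0
            if margin < 1.0:                   # hinge-loss subgradient
                grad_w = grad_w - C * y_local[i] * X_local[i]
                grad_b = grad_b - C * y_local[i]
            w = w - lr * grad_w
            b = b - lr * grad_b
    return w, b
```

Warm-starting from the global solution, rather than retraining from scratch, is what keeps the refinement cheap when the user-local dataset is small.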