2006
DOI: 10.1214/088342306000000493

Support Vector Machines with Applications

Abstract: Support vector machines (SVMs) appeared in the early nineties as optimal margin classifiers in the context of Vapnik's statistical learning theory. Since then SVMs have been successfully applied to real-world data analysis problems, often providing improved results compared with other techniques. The SVMs operate within the framework of regularization theory by minimizing an empirical risk in a well-posed and consistent way. A clear advantage of the support vector approach is that sparse solutions to classific…
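
For orientation, the optimization problem the abstract alludes to can be written in its standard soft-margin form. The notation below follows the common SVM literature and is an assumed reconstruction, not text from the paper:

```latex
% Soft-margin SVM primal in regularization form (standard textbook
% statement; notation assumed, not quoted from the paper).
\min_{w,\,b,\,\xi}\quad \frac{1}{2}\|w\|^{2} + C\sum_{i=1}^{n}\xi_{i}
\quad\text{s.t.}\quad
y_{i}\bigl(\langle w,\phi(x_{i})\rangle + b\bigr)\ \ge\ 1-\xi_{i},
\qquad \xi_{i}\ \ge\ 0,\quad i=1,\dots,n.
```

Sparsity arises because only the training points with active constraints (the support vectors) carry nonzero weight in the resulting decision function.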

Cited by 215 publications (124 citation statements), with citing publications spanning 2009–2024. References: 68 publications.
“…In particular, the values ε = 0.0001, σ = 40 and C = 1.8 have been chosen for the SVM implementation. Additionally, in this application, the values for the corresponding dimensions in the SVM model are n = 1, m = ∞ (the dimension induced by the Gaussian kernel; see Moguerza and Muñoz 2006) and p = 34, that is, the number of data points within each set. We should note here again that we use the same set of parameter values for all the data sets.…”
Section: Results (mentioning; confidence: 99%)
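
As a hedged illustration of how the quoted parameter values could be wired up: assuming the citing paper runs ε-insensitive support vector regression with the Gaussian kernel K(x, y) = exp(−‖x−y‖²/(2σ²)), the settings map onto scikit-learn's SVR with gamma = 1/(2σ²). The data below are placeholders; only ε, σ, C, n = 1 and p = 34 come from the quote:

```python
# Illustrative sketch only: an SVR configured with the values quoted
# above (epsilon = 0.0001, sigma = 40, C = 1.8). The kernel
# parameterization and the toy data are assumptions, not from the paper.
import numpy as np
from sklearn.svm import SVR

sigma = 40.0
model = SVR(
    kernel="rbf",
    C=1.8,
    epsilon=1e-4,
    gamma=1.0 / (2.0 * sigma**2),  # assumes K(x, y) = exp(-||x-y||^2 / (2 sigma^2))
)

# Placeholder data matching the quoted dimensions: n = 1 input
# dimension, p = 34 points per set.
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 10.0, size=(34, 1))
y = np.sin(X).ravel() + rng.normal(0.0, 0.1, size=34)

model.fit(X, y)
print("support vectors:", model.support_.size)
```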
“…It can be shown (Moguerza and Muñoz 2006) that the regularization problem can be formulated as a convex quadratic optimization problem (therefore, without local minima) of the form:…”
Section: Geometrical Interpretation of Support Vector Machines (mentioning; confidence: 99%)
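
The quoted formulation is truncated in this excerpt. For reference, the standard convex quadratic (dual) problem meant here has the following textbook form; this is a reconstruction from the general SVM literature, not the citing paper's exact display:

```latex
% Standard soft-margin SVM dual QP (textbook form, reconstructed).
\max_{\alpha}\quad \sum_{i=1}^{n}\alpha_{i}
 - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}
   \alpha_{i}\alpha_{j}\,y_{i}y_{j}\,K(x_{i},x_{j})
\quad\text{s.t.}\quad
0\ \le\ \alpha_{i}\ \le\ C,\qquad \sum_{i=1}^{n}\alpha_{i}y_{i}=0.
```

Convexity follows because the kernel matrix K(x_i, x_j) is positive semidefinite, so the objective has no local minima that are not global.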
“…This geometrical optimization problem can be written as a convex quadratic optimization problem with linear constraints, in principle solvable by any nonlinear optimization procedure. See also [18,40,108,177,113] for introductory surveys on SVMs.…”
Section: Introduction (mentioning; confidence: 99%)
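
To make the "solvable by any nonlinear optimization procedure" point concrete, here is a minimal sketch that hands the soft-margin SVM dual to a general-purpose constrained optimizer (SciPy's SLSQP) rather than a dedicated QP solver. The toy data and the C value are assumptions for illustration:

```python
# Sketch: the soft-margin SVM dual solved as a generic constrained
# optimization problem. Data and C are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Two Gaussian blobs as a toy binary classification problem.
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
n = len(y)

C = 1.0                                   # regularization constant (assumed)
K = X @ X.T                               # linear-kernel Gram matrix
Q = (y[:, None] * y[None, :]) * K         # Q_ij = y_i y_j K(x_i, x_j)

def neg_dual(alpha):
    # Negated dual objective: minimize -(sum(alpha) - 0.5 * alpha' Q alpha).
    return 0.5 * alpha @ Q @ alpha - alpha.sum()

res = minimize(
    neg_dual,
    x0=np.zeros(n),
    method="SLSQP",
    bounds=[(0.0, C)] * n,                               # 0 <= alpha_i <= C
    constraints={"type": "eq", "fun": lambda a: a @ y},  # sum_i alpha_i y_i = 0
)
alpha = res.x

# Recover the primal solution; the bias estimate averages over all
# support vectors and is rough (exact only for unbounded ones).
w = (alpha * y) @ X
sv = alpha > 1e-6
b = np.mean(y[sv] - X[sv] @ w)
print("support vectors:", sv.sum(), "of", n)
```

Dedicated QP solvers exploit the convex quadratic structure and scale far better, but the quote's point stands: the constraints are linear and the objective convex, so any off-the-shelf constrained optimizer reaches the global optimum.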