2004
DOI: 10.1023/b:stco.0000035301.49549.88
A tutorial on support vector regression

Abstract: In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from an SV perspective.
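As a concrete illustration of the ε-SV regression formulation the tutorial surveys, the following minimal sketch uses scikit-learn's `SVR` (a library implementation, not the authors' own code); the data and parameter values are illustrative assumptions.

```python
# Minimal epsilon-SV regression sketch on synthetic 1-D data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# epsilon sets the width of the tube within which errors are ignored;
# C trades off flatness of the function against deviations beyond epsilon.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)

# Only points lying outside (or on) the epsilon-tube become support vectors.
print(len(model.support_), "support vectors out of", len(X), "samples")
```

Shrinking `epsilon` tightens the tube and typically increases the number of support vectors, since more training points fall outside it.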

Cited by 9,966 publications (5,529 citation statements)
References 133 publications
“…For the prediction of brain age, we established the framework using a support vector regression (SVR) algorithm (Smola & Schölkopf, 2004) because of its desirable characteristics and easy computation in a high‐dimensional feature space. Estimation models based on the SVR algorithm have been widely used in neuroimaging studies (Dosenbach et al., 2011; Erus et al., 2015; Koutsouleris et al., 2014; Lancaster et al., 2018).…”
Section: Methods
confidence: 99%
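A minimal sketch of the kind of SVR-based prediction pipeline this excerpt describes, assuming scikit-learn; the dataset, feature counts, and target here are synthetic stand-ins, not neuroimaging data.

```python
# High-dimensional regression with SVR: many more features than samples.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_subjects, n_features = 80, 1000   # p >> n, as with voxelwise features
X = rng.normal(size=(n_subjects, n_features))
# Synthetic "age" target driven by a few of the features.
age = X[:, :5].sum(axis=1) * 3 + 40 + rng.normal(scale=1.0, size=n_subjects)

# A linear kernel is a common choice when features far outnumber samples.
pipe = make_pipeline(StandardScaler(), SVR(kernel="linear", C=1.0))
scores = cross_val_score(pipe, X, age, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", scores)
```

Standardizing features before the SVR matters here, since the ε-tube and the penalty C are applied on the raw target scale while the kernel operates on the feature scale.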
“…More information about the theoretical background of the SVR can be found in (Schölkopf et al., 1997; Smola & Schölkopf, 2004; Vapnik, 2013; Welling, 2004).…”
Section: Methods
confidence: 99%
“…The formulation embodies the Structural Risk Minimization (SRM) principle, which has been shown to be superior to the traditional Empirical Risk Minimization (ERM) principle employed by conventional neural networks. SRM minimizes an upper bound on the generalization error (a bound that depends on the VC dimension), whereas ERM minimizes only the error on the training data.…”
Section: Support Vector Machine
confidence: 99%
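The SRM principle this excerpt invokes can be made precise with Vapnik's standard bound: for a hypothesis class of VC dimension $h$ and $n$ training samples, with probability at least $1 - \delta$,

```latex
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\delta}{4}}{n}}
```

ERM minimizes only the empirical risk $R_{\mathrm{emp}}(f)$, while SRM minimizes the whole right-hand side, trading training error against the capacity term that grows with $h$.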
“…This study is motivated by a growing popularity of support vector machines (SVM) for regression problems [3, 6–14]. Their practical successes can be attributed to solid theoretical foundations based on VC theory [13, 14], since SVM generalization performance does not depend on the dimensionality of the input space.…”
Section: Introduction
confidence: 99%