2000
DOI: 10.1017/cbo9780511801389
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

Citations: Cited by 10,420 publications (6,881 citation statements)
References: 0 publications
“…However, the loss function used by the ε-insensitive SVM only penalizes errors greater than a threshold ε. This leads to a sparse representation of the decision rule, giving significant algorithmic and representational advantages. 10 On the other hand, the ridge regression (ε = 0) used by LS-SVM typically causes the loss of this sparse representation.…”
Section: Introduction (mentioning)
confidence: 99%
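The contrast drawn in that excerpt between the ε-insensitive loss and ridge regression can be made concrete with a small sketch. The snippet below is an illustrative example, not taken from the citing paper: the data, kernel, and hyperparameters are assumptions. It fits scikit-learn's SVR and KernelRidge to the same synthetic curve and counts how many training points end up carrying the decision rule.

```python
# A minimal sketch (assumed data and settings) contrasting the sparseness of
# epsilon-insensitive SVR with ridge-style kernel regression.
import numpy as np
from sklearn.svm import SVR
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, size=(200, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

# Errors smaller than epsilon are not penalised, so many training points
# drop out of the decision rule and only the support vectors remain.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
print("SVR support vectors:", len(svr.support_), "of", len(X))

# Kernel ridge regression (the epsilon = 0, squared-loss case) keeps a
# nonzero dual coefficient for essentially every training point.
krr = KernelRidge(kernel="rbf", alpha=0.1).fit(X, y)
print("Kernel ridge nonzero dual coefficients:",
      int(np.sum(np.abs(krr.dual_coef_) > 1e-8)), "of", len(X))
```

In the SVR case only the points lying outside the ε-tube appear in the fitted model, which is the algorithmic advantage the quoted passage refers to.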
“…For the classifier, we have used the well-known support vector machine (SVM) (Cristianini & Shawe-Taylor, 2000). It is a two-class classifier which aims at finding the hyperplane that separates the training patterns of the two classes by maximizing the distance between the hyperplane and the two classes.…”
Section: Classification System (mentioning)
confidence: 99%
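As a brief illustration of the maximum-margin hyperplane described in that excerpt, the sketch below uses synthetic two-class data and assumed parameters (it is not the classification system of the citing paper). It trains a linear SVC and reports the separating hyperplane, its margin width 2/||w||, and the support vectors that define it.

```python
# A minimal sketch (illustrative data only) of a two-class SVM: a linear SVC
# finds the hyperplane w.x + b = 0 that maximises the margin between classes.
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters as stand-in training patterns.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=-2.0, size=(50, 2)),
               rng.normal(loc=+2.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
margin = 2.0 / np.linalg.norm(w)   # geometric margin width = 2 / ||w||
print("hyperplane normal w:", w, "offset b:", b)
print("margin width:", margin)
print("number of support vectors:", len(clf.support_vectors_))
```

Only the support vectors, the training patterns closest to the hyperplane, determine the solution; maximizing the margin is what distinguishes the SVM from an arbitrary separating hyperplane.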
“…Extensive reviews of SVM methods can be found elsewhere (Bennett et al., 2000; Cristianini and Shawe-Taylor, 2000). The SVM method has a number of interesting properties, including an effective avoidance of overfitting, which improves its ability to build models using large numbers of molecular property descriptors with relatively few experimental results in the training set.…”
Section: Support Vector Machine (SVM) Modeling (mentioning)
confidence: 99%
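The overfitting-avoidance property mentioned in that excerpt, building models from many descriptors with relatively few training examples, can be sketched as follows. The example below uses synthetic data and assumed settings (it is not the modeling pipeline of the citing paper) to cross-validate a linear SVM in a regime where the descriptors far outnumber the samples.

```python
# A minimal sketch (synthetic data, assumed settings) of fitting an SVM on
# many molecular-style descriptors with relatively few training examples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 60 "compounds" described by 500 descriptors, only a handful informative.
X, y = make_classification(n_samples=60, n_features=500, n_informative=10,
                           n_redundant=0, random_state=0)

# Margin maximisation acts as regularisation, so the fit need not collapse
# even though the descriptors outnumber the samples; cross-validation
# estimates how well the model generalises.
model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print("5-fold CV accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```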