2009
DOI: 10.1016/j.eswa.2008.09.066

Least squares twin support vector machines for pattern classification

Cited by 545 publications (40 citation statements)
References 9 publications
Citation statements by type: 2 supporting, 38 mentioning, 0 contrasting
“…Their difference is that TSVM uses a simpler formulation than GEPSVM, and the former can be solved via merely two QP problems. Our results align with the finding in Kumar and Gopal [59], which says "generalization performance of TSVM is better than GEPSVM and conventional SVM". Nevertheless, Ding et al. [60] claimed that TSVM has lower generalization ability, so it is too early to draw a conclusion about the classification performance of TSVM before more rigorous tests are carried out.…”
Section: Discussion (supporting)
confidence: 80%
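For reference, the "two QP problems" mentioned in these excerpts are, in the standard linear TWSVM notation, the following pair (a sketch of the usual formulation, not a quotation from the citing papers; $A$ and $B$ are the data matrices of the two classes, $e_1$, $e_2$ are vectors of ones, and $c_1, c_2 > 0$ are penalty parameters):

$$\min_{w_1, b_1, \xi} \ \tfrac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1 e_2^{\top}\xi \quad \text{s.t.} \quad -(B w_1 + e_2 b_1) + \xi \ge e_2, \ \xi \ge 0,$$

$$\min_{w_2, b_2, \eta} \ \tfrac{1}{2}\|B w_2 + e_2 b_2\|^2 + c_2 e_1^{\top}\eta \quad \text{s.t.} \quad (A w_2 + e_1 b_2) + \eta \ge e_1, \ \eta \ge 0.$$

Each QP constrains only the samples of one class, which is why the pair is cheaper to solve than the single large QP of a conventional SVM.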
“…The difference between them is that TSVM uses a simpler formulation than GEPSVM, and the former can be solved via merely two QP problems. Our results align with the finding in Kumar and Gopal [50], which says "generalization performance of TSVM is better than GEPSVM and conventional SVM". In the following experiments, TSVM is the default classifier…”
Section: Classifier Comparison (supporting)
confidence: 80%
“…Therefore, TWSVM works faster than the standard SVM. Subsequently, many extensions of TWSVM have been proposed, including the improved TWSVM (TBSVM) [13], the least squares TWSVM (LS-TWSVM) [14–17], the nonparallel plane proximal classifier (NPPC) [18], smooth TWSVM [19], a geometric algorithm [20], and twin support vector regression (TWSVR) [21]. TWSVM has also been extended to multi-class classification [22–24].…”
Section: Introduction (mentioning)
confidence: 99%
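Since LS-TWSVM is the indexed paper here, a minimal sketch of its linear training step may be useful: the two TWSVM QPs above are replaced by two systems of linear equations in $[w; b]$. The NumPy code and the function names `lstsvm_train` and `lstsvm_predict` below are illustrative assumptions, not code taken from any of the cited papers.

```python
import numpy as np

def lstsvm_train(A, B, c1=1.0, c2=1.0):
    # Linear LS-TWSVM sketch: the two QPs of TWSVM become two linear
    # systems, one per nonparallel plane (hypothetical implementation).
    E = np.hstack([A, np.ones((A.shape[0], 1))])   # augmented matrix [A  e1]
    F = np.hstack([B, np.ones((B.shape[0], 1))])   # augmented matrix [B  e2]
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    # Plane 1 (close to class +1): [w1; b1] = -(F'F + (1/c1) E'E)^{-1} F' e2
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * E.T @ E, F.T @ e2)
    # Plane 2 (close to class -1): [w2; b2] =  (E'E + (1/c2) F'F)^{-1} E' e1
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * F.T @ F, E.T @ e1)
    return z1.ravel(), z2.ravel()

def lstsvm_predict(X, z1, z2):
    # Assign each sample to the class whose plane it lies closer to.
    w1, b1 = z1[:-1], z1[-1]
    w2, b2 = z2[:-1], z2[-1]
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

A small ridge term (for example, adding a multiple of the identity to each coefficient matrix) can be included if the systems turn out to be ill conditioned.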
“…For binary classification, if the loss function is the hinge loss, then the framework becomes TWSVM [12] or TBSVM [13] with different parameters; if the loss function is the square loss, then the framework is LS-TWSVM [14]; and if the loss function is a convex combination of the linear and square losses, then the framework is NPPC [18]. In fact, we can also obtain smooth TWSVM [19] by replacing the 2-norm with the 1-norm in the framework.…”
Section: Introduction (mentioning)
confidence: 99%
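As a rough illustration of the framework described in this excerpt (notation as in the TWSVM sketch above; the exact parametrization in the citing paper may differ), the first-plane problem can be written with a generic loss $L$:

$$\min_{w_1, b_1} \ \tfrac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1 \sum_i L\big(1 + (B w_1 + e_2 b_1)_i\big),$$

where $L(u) = \max(u, 0)$ (hinge) recovers TWSVM, $L(u) = u^2$ (square) recovers LS-TWSVM, and a convex combination $L(u) = \lambda\,\max(u, 0) + (1 - \lambda)\,u^2$ corresponds to an NPPC-type classifier; TBSVM additionally includes a regularization term on $(w_1, b_1)$.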