2013
DOI: 10.1016/j.patcog.2012.06.019
Robust twin support vector machine for pattern classification

Cited by 273 publications (77 citation statements)
References 30 publications
“…which acts as the decision boundary and is evaluated using the function Φ that maps x to a higher-dimensional space [14]. The distance is maximized for the set of data points on the training set consistent with the hyperplane characterized by (w, b).…”
Section: Support Vector Machines With Kernel Evaluation
confidence: 99%
“…which acts as the decision boundary and is evaluated using the function Φ that maps x to the space S, which is in a higher dimension [14]. The distance is maximized for the set of data points … The solution to the above problem is established using the Lagrangian formulation and it is shown that …”
Section: Support Vector Machines With Kernel Evaluation
confidence: 99%
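The kernelized decision function these excerpts describe can be sketched in a few lines. The support vectors, dual coefficients, bias, and the RBF kernel choice below are illustrative assumptions for the sketch, not values from the paper:

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # K(x, z) = exp(-gamma * ||x - z||^2); this kernel corresponds to an
    # implicit feature map Phi into a higher-dimensional space, so the
    # decision boundary never needs Phi explicitly
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

def decision(x, support_vectors, alphas, labels, b, gamma=1.0):
    # f(x) = sum_i alpha_i * y_i * K(x_i, x) + b; sign(f(x)) is the class
    s = sum(a * y * rbf_kernel(sv, x, gamma)
            for a, y, sv in zip(alphas, labels, support_vectors))
    return s + b

# hypothetical support set (hand-picked for illustration)
svs = [(0.0, 0.0), (2.0, 2.0)]
alphas = [1.0, 1.0]
labels = [-1, +1]
b = 0.0

print(decision((1.9, 2.1), svs, alphas, labels, b) > 0)  # near the +1 support vector
```

The sign of `decision` flips as the test point moves from one support vector's neighborhood to the other's, which is exactly the sign(w·Φ(x) + b) rule the excerpt refers to.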
“…If the ε-insensitive loss function is utilized, errors between −ε and +ε are ignored. If C = Inf is set, the regression curve will follow the training data inside the margin, which is determined by (Qi, Tian & Shi, 2013). The related equation can be observed in (6).…”
Section: SVR (Support Vector Regression)
confidence: 99%
“…There are many methods of constructing robust SVMs, such as [128,129]. Robust TWSVM (R-TWSVM) [130], formulated via second-order cone programming (SOCP) for classification, is an improved extension of TWSVM: its dual problems involve only inner products of the inputs, so the kernel trick can be applied directly in the nonlinear case and matrix inverses are no longer needed. Prior knowledge in the form of multiple polyhedral sets was incorporated into the linear TWSVM and LSTWSVM, yielding knowledge-based TWSVM (KBTWSVM) and knowledge-based LSTWSVM (KBLSTWSVM), formulated in [131].…”
Section: Knowledge-based TWSVMs
confidence: 99%
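The TWSVM decision rule underlying this excerpt assigns a test point to the class whose (nonparallel) hyperplane lies nearer. A minimal sketch with two hypothetical, hand-picked hyperplanes (not solved via the SOCP formulation of [130]):

```python
import math

def plane_distance(w, b, x):
    # perpendicular distance from point x to the hyperplane w . x + b = 0
    dot = sum(wi * xi for wi, xi in zip(w, x))
    norm = math.sqrt(sum(wi * wi for wi in w))
    return abs(dot + b) / norm

def twsvm_predict(x, pos_plane, neg_plane):
    # TWSVM fits one hyperplane close to each class; a test point takes
    # the label of whichever hyperplane is nearer
    (w1, b1), (w2, b2) = pos_plane, neg_plane
    return +1 if plane_distance(w1, b1, x) <= plane_distance(w2, b2, x) else -1

# hypothetical planes: y = x for the positive class, x + y = 4 for the negative
pos_plane = ([1.0, -1.0], 0.0)
neg_plane = ([1.0, 1.0], -4.0)
print(twsvm_predict((1.0, 1.0), pos_plane, neg_plane))  # -> 1
```

In the actual R-TWSVM, the two planes come from a pair of SOCP problems whose duals use only inner products of inputs, which is what lets the kernel trick replace `dot` above without ever inverting a matrix.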