2020
DOI: 10.1016/j.asoc.2020.106142

NPrSVM: Nonparallel sparse projection support vector machine with efficient algorithm

Cited by 12 publications (8 citation statements)
References 48 publications
“…To derive the model of MPMSVC, we first enhance it in the least-squares sense by replacing the equality constraints in TPMSVM with inequalities, which yields an efficient learning procedure. Besides, to reduce the impact of outliers, the L1-norm metric [31,39–41] is further adopted in MPMSVC, which also improves the flexibility of the model. For this purpose, our MPMSVC considers the following linear L1-norm loss function for each cluster plane-center…”
Section: A Model Formulation
confidence: 99%
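The motivation for the L1-norm loss in the quote above can be illustrated numerically. The sketch below (not the paper's model; the residual values are made up for illustration) shows why a single outlier dominates a squared (L2) objective far more than an L1 objective:

```python
import numpy as np

# Hypothetical residuals of points from a fitted cluster plane-center;
# the last entry is an outlier. L1 loss grows linearly in the residual,
# squared (L2) loss grows quadratically.
residuals = np.array([0.1, -0.2, 0.15, -0.05, 5.0])

l1_loss = np.sum(np.abs(residuals))   # sum |r_i|
l2_loss = np.sum(residuals ** 2)      # sum r_i^2

# Fraction of each objective contributed by the outlier alone.
outlier_share_l1 = np.abs(residuals[-1]) / l1_loss
outlier_share_l2 = residuals[-1] ** 2 / l2_loss

print(f"L1 loss: {l1_loss:.2f}, outlier share: {outlier_share_l1:.1%}")
print(f"L2 loss: {l2_loss:.2f}, outlier share: {outlier_share_l2:.1%}")
```

Under the squared loss the outlier accounts for nearly the entire objective, so the fitted plane-center is pulled toward it; under the L1 loss its influence is much smaller, which is the robustness property the citing authors exploit.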
“…Similar to the linear case, we optimize problem (40) by solving a series of convex unconstrained quadratic problems (41) iteratively until convergence. That is, the iterate u_k^{t+1} is updated by…”
Section: Nonlinear Extension
confidence: 99%
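The iterative scheme described above (solve a convex unconstrained quadratic subproblem, update the iterate, repeat until convergence) follows the familiar iteratively-reweighted-least-squares pattern. The sketch below is not the paper's algorithm; it is a generic IRLS loop for an L1-type objective min_u ||Au − b||_1, where each iteration solves a weighted quadratic problem whose weights come from the previous iterate:

```python
import numpy as np

def irls_l1(A, b, n_iter=100, tol=1e-6, eps=1e-8):
    """Generic IRLS sketch for min_u ||A u - b||_1 (illustrative only)."""
    u = np.linalg.lstsq(A, b, rcond=None)[0]  # plain least-squares start
    for _ in range(n_iter):
        # Weights from the current iterate u_t; eps guards division by zero.
        w = 1.0 / np.maximum(np.abs(A @ u - b), eps)
        W = np.diag(w)
        # Convex unconstrained quadratic subproblem: weighted normal equations.
        u_next = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
        if np.linalg.norm(u_next - u) < tol:  # stop once iterates stabilize
            return u_next
        u = u_next
    return u

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 3))
u_true = np.array([1.0, -2.0, 0.5])
b = A @ u_true
u_hat = irls_l1(A, b)
print(np.round(u_hat, 3))
```

Each subproblem is quadratic and hence cheap to solve exactly, which is why such iteration-until-convergence schemes are attractive for nonsmooth L1-type objectives.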
“…It has been widely used with high precision in many different studies in the literature. SVM is a maximum-margin learning approach [24]. It seeks the optimal decision boundary by maximizing the margin between parallel support hyperplanes.…”
Section: Support Vector Machines
confidence: 99%
“…It tries to find the optimal decision boundary by maximizing the margin between parallel support hyperplanes. It finds the global optimum because its objective function is quadratic and all of its constraints are linear [24]. Non-linear classification is handled successfully through the kernel functions used in the algorithm.…”
Section: Support Vector Machines
confidence: 99%
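The two properties quoted above can be demonstrated with a minimal sketch using scikit-learn's `SVC` (the toy XOR data and hyperparameters are illustrative choices, not from the cited work): a linear SVM solves a convex quadratic program and thus reaches a global optimum, but it cannot separate classes that are not linearly separable, whereas a kernel (here RBF) handles the non-linear case:

```python
import numpy as np
from sklearn.svm import SVC

# XOR data: the canonical example of a problem no linear boundary can solve.
X_xor = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_xor = np.array([0, 1, 1, 0])

# Linear max-margin SVM: global optimum of a QP, but limited to hyperplanes.
linear_svm = SVC(kernel="linear", C=1e3).fit(X_xor, y_xor)

# RBF-kernel SVM: same QP machinery in an implicit feature space.
rbf_svm = SVC(kernel="rbf", gamma=2.0, C=1e3).fit(X_xor, y_xor)

print("linear accuracy:", linear_svm.score(X_xor, y_xor))
print("rbf accuracy:", rbf_svm.score(X_xor, y_xor))
```

The linear model necessarily misclassifies at least one XOR point, while the kernelized model fits all four, which is the division of labor the citing authors describe.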