2019
DOI: 10.1142/s0219691319500048

Convergence rate of SVM for kernel-based robust regression

Abstract: Robust support vector (SV) regression was proposed to alleviate the performance deterioration caused by outliers; it is essentially a convex optimization problem associated with a non-convex loss function, so its performance cannot be analyzed by the usual convex analysis approach. For a robust SV regression algorithm containing two homotopy parameters, a non-convex method is developed with quasiconvex analysis theory, and an error estimate is given. An expl…
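The abstract does not specify the paper's robust loss or its two homotopy parameters. As an illustration of the general idea only, here is a minimal sketch of kernel regression with a Welsch-type non-convex loss, fitted by iteratively reweighted least squares; the loss, kernel, and all parameter values are hypothetical stand-ins, not the paper's method:

```python
import numpy as np

def gaussian_kernel(x, z, gamma=20.0):
    """Gaussian (RBF) kernel matrix for 1-D inputs."""
    return np.exp(-gamma * (x[:, None] - z[None, :]) ** 2)

def robust_kernel_regression(x, y, lam=0.01, sigma=1.0, gamma=20.0, n_iter=30):
    """Kernel regression under a Welsch-type robust (non-convex) loss,
    minimized by iteratively reweighted least squares (IRLS).
    Points with large residuals r get weight exp(-(r/sigma)^2) -> 0,
    so gross outliers are effectively ignored."""
    K = gaussian_kernel(x, x, gamma)
    n = len(y)
    # Initialize with ordinary (non-robust) kernel ridge regression.
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    for _ in range(n_iter):
        r = y - K @ alpha
        w = np.exp(-(r / sigma) ** 2)  # Welsch weights: downweight outliers
        # Weighted kernel ridge step: (W K + lam I) alpha = W y
        alpha = np.linalg.solve(w[:, None] * K + lam * np.eye(n), w * y)
    return alpha, K

# Usage: sine data with a few gross outliers injected.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(60)
y[::15] += 5.0  # gross outliers
alpha, K = robust_kernel_regression(x, y)
```

Each IRLS step solves a convex weighted problem, mirroring how non-convex robust losses are typically handled in practice.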

Cited by 7 publications (3 citation statements)
References 30 publications (23 reference statements)
“…The quadratic loss function is not a strongly convex function, we have relaxed the strong convexity assumption in the literature [26]. And our method may be extended to some online pairwise learning algorithms with non-convex loss functions, e.g., the robust loss function in [62].…”
Section: Further Discussion
confidence: 99%
“…In learning theory, we often assume the K-functionals satisfy certain decay, for example, we assume (see e.g. [53,55,56,62]).…”
Section: By (23) We Have
confidence: 99%
“…Support vector machines (SVMs) are well-known for their effectiveness in classification and regression [10,25,26]. Shuhua et al [27] developed a technique for assessing the error in kernel regularized regression using a non-convex loss function, which minimizes the negative impact of outliers on its performance. Despite the experience of radiologists, predicting infections using medical imaging is challenging due to the lack of detailed disease knowledge.…”
Section: Introduction
confidence: 99%