2018
DOI: 10.1109/tnnls.2016.2637351

Robust C-Loss Kernel Classifiers

Abstract: The correntropy-induced loss (C-loss) function has the nice property of being robust to outliers. In this paper, we study the C-loss kernel classifier with the Tikhonov regularization term, which is used to avoid overfitting. After using the half-quadratic optimization algorithm, which converges much faster than the gradient optimization algorithm, we find that the resulting C-loss kernel classifier is equivalent to an iteratively reweighted least squares support vector machine (LS-SVM). This relationship helps…
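The equivalence sketched in the abstract — half-quadratic optimization of the C-loss reducing to an iteratively reweighted LS-SVM-style solve — can be illustrated with a minimal sketch. This is not the paper's exact algorithm; the function name, the weighted ridge-style linear system, and the parameters `sigma` (kernel bandwidth of the loss), `lam` (Tikhonov weight), and `n_iter` are illustrative assumptions.

```python
import numpy as np

def closs_kernel_classifier(K, y, sigma=1.0, lam=0.1, n_iter=20):
    """Illustrative half-quadratic iteration for a C-loss kernel classifier.

    K: (n, n) kernel Gram matrix; y: labels in {-1, +1}.
    Each iteration solves a weighted, Tikhonov-regularized least-squares
    problem (an LS-SVM-style linear system) whose sample weights come
    from the correntropy kernel of the current residuals.
    """
    n = K.shape[0]
    alpha = np.zeros(n)
    for _ in range(n_iter):
        e = y - K @ alpha                       # residuals of the current fit
        w = np.exp(-e**2 / (2.0 * sigma**2))    # half-quadratic weights:
                                                # large residuals (outliers) get weight near 0
        W = np.diag(w)
        # weighted regularized least squares in the dual coefficients
        alpha = np.linalg.solve(K.T @ W @ K + lam * np.eye(n), K.T @ W @ y)
    return alpha
```

Points with large residuals receive exponentially small weights, which is the mechanism behind the loss's robustness to outliers: a gross outlier effectively drops out of the least-squares system instead of dominating it.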

Cited by 38 publications (15 citation statements). References 43 publications.
“…Moreover, in practice, the MCC can also be formulated as minimizing the following correntropy-induced loss (C-loss) function [36], [37]…”
Section: B. Maximum Correntropy Criterion
confidence: 99%
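The C-loss referenced in the statement above is commonly written as L(e) = β[1 − exp(−e²/(2σ²))], a bounded, smooth surrogate for the squared error; minimizing it is equivalent to maximizing correntropy. A minimal sketch (the scaling parameter `beta` and bandwidth `sigma` are the usual conventions, assumed here rather than taken from the cited text):

```python
import numpy as np

def c_loss(e, sigma=1.0, beta=1.0):
    """Correntropy-induced loss: beta * (1 - exp(-e^2 / (2 sigma^2))).

    Behaves like a (scaled) squared error for small residuals e, but
    saturates at beta for large ones, so outliers contribute a bounded
    penalty instead of a quadratic one.
    """
    return beta * (1.0 - np.exp(-np.asarray(e, dtype=float) ** 2
                                / (2.0 * sigma**2)))
```

The bound is what makes the criterion robust: doubling an already-huge residual barely changes its loss, unlike the squared error.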
“…Since (29) is not a convex function, it cannot be solved by a commonly used optimization method. According to the solution process in [23], we can effectively solve the optimization problem of nonconvex functions. (28)…”
Section: The Objective Function of CSRGELM
confidence: 99%
“…Since the correntropy-induced loss is a differentiable and smooth function, the gradient optimization algorithm can be employed [23]. However, the gradient-based optimization algorithm converges slowly, so we use the half-quadratic optimization algorithm to solve the optimization problem of CSRGELM.…”
Section: The Optimization of CSRGELM
confidence: 99%
“…The clipped square function (also known as the skipped-mean loss) was also used in [21] to estimate view relations, and in [14] to perform robust image restoration. Similar approaches have been taken for clipped loss functions, where they have been used for robust feature selection [9], regression [23,17], classification [19,16,22], and robust principal component analysis [18].…”
Section: Introduction
confidence: 99%
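The clipped square (skipped-mean) loss mentioned in the statement above is even simpler than the C-loss: squared error up to a threshold, constant beyond it. A minimal sketch (the threshold parameter `tau` is an assumed convention, not taken from the cited works):

```python
import numpy as np

def clipped_square(e, tau=1.0):
    """Clipped square / skipped-mean loss: min(e^2, tau^2).

    Quadratic for |e| <= tau, flat afterwards, so residuals beyond the
    threshold contribute a fixed penalty and zero gradient -- gross
    outliers are effectively skipped.
    """
    return np.minimum(np.asarray(e, dtype=float) ** 2, tau**2)
```

Unlike the C-loss, this clipping is non-smooth at |e| = tau, which is one reason smooth bounded losses like the C-loss are often preferred when gradient-based solvers are used.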