2020
DOI: 10.1016/j.acha.2019.09.001

Learning with correntropy-induced losses for regression with mixture of symmetric stable noise

Abstract: In recent years, correntropy and its applications in machine learning have been drawing continuous attention owing to its merits in dealing with non-Gaussian noise and outliers. However, theoretical understanding of correntropy, especially in the learning theory context, is still limited. In this study, we investigate correntropy based regression in the presence of non-Gaussian noise or outliers within the statistical learning framework. Motivated by the practical way of generating non-Gaussian noise or outlie…
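For intuition, below is a minimal sketch of the kind of estimator the abstract describes: linear regression fitted under a correntropy-induced loss on data corrupted by heavy-tailed Cauchy (symmetric 1-stable) noise. The loss form is the standard correntropy-induced loss from this literature; the gradient-descent routine, names, and toy data are illustrative assumptions, not the paper's method.

import numpy as np

rng = np.random.default_rng(0)

def correntropy_loss(residual, sigma):
    # l_sigma(t) = sigma^2 * (1 - exp(-t^2 / sigma^2)): quadratic for
    # small |t|, bounded by sigma^2 for large |t|, hence outlier-robust.
    return sigma**2 * (1.0 - np.exp(-residual**2 / sigma**2))

def mccr_fit(X, y, sigma=1.0, lr=0.1, n_iter=2000):
    # Linear model fitted by gradient descent on the empirical
    # correntropy risk (illustrative routine, not the paper's algorithm).
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        r = y - X @ w
        # d/dw mean l_sigma(y - Xw) = -(2/n) X^T [r * exp(-r^2/sigma^2)]
        grad = -2.0 * X.T @ (r * np.exp(-r**2 / sigma**2)) / len(y)
        w -= lr * grad
    return w

# Toy data: linear signal plus Cauchy (symmetric 1-stable) noise.
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.standard_cauchy(size=n)

print("MCCR estimate:", mccr_fit(X, y, sigma=1.0))
print("least squares:", np.linalg.lstsq(X, y, rcond=None)[0])

Under such heavy-tailed noise the least-squares estimate is erratic, while the bounded correntropy loss keeps the fit stable.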

Cited by 19 publications (9 citation statements)
References 54 publications

“…As a result, imposing stronger moment conditions may not help improve the established convergence rates of the estimator, implying the existence of an inherent bias in mean regression. These novel insights can help cement the theoretical correntropy framework developed recently in Feng et al. (2015), Feng and Ying (2020), and Feng and Wu (2020).…”
Section: New Insights Brought by This Study
confidence: 78%
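For reference, the correntropy-induced loss at the center of this framework, as standard in the MCCR literature (a reconstruction for context, not a quotation from the citing paper):

\ell_\sigma(t) \;=\; \sigma^2\left(1 - \exp\!\left(-\frac{t^2}{\sigma^2}\right)\right), \qquad t = y - f(x).

For $|t| \ll \sigma$ this behaves like the squared loss $t^2$; for $|t| \gg \sigma$ it saturates at $\sigma^2$, which is what yields robustness to outliers and, at the same time, the inherent bias in mean regression that the quoted statement points to.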
“…Therefore, under the zero-median assumption, $f$ is in fact the conditional median function, as the conditional mean function may not even be defined. According to Feng and Ying (2020), in this case MCCR can learn the conditional median function $f$ well, in the sense that $\mathcal{E}^{p_{\varepsilon|X}}(Y - f_{z,\sigma}(X)) \to \mathcal{E}^{p_{\varepsilon|X}}(Y - f(X))$ implies $f_{z,\sigma} \to f$ with a properly chosen fixed $\sigma$. Moreover, fast exponential-type convergence rates can be established.…”
Section: Learning with MCCR for Median Regression
confidence: 99%
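The quoted claim, that a properly chosen fixed $\sigma$ lets MCCR recover the conditional median under zero-median heavy-tailed noise, can be seen in a toy location-estimation sketch (a simplification of the regression setting; the grid search and all names are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(1)
sigma = 0.5  # a moderate, fixed scale parameter

# Zero-median Cauchy noise around location 3.0: the median is 3.0,
# while the mean does not exist.
y = 3.0 + rng.standard_cauchy(size=10_000)

# Minimize the empirical correntropy risk over candidate locations c.
grid = np.linspace(0.0, 6.0, 601)
risk = [np.mean(sigma**2 * (1 - np.exp(-(y - c)**2 / sigma**2))) for c in grid]
c_hat = grid[int(np.argmin(risk))]

print("MCCR location estimate:", c_hat)   # close to the median 3.0
print("sample median:", np.median(y))
print("sample mean  :", np.mean(y))       # erratic under Cauchy noise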
“…First, our distributed result provides the optimal rates by requiring a large robustness parameter σ. In practice, a moderate σ may be enough to ensure good learning performance in robust estimation, as shown by [17]. It is therefore of interest to investigate the convergence properties of the distributed version of algorithm (5) when σ is chosen as a constant or σ(N) → 0 as N → ∞.…”
Section: Discussion
confidence: 99%
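For concreteness, here is a generic divide-and-conquer sketch of the kind of distributed estimator the discussion refers to: split the sample across m machines, fit a local MCCR estimator on each shard, and average the coefficients. Algorithm (5) of the citing paper is not reproduced here, so this shows only the standard split-fit-average pattern, and every name is an assumption.

import numpy as np

rng = np.random.default_rng(2)

def mccr_fit(X, y, sigma=1.0, lr=0.1, n_iter=2000):
    # Local linear fit by gradient descent on the empirical
    # correntropy risk (same loss as in the sketch above).
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        r = y - X @ w
        w += lr * 2.0 * X.T @ (r * np.exp(-r**2 / sigma**2)) / len(y)
    return w

def distributed_mccr(X, y, m=10, sigma=1.0):
    # One-shot averaging over m equally sized shards.
    shards = np.array_split(np.arange(len(y)), m)
    return np.mean([mccr_fit(X[i], y[i], sigma) for i in shards], axis=0)

N = 5000
X = np.column_stack([np.ones(N), rng.normal(size=N)])
y = X @ np.array([1.0, 2.0]) + rng.standard_cauchy(size=N)

print("distributed MCCR:", distributed_mccr(X, y, m=10, sigma=1.0))

Whether such averaging preserves the optimal rates when σ is held constant, rather than grown with N, is exactly the open question the authors raise.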
“…The commonly used robust losses include the adaptive Huber loss [11], the gain function [12], minimum error entropy [13], and the exponential squared loss [14], among others. Among them, the Maximum Correntropy Criterion (MCC) is widely employed as an efficient alternative to ordinary least squares, which is suboptimal in non-Gaussian and non-linear signal-processing settings [15–19]. Recently, MCC has been studied extensively in the literature and is widely adopted for many learning tasks, e.g., wind power forecasting [20] and pattern recognition [19].…”
Section: Introduction
confidence: 99%