The 2011 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2011.6033473

Kernel adaptive filtering with maximum correntropy criterion

Abstract: Kernel adaptive filters have drawn increasing attention due to their advantages, such as universal nonlinear approximation with universal kernels and linearity and convexity in a Reproducing Kernel Hilbert Space (RKHS). Among them, the kernel least mean square (KLMS) algorithm deserves particular attention because of its simplicity and sequential learning approach. Like most conventional adaptive filtering algorithms, the KLMS adopts the mean square error (MSE) as the adaptation cost. However, the mer…
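Where the abstract describes swapping the KLMS's MSE cost for the maximum correntropy criterion, a minimal online sketch may help. This is our illustrative Python, assuming a Gaussian kernel throughout; the names kmc_train, eta, sigma_k, and sigma_c are hypothetical, not taken from the paper:

```python
import numpy as np

def gauss_kernel(x, y, sigma):
    """Gaussian kernel, a universal kernel for the RKHS mapping."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def kmc_train(U, d, eta=0.5, sigma_k=1.0, sigma_c=1.0):
    """Online KMC sketch (hypothetical names: eta = step size,
    sigma_k = kernel bandwidth, sigma_c = correntropy bandwidth)."""
    centers, coeffs = [], []
    for u, d_n in zip(U, d):
        # filter output: kernel expansion over previously stored centers
        y = sum(a * gauss_kernel(c, u, sigma_k)
                for a, c in zip(coeffs, centers))
        e = d_n - y
        # correntropy-induced gain: a plain KLMS/MSE step would use eta * e;
        # the exponential factor shrinks the step for outlier-sized errors
        coeffs.append(eta * np.exp(-e ** 2 / (2.0 * sigma_c ** 2)) * e)
        centers.append(u)
    return centers, coeffs
```

Setting the exponential factor to 1 recovers the plain KLMS update, which makes the relationship between the two algorithms explicit.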

Cited by 183 publications (87 citation statements; citing works published 2014–2024)
References 16 publications (18 reference statements)
“…Using the ideas of both kernel least mean square (KLMS) [10] and maximum correntropy criterion [8], kernel maximum correntropy (KMC) is introduced in supervised learning [9]. To maximize the error correntropy, we can use stochastic gradient ascent on the new cost function in the feature space.…”
Section: B. Correntropy Kernel Temporal Differences
confidence: 99%
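The stochastic gradient ascent this excerpt refers to can be written out explicitly. A sketch in our notation (not necessarily the quoted paper's), assuming a Gaussian correntropy kernel with bandwidth σ and absorbing constant factors into the step size η:

```latex
% Instantaneous (single-sample) correntropy of the prediction error
% e_n = d_n - \Omega_{n-1}^{\top}\varphi(u_n), Gaussian kernel, bandwidth \sigma:
J_n = \exp\!\left(-\frac{e_n^{2}}{2\sigma^{2}}\right)
% Stochastic gradient ascent on J_n in the feature space, constants
% absorbed into the step size \eta:
\Omega_n = \Omega_{n-1} + \eta\,\exp\!\left(-\frac{e_n^{2}}{2\sigma^{2}}\right) e_n\,\varphi(u_n)
```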
“…Maximum correntropy criterion (MCC) has been applied to obtain robust methods for adaptive systems in supervised learning [8], [9]. Using MCC, a system can be adapted in such a way that a similarity measure between desired and predicted signals is maximized.…”
Section: Introduction
confidence: 99%
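As a concrete reading of "similarity measure" in the excerpt above: the empirical Gaussian correntropy between two signals is just the average kernel evaluation of their pointwise differences. A minimal sketch; the function name and default bandwidth are our choices:

```python
import numpy as np

def correntropy(d, y, sigma=1.0):
    """Empirical correntropy between two signals: the mean Gaussian kernel
    of their pointwise differences (normalization constant dropped, since
    it does not affect the maximizer). Near-identical signals score close
    to 1; a gross outlier in d - y contributes almost nothing."""
    e = np.asarray(d, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e ** 2 / (2.0 * sigma ** 2))))
```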
“…In addition, the corresponding maximum correntropy criterion (MCC) has been used as a cost function to derive various robust nonlinear adaptive filtering algorithms, such as the kernel maximum correntropy (KMC) algorithm [36] and the kernel recursive maximum correntropy (KRMC) algorithm [27], which outperform classical second-order schemes such as KLMS and KRLS. However, the correntropic loss (C-Loss) is non-convex, and adaptation may converge slowly, especially when the initial value is far from the optimum.…”
Section: Introduction
confidence: 99%
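The non-convexity and slow start mentioned in this excerpt can be seen directly from the correntropic loss, C-Loss(e) ∝ 1 − exp(−e²/2σ²): its gradient vanishes for large |e|, so an initialization far from the optimum produces tiny updates. A small illustrative check (our code, not from the cited papers):

```python
import numpy as np

sigma = 1.0
e = np.array([0.1, 1.0, 5.0, 20.0])                # error magnitudes
c_loss = 1.0 - np.exp(-e ** 2 / (2 * sigma ** 2))  # correntropic loss
grad = np.exp(-e ** 2 / (2 * sigma ** 2)) * e / sigma ** 2  # dC-Loss/de
print(c_loss)  # saturates at 1 for large errors
print(grad)    # ~1.9e-5 at e = 5, ~0 at e = 20: flat far from the optimum
```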
“…The MCC aims at maximizing the similarity (measured by correntropy) between the model output and the desired response, so that the adaptive model is as close as possible to the unknown system. It has been shown that, in terms of stability and accuracy, the MCC is very robust with respect to impulsive noise [33][34][35][36][37][38][39]. Compared with traditional Hammerstein adaptive filtering algorithms based on the MSE criterion, the new algorithm can achieve better performance, especially in the presence of impulsive non-Gaussian noise.…”
Section: Introduction
confidence: 99%
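One way to see the robustness claim in this last excerpt is to compare the effective per-sample gain each criterion assigns to an impulsive outlier. A hedged sketch (a generic illustration under a Gaussian correntropy kernel, not the Hammerstein algorithm from the cited work):

```python
import numpy as np

sigma = 1.0
errors = np.array([0.2, -0.5, 0.3, 50.0])  # last sample: impulsive outlier
mse_gain = errors                                            # MSE step is proportional to e
mcc_gain = np.exp(-errors ** 2 / (2 * sigma ** 2)) * errors  # MCC step gain
print(mse_gain)  # the outlier dominates an MSE-driven update
print(mcc_gain)  # under MCC the outlier's contribution is ~0
```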