2018
DOI: 10.1109/tcsii.2017.2778038
Recursive Maximum Correntropy Learning Algorithm With Adaptive Kernel Size

Cited by 25 publications (14 citation statements). References 34 publications.
“…It indeed determines the magnitude of the weights assigned to each error sample and it is a function of error. Optimizing this bandwidth has been widely addressed in previous work, for instance by minimizing Kullback-Leibler divergence between the true and estimated error distribution, using shape of error distribution measured by its kurtosis, using instantaneous error in each iteration, changing the Gaussian kernel, using hybrid methods and so forth [34,35,36,37,38,39,40,41].…”
Section: A. Overview of MCC and MEE
confidence: 99%
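The excerpt above describes the Gaussian kernel bandwidth as the quantity that weights each error sample, as a function of that error. A minimal sketch of this weighting, with a hypothetical per-iteration bandwidth rule in the spirit of the instantaneous-error schemes cited (the function names and the floor value `sigma_min` are illustrative assumptions, not taken from any referenced paper):

```python
import numpy as np

def correntropy_weight(e, sigma):
    """Gaussian kernel weight assigned to an error sample e.

    Large errors (outliers) receive exponentially small weight,
    which is what makes correntropy-based criteria robust.
    """
    return np.exp(-e ** 2 / (2.0 * sigma ** 2))

def adaptive_sigma(e, sigma_min=0.5):
    """Hypothetical rule: tie the kernel size to the instantaneous
    error magnitude, with a floor to avoid a degenerate bandwidth."""
    return max(sigma_min, abs(e))

sigma = 1.0
print(correntropy_weight(0.1, sigma))   # near 1: small errors keep full weight
print(correntropy_weight(10.0, sigma))  # near 0: outliers are suppressed
```

This makes the robustness mechanism concrete: the update contribution of an error sample decays exponentially with its squared magnitude, so impulsive outliers are effectively ignored.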
“…Therefore, some robustness criteria have been proposed and successfully applied to adaptive filtering algorithms to deal with adaptive signal problems under impulsive noise, such as adaptive wireless channel tracking [11] and blind source decomposition [12]. Some typical robustness criteria include maximum correntropy criterion (MCC) [13][14][15], minimum error entropy (MEE) [16][17][18], generalized MCC [19] and minimum kernel risk-sensitive loss criterion [20]. They are insensitive to large outliers, which improves the performance under impulsive noise.…”
Section: Introduction
confidence: 99%
“…When impulsive noise appears, by incorporating the step-size scaler into the update term, a robust subband algorithm was developed [13]. The correntropy measures the similarity between two variables, which is helpful for suppressing large outliers; thus, the maximum correntropy criterion (MCC) has been used for improving the anti-jamming capability of adaptive filters to impulsive noise, yielding the GD-based MCC [14]-[16] and recursive MCC (RMCC) algorithms [17], [18]. However, these robust recursive algorithms also have a high complexity of O(M²) ops.…”
Section: Introduction
confidence: 99%
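For contrast with the O(M²)-per-update recursive algorithms mentioned in the excerpt above, a gradient-based MCC filter costs only O(M) per sample: it is an LMS-style update whose error term is scaled by the Gaussian correntropy weight. A minimal sketch (the function name `mcc_lms`, the step size, and the fixed kernel size are illustrative assumptions, not the algorithm of any specific cited paper):

```python
import numpy as np

def mcc_lms(x, d, M=4, mu=0.05, sigma=1.0):
    """Gradient-based MCC adaptive filter (sketch of 'GD-based MCC').

    Per-iteration cost is O(M), unlike recursive MCC (RMCC),
    whose covariance-style update costs O(M^2).
    """
    w = np.zeros(M)
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]            # regressor, newest sample first
        e = d[n] - w @ u                         # a priori error
        g = np.exp(-e ** 2 / (2 * sigma ** 2))   # correntropy weight: shrinks for outliers
        w += mu * g * e * u                      # weighted LMS-style update
    return w

# Identify a known FIR channel from noisy input/output data.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(2000)
d = np.convolve(x, w_true)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = mcc_lms(x, d)
```

The only change relative to plain LMS is the scalar factor `g`, which vanishes for impulsive errors; this is what gives the GD-based MCC family its robustness at no extra asymptotic cost.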