2022
DOI: 10.1016/j.sigpro.2021.108410
A kernel recursive minimum error entropy adaptive filter

Cited by 22 publications (5 citation statements)
References 18 publications
“…This gives an accurate idea of the expected result. Because of the way LMS operates, there is no correlation between how ML is learned in theory and how it is learned in practice [67]. These theories aim for convergence, which occurs when repeated learning leads to a single outcome rather than many.…”
Section: Least-Mean-Square (LMS)
confidence: 99%
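The LMS rule the quoted passage discusses is a stochastic-gradient update, w ← w + μ·e[n]·x[n]. A minimal sketch (the filter length, step size μ, and the toy system h are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """Least-mean-square adaptive filter.

    x: input signal, d: desired signal, mu: step size.
    Returns the a priori error sequence and the final weight vector.
    """
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # most recent sample first
        e[n] = d[n] - w @ u                # a priori error
        w += mu * e[n] * u                 # stochastic-gradient update
    return e, w

# toy system identification: recover an FIR response h from noisy data
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])        # "unknown" system (assumed)
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
e, w = lms_filter(x, d)                    # w converges toward h
```

Convergence in the quote's sense corresponds to w settling near h, so the residual error power approaches the observation-noise floor.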
“…where C_KRMEE and C_QKRMEE are the computational complexities of one cycle of the KRMEE and QKRMEE algorithms; C_com represents the computational complexity of the formulas that take the same form in both algorithms; and C_{θ,KRMEE} and C_{θ,QKRMEE} are the computational complexities of φ_L in Eq. (17) [27] and θ_{L,S} in Eq. (32). The difference C_{d,MEE} in computational complexity between the KRMEE and QKRMEE algorithms can be expressed as…”
Section: Computational Complexity
confidence: 99%
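The complexity gap between KRMEE and its quantized variant stems largely from dictionary growth: quantized kernel filters merge each new input with its nearest stored center unless it lies farther than a threshold, which bounds the dictionary and hence the per-cycle cost. A minimal sketch of that standard online quantization rule (the threshold name eps_q and the 2-D toy inputs are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def quantize_input(u, dictionary, eps_q=0.3):
    """Online vector quantization as used by quantized kernel filters.

    A new input u joins the dictionary only if it is farther than eps_q
    from every existing center; otherwise it is merged with the nearest
    center, bounding dictionary size and hence per-cycle complexity.
    Returns (index of the matched/created center, whether it was added).
    """
    u = np.asarray(u, dtype=float)
    if not dictionary:
        dictionary.append(u)
        return 0, True
    dists = [np.linalg.norm(u - c) for c in dictionary]
    j = int(np.argmin(dists))
    if dists[j] > eps_q:
        dictionary.append(u)
        return len(dictionary) - 1, True
    return j, False

# feed 500 random 2-D inputs; the dictionary saturates well below 500
rng = np.random.default_rng(1)
D = []
added = sum(quantize_input(rng.uniform(-1, 1, 2), D)[1] for _ in range(500))
```

Because kernel evaluations are performed only against dictionary centers, capping the dictionary directly caps the per-cycle kernel cost.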
“…In the four aforementioned instances, the performance of the QKRGMEE and QKRMEE algorithms is compared with that of the KRLS [8], KRMC [35], KRMEE [27], and KRGMEE [28] algorithms. In Fig.…”
Section: Fig. 1 Convergence Curves Under Different Scenarios
confidence: 99%
“…Entropy [17]–[21] is a more universal adaptive filtering criterion, since it quantifies the average information inherent in a given PDF. The MSE criterion can be extended when information is used as the optimality criterion.…”
Section: Introduction
confidence: 99%
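Concretely, MEE-type criteria replace the squared error with the Parzen-estimated information potential V(e) = (1/N²) Σ_i Σ_j G_σ(e_i − e_j); minimizing Rényi's quadratic entropy H₂ = −log V is equivalent to maximizing V. A minimal sketch of the estimator (the Gaussian kernel bandwidth σ is an illustrative choice):

```python
import numpy as np

def information_potential(e, sigma=1.0):
    """Parzen (kernel) estimate of the quadratic information potential
    V(e) = (1/N^2) * sum_i sum_j G_sigma(e_i - e_j)."""
    e = np.asarray(e, dtype=float)
    diff = e[:, None] - e[None, :]     # all pairwise error differences
    g = np.exp(-diff**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    return g.mean()

def quadratic_entropy(e, sigma=1.0):
    """Renyi's quadratic entropy H2(e) = -log V(e)."""
    return -np.log(information_potential(e, sigma))

# a tightly concentrated error distribution has lower entropy than a
# widely spread one, which is why minimizing H2 concentrates the errors
rng = np.random.default_rng(0)
h_small = quadratic_entropy(0.1 * rng.standard_normal(500))
h_large = quadratic_entropy(2.0 * rng.standard_normal(500))
```

Unlike MSE, this cost depends on the full error distribution through its pairwise differences, which is what makes entropy criteria more universal for non-Gaussian noise.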