2012
DOI: 10.1016/j.sigpro.2012.04.007
Mean square convergence analysis for kernel least mean square algorithm

Cited by 43 publications (13 citation statements). References 14 publications.
“…The number of input vectors in the training set is n; hence, the computational complexity of the entropy calculation is O(kn). Similarly, the learning complexity of KLMS is also O(kn) [34,35]. Consequently, the complexity of E-KLMS is also O(kn).…”
Section: Analysis of Complexity
Confidence: 94%
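The O(kn) learning cost noted in the quote above follows from the structure of KLMS: every training input is retained as a kernel center, so each prediction (and hence each update) sums over all stored centers. A minimal sketch, with illustrative class and parameter names not taken from the cited papers:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class KLMS:
    """Minimal kernel least-mean-square sketch (names are illustrative)."""

    def __init__(self, eta=0.5, sigma=1.0):
        self.eta = eta          # step size
        self.sigma = sigma      # kernel width
        self.centers = []       # stored training inputs (grows with n)
        self.alphas = []        # coefficients eta * e(i)

    def predict(self, x):
        # Cost of one prediction is linear in the number of stored
        # centers, which is why n training steps cost O(kn) overall
        # (k = per-kernel-evaluation cost for one input pair).
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for c, a in zip(self.centers, self.alphas))

    def update(self, x, d):
        # Standard KLMS update: store the input as a new center with
        # coefficient eta * e(i), where e(i) is the prediction error.
        e = d - self.predict(x)
        self.centers.append(np.asarray(x, dtype=float))
        self.alphas.append(self.eta * e)
        return e

# Usage: learn a simple nonlinear mapping online.
rng = np.random.default_rng(0)
model = KLMS(eta=0.5, sigma=0.5)
inputs = rng.uniform(-1.0, 1.0, size=(200, 1))
errors = [abs(model.update(x, np.sin(3.0 * x[0]))) for x in inputs]
```

After 200 updates the dictionary holds 200 centers, and the average prediction error over the last samples is smaller than over the first ones, illustrating both the growing-network structure behind the O(kn) count and the convergent behavior the cited analysis addresses.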
“…Note that the Gaussian kernel is usually the default choice in ANF due to its universal approximation capability and numerical stability [20].…”
Section: B. Kernel Methods for Adaptive Filtering
Confidence: 99%
“…In this direction, we mention the works by Al-Naffouri and Sayed [11,13], whose approach is based on the fundamental energy conservation relation (ECR). Recently, this relation has been extended to RKHS for analyzing the mean-square convergence performance of kernel adaptive filters [12,14]. In this section, we study the mean-square convergence performance of the proposed KRMN algorithm, based on the ECR given in Lemma 1.…”
Section: Mean Square Convergence Analysis
Confidence: 99%
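For context, the energy conservation relation referenced in the quote above takes the following standard form in the linear-filter setting (the notation below follows the usual a priori/a posteriori error convention and is an assumption on my part, not reproduced from this paper or the citing one):

```latex
% Weight-error vector: \tilde{w}_i = w^o - w_i
% A priori error:      e_a(i) = u_i \tilde{w}_{i-1}
% A posteriori error:  e_p(i) = u_i \tilde{w}_i
\|\tilde{w}_i\|^2 + \frac{|e_a(i)|^2}{\|u_i\|^2}
  = \|\tilde{w}_{i-1}\|^2 + \frac{|e_p(i)|^2}{\|u_i\|^2}
```

The relation holds exactly at every iteration, without approximation, which is what makes it a convenient starting point for mean-square analyses; the extension to RKHS mentioned in the quote replaces the input vector with its feature-space image while preserving this energy balance.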