On the convergence behavior of the LMS and the normalized LMS algorithms
1993
DOI: 10.1109/78.236504

Cited by 420 publications (216 citation statements)
References 31 publications
“…Comparing (13) and the result in [10, Eq. 22], it can be found that they are identical except that all the integrals in [10] are coupled in their original forms.…”
Section: Mean Square Behavior
“…Consequently, the works in [9,10] only concentrated on certain special cases of eigenvalue distribution of the input autocorrelation matrix. In [12,13], particular or simplified input data model was introduced to facilitate the performance analysis so that useful analytical expressions can still be derived. In [14], the averaging principle was invoked to simplify the expectations involved in the difference equations.…”
Section: Introduction
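The statement above hinges on the eigenvalue distribution of the input autocorrelation matrix. As a concrete illustration (a sketch under assumed parameters, not an example from the cited works), the following snippet estimates that matrix for a correlated AR(1) input and reports its eigenvalue spread, the quantity that governs LMS convergence speed:

```python
# Illustrative sketch: eigenvalue spread of the input autocorrelation matrix
# for an AR(1) input. The pole, filter length, and sample count are assumed
# values chosen for the example, not taken from the cited papers.
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

rng = np.random.default_rng(0)
N, rho, order = 100_000, 0.9, 8          # samples, AR(1) pole, tap-vector length (assumed)

# AR(1) input: x[n] = rho * x[n-1] + v[n]
v = rng.standard_normal(N)
x = lfilter([1.0], [1.0, -rho], v)

# Estimate the autocorrelation lags r(0..order-1) and form the Toeplitz
# autocorrelation matrix R = E[x(n) x(n)^T] of the tap-input vector.
r = np.array([np.dot(x[:N - k], x[k:]) / (N - k) for k in range(order)])
R = toeplitz(r)

eigs = np.linalg.eigvalsh(R)              # ascending eigenvalues
print("eigenvalue spread lambda_max / lambda_min =", eigs[-1] / eigs[0])
```

A large spread (here driven by rho close to 1) is exactly the regime in which the cited analyses resort to special-case eigenvalue distributions, simplified input models, or the averaging principle.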
“…There has been research in the past focusing on the comparison between the LMS and the NLMS algorithms [22] - [24]. In 1993, Slock [24] studied the convergence behavior of both the algorithms and concluded that the NLMS algorithm is a potentially faster converging algorithm compared to the LMS algorithm.…”
Section: ε-NLMS Algorithm
“…In 1993, Slock [24] studied the convergence behavior of both the algorithms and concluded that the NLMS algorithm is a potentially faster converging algorithm compared to the LMS algorithm. However, faster convergence comes at a cost of high computational complexity.…”
Section: ε-NLMS Algorithm
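For reference, here is a minimal sketch of the two update rules being compared: plain LMS and ε-NLMS applied to system identification. The filter length, step sizes, and unknown system are illustrative assumptions, not values taken from Slock [24] or the citing paper.

```python
# Minimal sketch of LMS vs. epsilon-NLMS for identifying an unknown FIR system.
# All parameters below are assumed example values.
import numpy as np

rng = np.random.default_rng(1)
M, N = 8, 20_000                       # filter length, number of samples (assumed)
w_true = rng.standard_normal(M)        # unknown system to identify (assumed)

x = rng.standard_normal(N)             # white input; a correlated input widens the gap
d = np.convolve(x, w_true)[:N] + 1e-3 * rng.standard_normal(N)

def run(nlms, mu, eps=1e-6):
    w = np.zeros(M)
    for n in range(M, N):
        u = x[n - M + 1:n + 1][::-1]               # tap-input vector [x(n), ..., x(n-M+1)]
        e = d[n] - w @ u                            # a-priori error
        step = mu / (eps + u @ u) if nlms else mu   # normalization only for epsilon-NLMS
        w += step * e * u                           # LMS / epsilon-NLMS weight update
    return w

w_lms  = run(nlms=False, mu=0.01)
w_nlms = run(nlms=True,  mu=0.5)
print("LMS  misalignment:", np.linalg.norm(w_lms  - w_true))
print("NLMS misalignment:", np.linalg.norm(w_nlms - w_true))
```

The extra per-sample cost of ε-NLMS is the inner product `u @ u` and the division used for normalization, which is the added computational complexity the statement above refers to.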
“…However, for fixed schedules, the NLMS algorithm is known to trade off transient performance for asymptotic performance. Slock suggested an "optimal" step-size to be used with NLMS [12]. Following a heuristic analysis he concludes that NLMS performs the same as RLS as far as sensitivity to the eigenvalue spread is concerned.…”
Section: Introduction
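For context, the generic ε-NLMS recursion and the usual mean-square stability interval for its normalized step size are shown below; the particular "optimal" step size derived in [12] depends on the input statistics and is not reproduced here.

```latex
% Standard epsilon-NLMS recursion (generic form, not Slock's specific result):
\[
  \mathbf{w}(n+1) \;=\; \mathbf{w}(n)
  \;+\; \frac{\bar{\mu}}{\varepsilon + \lVert \mathbf{x}(n) \rVert^{2}}\, e(n)\,\mathbf{x}(n),
  \qquad 0 < \bar{\mu} < 2,
\]
% where e(n) = d(n) - w^T(n) x(n) is the a-priori estimation error.
```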