1990
DOI: 10.1109/31.62411
Analysis and design of a signed regressor LMS algorithm for stationary and nonstationary adaptive filtering with correlated Gaussian data

Cited by 100 publications (55 citation statements)
References 17 publications
“…It is customary to introduce simplifying assumptions that tend to lead to reasonable agreement between theory and practice [1][2][3]. Many contributions focus on a particular algorithm, making more or less restrictive assumptions on the input signal [4][5][6][7][8]. Obviously, a more general analysis encompassing as many different algorithms as possible as special cases, while making as few restrictive assumptions as possible, is highly desirable [9,10].…”
Section: Introduction (mentioning)
Confidence: 99%
“…The correction applied to the weight vector w(n) at iteration n+1 is "normalized" with respect to the squared Euclidean norm of the input vector x(n) at iteration n. We may view the NLMS algorithm as a time-varying step-size algorithm, calculating the convergence factor μ as in Eq. (4) [7].…”
Section: NLMS Algorithm (mentioning)
Confidence: 99%
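The excerpt above describes the NLMS update, in which the LMS correction is divided by the squared Euclidean norm of the current input vector, giving an effectively time-varying step size. A minimal NumPy sketch of that update, using a hypothetical `nlms_update` helper and an illustrative noise-free system-identification loop (the regularization constant `eps` and the tap values are assumptions, not from the cited work):

```python
import numpy as np

rng = np.random.default_rng(0)

def nlms_update(w, x, d, mu=0.5, eps=1e-8):
    """One NLMS iteration: the correction to w(n) is normalized by the
    squared Euclidean norm of x(n), acting as a time-varying step size."""
    e = d - w @ x                          # a priori estimation error
    w = w + (mu / (eps + x @ x)) * e * x   # normalized weight update
    return w, e

# Illustrative noise-free system identification with assumed true taps.
w_true = np.array([1.0, -0.5, 0.25])
w = np.zeros(3)
for _ in range(300):
    x = rng.standard_normal(3)
    d = w_true @ x                         # desired signal (no noise)
    w, e = nlms_update(w, x, d)
```

With 0 < mu < 2 the normalized update contracts the weight-error vector at each step regardless of the input power, which is the practical appeal of NLMS over plain LMS for signals with varying energy.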
“…where μ(n) is a regularized step size with 0 < μ < 2. Substituting Q in the SRLMS [27][28] weight update equation with Q(d) leads to the NLMS, which is known as …”
Section: Sign Regressor LMS (SRLMS) Technique (mentioning)
Confidence: 99%
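The sign regressor LMS algorithm referenced here replaces the input vector in the LMS update with its elementwise sign, so each tap update needs only an addition or subtraction rather than a multiplication. A minimal sketch under that description (the step size, tap values, and `srlms_update` helper name are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def srlms_update(w, x, d, mu=0.02):
    """One sign-regressor LMS iteration: the regressor x(n) is replaced
    by sign(x(n)), reducing the tap update to add/subtract operations."""
    e = d - w @ x                  # a priori estimation error
    w = w + mu * e * np.sign(x)    # signed-regressor update
    return w, e

# Illustrative noise-free identification of an assumed two-tap system.
w_true = np.array([0.8, -0.3])
w = np.zeros(2)
for _ in range(5000):
    x = rng.standard_normal(2)
    d = w_true @ x
    w, e = srlms_update(w, x, d)
```

For zero-mean Gaussian inputs, E[sign(x) xᵀ] is a scaled identity, so the mean weight error still decays geometrically, at the cost of a somewhat slower convergence than plain LMS; this trade-off is the subject of the cited analysis.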