2016
DOI: 10.1007/s11071-016-3279-y

Design of fractional-order variants of complex LMS and NLMS algorithms for adaptive channel equalization

Cited by 35 publications (24 citation statements)
References 44 publications

“…The BB method, also known as the two-step gradient method, was initially proposed by Barzilai and Borwein (1988). It was later applied to solve various unconstrained optimization problems (Dai et al., 2006; Nesterov, 2013; Tan et al., 2016). The method has an improved convergence rate compared with the gradient method.…”
Section: The Proposed Concurrent MCMA and DD with BB Methods (The BB Method)
Citation type: mentioning, confidence: 99%
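
As a minimal sketch of the Barzilai-Borwein (BB) two-step gradient method referenced in the excerpt above, the following Python snippet applies the BB1 step size alpha_k = (s^T s)/(s^T y), with s = x_k - x_{k-1} and y = g_k - g_{k-1}, to a toy strongly convex quadratic. The quadratic test problem and the safeguard on the denominator are illustrative assumptions, not details taken from the cited papers.

    import numpy as np

    def bb_gradient_descent(grad, x0, n_iter=100, alpha0=1e-3):
        """Gradient descent with the BB1 step size
        alpha_k = (s^T s) / (s^T y), s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
        x_prev = x0
        g_prev = grad(x_prev)
        x = x_prev - alpha0 * g_prev          # plain gradient step to start
        for _ in range(n_iter):
            g = grad(x)
            s = x - x_prev
            y = g - g_prev
            denom = s @ y
            alpha = (s @ s) / denom if abs(denom) > 1e-12 else alpha0
            x_prev, g_prev = x, g
            x = x - alpha * g
        return x

    # Toy strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    A = M @ M.T + 20 * np.eye(20)
    b = rng.standard_normal(20)
    x_star = np.linalg.solve(A, b)
    x_bb = bb_gradient_descent(lambda x: A @ x - b, np.zeros(20))
    print("distance to optimum:", np.linalg.norm(x_bb - x_star))

Because each step reuses only the previous iterate and gradient, the extra cost over plain gradient descent is negligible, which is what makes the improved convergence rate attractive.
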
“…The MCMA changes the cost function of the CMA (Shah et al., 2017) from the real field to the complex field, and its cost function is:…”
Section: Blind Equalization
Citation type: mentioning, confidence: 99%
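
For context only: the excerpt truncates the equation, so the following is the form of the MCMA cost commonly given in the blind-equalization literature (an assumption about the standard definition, not the citing paper's exact expression). It splits the CMA criterion into the real and imaginary parts y_R(n), y_I(n) of the equalizer output:

\[
J_{\mathrm{MCMA}}(n) = E\bigl[(y_R^2(n) - R_{2,R})^2\bigr] + E\bigl[(y_I^2(n) - R_{2,I})^2\bigr],
\qquad
R_{2,R} = \frac{E[a_R^4(n)]}{E[a_R^2(n)]}, \quad
R_{2,I} = \frac{E[a_I^4(n)]}{E[a_I^2(n)]},
\]

where a_R(n) and a_I(n) denote the real and imaginary parts of the transmitted symbols.
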
“…In the basic LMS algorithm, the value of μ usually remains constant, which is not well suited to the iterative adaptation process. This gives rise to the normalized least mean square (NLMS) algorithm [26,27], an LMS algorithm with a variable step size.…”
Section: LMS Algorithm
Citation type: mentioning, confidence: 99%
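
As a minimal sketch of the point made in the excerpt, the snippet below contrasts the fixed-step LMS update with the NLMS update, whose effective step size mu/(delta + x^T x) changes with the input power at every iteration. The filter length, the regularization constant delta, and the toy identification setup are illustrative assumptions.

    import numpy as np

    def lms_update(w, x, d, mu):
        """One fixed-step LMS update: e = d - w^T x, w <- w + mu * e * x."""
        e = d - w @ x
        return w + mu * e * x, e

    def nlms_update(w, x, d, mu, delta=1e-6):
        """One NLMS update: the step is normalized by the input power x^T x,
        so the effective step size varies from iteration to iteration."""
        e = d - w @ x
        return w + (mu / (delta + x @ x)) * e * x, e

    # Toy adaptive identification of an unknown 8-tap filter
    rng = np.random.default_rng(1)
    h = rng.standard_normal(8)                    # "unknown" system
    w_lms, w_nlms = np.zeros(8), np.zeros(8)
    for _ in range(2000):
        x = rng.standard_normal(8)                # input regressor
        d = h @ x + 0.01 * rng.standard_normal()  # noisy desired signal
        w_lms, _ = lms_update(w_lms, x, d, mu=0.01)
        w_nlms, _ = nlms_update(w_nlms, x, d, mu=0.5)
    print("LMS  coefficient error:", np.linalg.norm(w_lms - h))
    print("NLMS coefficient error:", np.linalg.norm(w_nlms - h))

The normalization makes the NLMS step size dimensionless and keeps the update well scaled when the input power fluctuates, which is why it behaves like a variable-step-size LMS.
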
“…The least-mean-square (LMS) algorithm is widely used in many fields, such as system identification [1,2], echo cancellation [3], adaptive channel equalization [4], adaptive antenna arrays [5], and adaptive spectral line enhancement [6], owing to its good robustness, low computational complexity, and simple structure. However, it is well known that the convergence rate and steady-state error of a conventional LMS algorithm are directly related to its adaptation step size.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
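
A small numerical sketch of the step-size trade-off noted in the excerpt: with a fixed step size, a larger mu speeds up initial convergence but raises the steady-state error, while a smaller mu does the opposite. The toy system-identification setup, the noise level, and the two mu values below are illustrative assumptions.

    import numpy as np

    def lms_learning_curve(mu, n_samples=5000, taps=8, noise_std=0.1, seed=2):
        """Fixed-step LMS on a toy identification problem; returns the average
        squared error over the first 500 samples (convergence speed) and over
        the last 1000 samples (steady-state error)."""
        rng = np.random.default_rng(seed)
        h = rng.standard_normal(taps)          # "unknown" system to identify
        w = np.zeros(taps)
        sq_err = []
        for _ in range(n_samples):
            x = rng.standard_normal(taps)
            d = h @ x + noise_std * rng.standard_normal()
            e = d - w @ x
            w += mu * e * x
            sq_err.append(e * e)
        return np.mean(sq_err[:500]), np.mean(sq_err[-1000:])

    # Larger mu: faster initial convergence but a higher steady-state error.
    for mu in (0.005, 0.05):
        early, steady = lms_learning_curve(mu)
        print(f"mu={mu}: early MSE {early:.3f}, steady-state MSE {steady:.4f}")
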