2021
DOI: 10.1016/j.sigpro.2021.108276
Robust and sparsity-aware adaptive filters: A Review

Cited by 65 publications (5 citation statements)
References 293 publications
“…However, a PI algorithm using a fixed step size must trade off a fast convergence rate against a low steady-state MSE. Scholars have proposed many algorithms to reduce the steady-state error at manageable computational complexity [6–13]. Khaled Mayyas [6] proposed a VSS selective-partial-update LMS algorithm that reduces computational complexity by updating only a fraction of the adaptive filter coefficients during each iteration.…”
Section: Introduction
confidence: 99%
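The selective-partial-update idea quoted above can be sketched as follows. This is an illustrative implementation, not the exact algorithm from [6]: the function name, tap count, number of updated taps, and step size are all assumptions, and the tap-selection rule (largest-magnitude regressor entries) is one common choice.

```python
import numpy as np

def spu_lms(x, d, n_taps=4, n_update=2, mu=0.05):
    """Selective-partial-update LMS sketch: per iteration, only the
    n_update coefficients paired with the largest-magnitude regressor
    entries are adapted, cutting the per-step update cost versus
    full-update LMS."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]        # regressor, newest sample first
        e[k] = d[k] - w @ u                      # a-priori error
        idx = np.argsort(np.abs(u))[-n_update:]  # taps selected this iteration
        w[idx] += mu * e[k] * u[idx]             # partial LMS update
    return w, e
```

Because the selected subset rotates with the input, all taps are eventually adapted and the filter still converges, only somewhat more slowly than full-update LMS.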
“…Yao Jiang et al. [8] proposed an adaptive step-size approach that uses an improved particle swarm optimization (IPSO) algorithm to determine the key parameters of specific step-size adjustment strategies, but its computational complexity is high, making it unsuitable for an RF front end. Krishna Kumar et al. [9] introduced a family of VSS-LMS algorithms based on logarithmic, hyperbolic, and sigmoid cost functions, resulting in robust performance.…”
Section: Introduction
confidence: 99%
confidence: 99%
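The variable-step-size principle behind the VSS-LMS family cited above can be sketched with one representative sigmoid rule: the step size approaches its maximum when the error is large (fast initial convergence) and shrinks toward zero as the error dies out (low steady-state MSE). The specific sigmoid mapping, the parameters mu_max and alpha, and the function name are illustrative assumptions, not the exact rules from [9].

```python
import numpy as np

def sigmoid_vss_lms(x, d, n_taps=4, mu_max=0.1, alpha=4.0):
    """Sigmoid variable-step-size LMS sketch: mu(k) is a sigmoid
    function of |e(k)|, bounded in [0, mu_max)."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]
        e[k] = d[k] - w @ u
        # step size ~ mu_max for large |e|, -> 0 as |e| -> 0
        mu = mu_max * (2.0 / (1.0 + np.exp(-alpha * abs(e[k]))) - 1.0)
        w += mu * e[k] * u
    return w, e
```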
“…Typical examples include the lp-norm [9], M-estimation theory [10], and the information-theoretic learning (ITL) family [11]. More details on robust adaptive signal processing schemes can be found in a review article [12]. Within the ITL family, because the minimization of error entropy (MEE) [13] and the maximum correntropy criterion (MCC) [14, 15] capture all even-order moment information of the error signal, they are widely used in robust signal processing and machine learning.…”
Section: Introduction
confidence: 99%
confidence: 99%
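The robustness mechanism of the MCC mentioned above can be sketched with a stochastic-gradient MCC adaptive filter: a Gaussian kernel weights each update by exp(-e²/(2σ²)), so outlier errors produce near-zero updates and impulsive noise is effectively gated out. The function name, kernel bandwidth sigma, and step size are illustrative assumptions; this is the standard MCC gradient form, not code from [14, 15].

```python
import numpy as np

def mcc_lms(x, d, n_taps=4, mu=0.1, sigma=1.0):
    """MCC-based adaptive filter sketch: the Gaussian kernel weight
    g = exp(-e^2 / (2*sigma^2)) scales the LMS update toward zero for
    large (outlier) errors, giving robustness to impulsive noise."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]
        e[k] = d[k] - w @ u
        g = np.exp(-e[k] ** 2 / (2.0 * sigma ** 2))  # kernel weight
        w += mu * g * e[k] * u
    return w, e
```

For well-modeled samples g is close to 1 and the update behaves like ordinary LMS; a 50-sigma impulse gives g essentially equal to zero, so the filter simply skips it.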
“…In NLMS, the step size is normalized by the l2-norm of the input signal vector, which in LMS can fluctuate substantially [12]. Numerous sparsity-aware adaptive filtering algorithms, such as proportionate NLMS (PNLMS) [11] and the zero-attracting (ZA) algorithms [13, 14], have been proposed to exploit this sparsity.…”
Section: Introduction
confidence: 99%
confidence: 99%
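The zero-attracting idea referenced above can be sketched with the classic ZA-LMS form: the standard LMS update plus an l1-penalty gradient term, -rho*sign(w), that pulls inactive coefficients toward zero and so speeds convergence on sparse systems. The function name, tap count, step size, and attractor strength rho are illustrative assumptions, not values from [13, 14].

```python
import numpy as np

def za_lms(x, d, n_taps=16, mu=0.05, rho=5e-4):
    """Zero-attracting LMS sketch: LMS update plus a sign(w) zero
    attractor (the subgradient of an l1 penalty on the weights)."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]
        e[k] = d[k] - w @ u
        w += mu * e[k] * u - rho * np.sign(w)  # LMS term + zero attractor
    return w, e
```

The attractor introduces a small bias on the order of rho/mu in the active taps, which is the usual price for the faster convergence of the near-zero taps.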