2020
DOI: 10.1109/lsp.2020.3017106
Nearest Kronecker Product Decomposition Based Generalized Maximum Correntropy and Generalized Hyperbolic Secant Robust Adaptive Filters

Cited by 45 publications (10 citation statements). References 24 publications.
“…Bernoulli process with a success probability of p_q, and a(n) is a zero-mean Gaussian white noise process. The MKRHS algorithm is compared with various kernel adaptive filtering algorithms such as KLMS, MKRSL, KGHSF [17], P-KLLAD [18], KMCC, and KGMC. The optimal parameter selections for each algorithm are presented in Table 1.…”
Section: Nonlinear System Identification
confidence: 99%
“…The incorporation of the hyperbolic secant function into the adaptive filtering algorithm has demonstrated improved filter performance, including lower steady-state error, enhanced robustness, and better convergence [17]. This study integrates the hyperbolic secant function with the Minimum Kernel Risk-Sensitive Loss algorithm, aiming to augment the efficacy of KAF algorithms in non-Gaussian impulsive noise settings. Concurrently, to further reduce the computational complexity of the algorithm, the vector quantization method is used to suppress the growth of its network size.…”
Section: Introduction
confidence: 99%
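The bounded error nonlinearity behind the hyperbolic-secant cost mentioned above can be illustrated with a minimal LMS-style weight update. This is only a sketch under assumed conventions: the cost form J(e) = 1 − sech(λe), the function name `ghsf_update`, and the parameter values are assumptions for illustration, not the exact KGHSF or MKRHS formulation from the cited papers.

```python
import numpy as np

def ghsf_update(w, x, d, mu=0.05, lam=2.0):
    """One LMS-style update with an assumed hyperbolic-secant cost.

    For J(e) = 1 - sech(lam * e), the gradient with respect to w is
    -lam * sech(lam * e) * tanh(lam * e) * x. The error nonlinearity
    is bounded, so large (impulsive) errors are down-weighted.
    """
    e = d - w @ x                                   # a priori error
    g = lam * np.tanh(lam * e) / np.cosh(lam * e)   # bounded nonlinearity
    return w + mu * g * x, e
```

Because the nonlinearity saturates, a single outlier sample moves the weights by at most mu * lam * ||x||, unlike plain LMS where the step grows with the error.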
“…The correntropy or the logarithmic hyperbolic function is considered a candidate for the cost function in the switching algorithm for super-Gaussian noises due to their robustness [37]. In an endeavor to achieve lower steady-state misalignment, a generalized hyperbolic secant function was proposed as a robust norm, from which the generalized hyperbolic secant adaptive filter was derived [38]. To address both Gaussian and non-Gaussian noises with a uniform expression, Liu and colleagues [39] proposed a novel HTCC algorithm by combining a nonlinear function and a mapping mode.…”
Section: Introduction
confidence: 99%
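The correntropy criterion referenced in the statement above can likewise be sketched as a weighted LMS update. This is a generic MCC illustration, assuming a Gaussian kernel on the instantaneous error; the function name and parameter defaults are hypothetical, not taken from the cited works.

```python
import numpy as np

def mcc_update(w, x, d, mu=0.1, sigma=1.0):
    """One maximum correntropy criterion (MCC) update (illustrative).

    The correntropy cost exp(-e^2 / (2*sigma^2)) is maximized, so the
    ascent direction scales the usual LMS term e * x by a Gaussian
    weight that vanishes for large |e|, rejecting impulsive outliers.
    """
    e = d - w @ x
    g = np.exp(-e**2 / (2.0 * sigma**2))  # Gaussian weight in (0, 1]
    return w + mu * g * e * x, e
```

For small errors g ≈ 1 and the update reduces to plain LMS; sigma controls how aggressively large errors are suppressed.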
“…The algorithms combine the logarithmic hyperbolic cosine (LHC) as the cost function for nonlinear system identification 11, 12. Building on the logarithmic hyperbolic cosine (LHC) cost function, novel cost functions such as the exponential hyperbolic cosine function (EHCF) 13 and the generalized hyperbolic secant 14 have been proposed to form adaptive filters. The above SAF-type algorithms are carried out in the time domain.…”
Section: Introduction
confidence: 99%
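The log-cosh (LHC) cost mentioned above also admits a compact sketch; its gradient is simply tanh, which is what makes it robust. The cost form J(e) = log(cosh(a·e))/a and the function name are assumptions for illustration, not the cited papers' exact formulation.

```python
import numpy as np

def lhc_update(w, x, d, mu=0.1, a=1.0):
    """LMS-style update with a logarithmic hyperbolic cosine cost.

    J(e) = log(cosh(a*e)) / a behaves like a quadratic for small errors
    and like |e| for large ones, so its gradient nonlinearity tanh(a*e)
    is bounded and tolerant of impulsive noise.
    """
    e = d - w @ x
    return w + mu * np.tanh(a * e) * x, e
```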
“…However, the frequency domain spline adaptive filtering is derived by minimising the squared value of the instantaneous error, and is therefore unable to suppress non-Gaussian impulsive noises. The robustness of the maximum correntropy criterion (MCC) has been demonstrated when combined with adaptive filtering 14, 16–18 and with spline adaptive filtering 19–21. To suppress non-Gaussian impulsive noises while retaining comparable operation time, a frequency domain maximum correntropy criterion spline adaptive filter (FDSAF-MCC) is developed in this paper.…”
Section: Introduction
confidence: 99%