2021
DOI: 10.1109/tsipn.2020.3046217

Adaptive Graph Filters in Reproducing Kernel Hilbert Spaces: Design and Performance Analysis


Cited by 24 publications (11 citation statements)
References 63 publications (91 reference statements)
“…By taking the limit of the expected absolute value of (22) as k → ∞ and using the same approximation for R_p and R_w in Section 4.1, the spectral-domain absolute error update can be expressed as…”
Section: Mean-absolute Deviations Stability Analysis Under Steady-sta…
confidence: 99%
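The quoted analysis characterises the limit of the expected absolute error as k → ∞ in closed form. As a rough illustration of the quantity involved, here is a minimal Monte-Carlo sketch that estimates the steady-state mean-absolute deviation of a generic LMS-type adaptive filter; the filter model, step size, and noise level are illustrative assumptions, not the graph-filter model of equation (22) in the citing paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma_v = 8, 0.05, 0.1        # filter length, step size, noise std (assumed)
n_iter, n_runs = 2000, 200           # iterations per run, Monte-Carlo runs
w_true = rng.standard_normal(n)      # unknown system to identify

mad = np.zeros(n_iter)               # running sum of |e_k| across runs
for _ in range(n_runs):
    w = np.zeros(n)
    for k in range(n_iter):
        x = rng.standard_normal(n)                        # regressor at time k
        d = x @ w_true + sigma_v * rng.standard_normal()  # noisy reference signal
        e = d - x @ w                                     # a-priori error
        w += mu * e * x                                   # LMS weight update
        mad[k] += abs(e)
mad /= n_runs                        # Monte-Carlo estimate of E[|e_k|]

# The tail of `mad` approximates the steady-state expected absolute error,
# i.e. the k -> infinity limit that the quoted analysis expresses analytically.
print(f"steady-state MAD estimate: {mad[-200:].mean():.4f}")
```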
“…Recent attention has focused on distributed versions of GSP algorithms: [19, 20] merge the distributed GLMS algorithm with diffusion algorithms, and [21] gives a distributed version of the GRLS algorithm. Similarly, a distributed kernel GLMS was introduced in [22], extending the linear graph-adaptive algorithm to a nonlinear version. Recent studies have also applied classical time-series analysis techniques, such as ARMA models, to GSP [23].…”
Section: Introduction
confidence: 99%
“…Our analysis framework can be helpful for other algorithms that have a (shift-invariant-)kernel-based learning model. For example, GKLMS-RFF in [20] has the same learning model, but uses nodal-value time series filtered by a known graph filter as the model input. Our analysis can explicitly point out how to configure the model in the GKLMS-RFF algorithm.…”
Section: Extension Of the Analysis
confidence: 99%
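GKLMS-RFF runs its kernel LMS update in an explicit random-Fourier-feature (RFF) space rather than the implicit RKHS. Below is a minimal sketch of the RFF approximation such a model builds on; the input dimension, number of features, kernel width, and the target value in the LMS step are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 4, 500, 1.0   # input dim, number of features, kernel width (assumed)

# Sample the spectral measure of the Gaussian kernel and random phases.
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def z(x):
    """Explicit RFF map: z(x) @ z(y) approximates exp(-||x-y||^2 / (2 sigma^2))."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
print(f"kernel {exact:.4f}  vs  RFF approximation {z(x) @ z(y):.4f}")

# A kernel LMS step then reduces to ordinary LMS on the feature vector z(x),
# which is what makes the RFF variant cheap enough for online/graph settings:
theta, mu = np.zeros(D), 0.1
e = 1.0 - theta @ z(x)      # error against a hypothetical target of 1.0
theta += mu * e * z(x)      # LMS update in the finite-dimensional RFF space
```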
“…Among the many kinds of shift-invariant kernels [15], e.g., Gaussian kernels, Laplacian kernels, and Cauchy kernels, we will focus on Gaussian kernels. Since the kernel models how similarity changes with the difference between inputs, and the laws of large numbers suggest the wide applicability of the Gaussian distribution, it is intuitive to use Gaussian kernels in most situations [16], [17], [20]. For this reason, in this paper we discuss SKG with a Gaussian kernel in detail.…”
Section: Introduction
confidence: 99%
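For concreteness, here is a minimal sketch of the three shift-invariant kernels the quote names; each depends only on the difference x − y, and the bandwidth parameters are illustrative assumptions.

```python
import numpy as np

def gaussian(x, y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def laplacian(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||_1)
    return np.exp(-gamma * np.sum(np.abs(x - y)))

def cauchy(x, y, sigma=1.0):
    # k(x, y) = prod_i 1 / (1 + (x_i - y_i)^2 / sigma^2)
    return np.prod(1.0 / (1.0 + (x - y) ** 2 / sigma ** 2))

x, y = np.array([0.0, 1.0]), np.array([0.5, 0.5])
for k in (gaussian, laplacian, cauchy):
    print(k.__name__, k(x, y))
```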
“…Typically, state-of-the-art techniques address the case where both reference and target signals share the same graph. In the context of linear system modeling, different learning problems have been studied within the GSP framework, e.g., classification on graphs [19], autoregressive models for graph signal prediction [15], [16], [20], dictionary learning [21], and distributed adaptive filtering [8]-[14]. Several learning strategies have been proposed for the nonlinear setting as well.…”
Section: Introduction
confidence: 99%