2017 25th European Signal Processing Conference (EUSIPCO)
DOI: 10.23919/eusipco.2017.8081496

Optimal sampling strategies for adaptive learning of graph signals

Abstract: The aim of this paper is to propose optimal sampling strategies for adaptive learning of signals defined over graphs. Introducing a novel least mean square (LMS) estimation strategy with probabilistic sampling, we propose two different methods to select the sampling probability at each node, with the aim of optimizing the sampling rate, or the mean-square performance, while at the same time guaranteeing a prescribed learning rate. The resulting solutions naturally lead to sparse sampling probability v…

Cited by 6 publications (4 citation statements); references 26 publications.
“…In fact, many papers consider the use of optimal sampling strategies that minimize specific metrics of the reconstructed GS [10]-[11]. Particularly, this work employs online recovery strategies for reconstructing the original GS from its sampled noisy version, which can be seen as a time-varying interpolation matrix Φ[k], and a few optimal sampling strategies have already been proposed in this context [14], [19], even covering time-varying probabilistic sampling [16], [18]-[19].…”
Section: Graph Signal Processing
confidence: 99%
“…In addition, it is noticeable that the proposed NLMS update equation in (30) resembles the RLS long-term expression in (16), thus indicating that the inclusion of M brings about some RLS-like features to the resulting algorithm. Particularly, when the covariance matrix is given by C_w = σ_w² I, with σ_w² > 0, and S[k] = S, both algorithms present an equivalent performance for large k if the NLMS convergence factor μ_N and the RLS forgetting factor β_R follow the relation…”
Section: Steady-State FoM Analysis
confidence: 99%
“…Particularly, for graph signal reconstruction [5], [16], the LMS algorithm is designed to minimize the mean-square sampled deviation, i.e., min.…”
Section: A. LMS Estimation
confidence: 99%
“…By using a stochastic gradient approach for the minimization problem in (11), one finds an update expression for s_F[k + 1]. Moreover, using the IGFT in (2), one can easily find an estimate x̂_o[k] for the bandlimited graph signal x_o[k] in (3), which corresponds to the LMS update equation [16]…”
Section: A. LMS Estimation
confidence: 99%
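The citation statements above describe an LMS iteration for recovering a bandlimited graph signal: project the sampled estimation error onto the signal's bandlimited subspace and take a small step. A minimal sketch of one such update is given below; the function name, graph size, and step size are illustrative assumptions, not taken from the paper, and the bandlimited projector B = U_F U_F^H follows the standard graph-LMS formulation.

```python
import numpy as np

def lms_graph_step(x_hat, y, sample_mask, U_F, mu=0.5):
    """One LMS iteration for bandlimited graph-signal recovery (sketch).

    x_hat       : current estimate of the graph signal, shape (N,)
    y           : noisy observation at this time step, shape (N,)
    sample_mask : boolean vector, True where the node is sampled, shape (N,)
    U_F         : (N, |F|) GFT basis vectors spanning the signal's band
    mu          : step size (convergence factor)
    """
    # Projector onto the bandlimited subspace spanned by U_F.
    B = U_F @ U_F.conj().T
    # Diagonal sampling operator: zeroes the error at unsampled nodes.
    D = np.diag(sample_mask.astype(float))
    # LMS update: step along the projected, sampled error.
    return x_hat + mu * (B @ (D @ (y - x_hat)))
```

With full, noise-free sampling and a signal lying in the span of U_F, the estimation error contracts by a factor (1 - mu) per iteration, so repeated application converges to the true signal.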