2021
DOI: 10.1109/tsp.2020.3041089
CPGD: Cadzow Plug-and-Play Gradient Descent for Generalised FRI

Abstract: Finite rate of innovation (FRI) is a powerful reconstruction framework enabling the recovery of sparse Dirac streams from uniform low-pass filtered samples. An extension of this framework, called generalised FRI (genFRI), has been recently proposed for handling cases with arbitrary linear measurement models. In this context, signal reconstruction amounts to solving a joint constrained optimisation problem, yielding estimates of both the Fourier series coefficients of the Dirac stream and its so-called annihila…
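For context, the classical FRI recovery step the abstract builds on can be sketched as a Prony-style annihilating-filter computation. The following is a minimal illustration, not the paper's CPGD algorithm: it assumes noiseless Fourier series coefficients of a periodic Dirac stream and a known number of Diracs K, and recovers the Dirac locations from the roots of the annihilating filter.

```python
import numpy as np

# Sketch of classical annihilating-filter FRI recovery (Prony-style).
# The stream x(t) = sum_k a_k * delta(t - t_k) on [0, 1) has Fourier
# series coefficients X[m] = sum_k a_k * exp(-2j*pi*m*t_k); a filter h
# whose roots are exp(-2j*pi*t_k) annihilates X by convolution.

K = 2                                  # number of Diracs (assumed known)
t_true = np.array([0.2, 0.7])          # locations on [0, 1)
a_true = np.array([1.0, 0.5])          # amplitudes

m = np.arange(-K, K + 1)               # 2K+1 coefficients suffice
X = (a_true * np.exp(-2j * np.pi * m[:, None] * t_true)).sum(axis=1)

# Each row of T is a reversed sliding window of X, so that
# T @ h = 0 encodes the annihilation equations sum_l h[l] X[m-l] = 0.
T = np.array([X[i:i + K + 1][::-1] for i in range(len(X) - K)])
_, _, Vh = np.linalg.svd(T)
h = Vh[-1].conj()                      # annihilating filter = null vector of T

# Dirac locations from the filter roots, which lie at exp(-2j*pi*t_k).
t_est = np.sort(np.mod(-np.angle(np.roots(h)) / (2 * np.pi), 1.0))
```

In the noiseless case the Toeplitz matrix T has an exact one-dimensional null space and `t_est` matches `t_true` to machine precision; with noisy or generalised measurements this step breaks down, which is the situation the genFRI/CPGD formulation addresses.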


Cited by 25 publications (5 citation statements); References 42 publications.
“…where ⌊x⌋ ∈ Z is the largest integer smaller than or equal to x. To show this, it suffices to apply (D + αId) on both sides of (53). Similar formulas can be obtained for any N ≥ 1.…”
Section: Polynomial Splines
confidence: 94%
“…Optimization in periodic function spaces. Several works for Dirac recovery have been developed over the torus, and are therefore tailored for periodic Dirac streams [18,20,23,24,26,32,53]. Contrarily to the non periodic setting, the extension to arbitrary periodic functions has only received a limited attention so far, mostly in the works of Simeoni [43,44].…”
Section: Comparison With Previous Work
confidence: 99%
“…The objective is to find the parameters that minimize the empirical risk. To do so, the Fast Gradient Method (FGM) (Florea & Vorobyov, 2020) and the Projection Gradient Method (PGD) (Simeoni et al, 2021) are selected as the adversarial training methods:…”
Section: How To Train Text With Dynamic Interference
confidence: 99%
“…Higher the σ, more is the jitter. We consider reconstruction using GenFRI-TEM (Algorithm 1), FRI-TEM [24] and Cadzow Plug-and-Play Gradient Descent (CPGD) [36] techniques. The…”
Section: B. Recovery From Noisy Measurements
confidence: 99%