ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9413451

Sparsity Driven Latent Space Sampling for Generative Prior Based Compressive Sensing

Abstract: We address the problem of recovering signals from compressed measurements based on generative priors. Recently, generative-model-based compressive sensing (GMCS) methods have shown superior performance over traditional compressive sensing (CS) techniques in recovering signals from fewer measurements. However, it is possible to further improve the performance of GMCS by introducing controlled sparsity in the latent space. We propose a proximal meta-learning (PML) algorithm to enforce sparsity in the latent space …
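The abstract's central ingredient is a proximal step that imposes sparsity on the latent code of the generator. Since the truncated text above does not specify the exact operator used by PML, the sketch below only illustrates two standard sparsity-inducing operators that such a step could use: soft thresholding (the proximal operator of the ℓ1 norm) and hard thresholding (projection onto s-sparse vectors). All names, dimensions, and parameters are illustrative assumptions, not code from the paper.

```python
import numpy as np

def soft_threshold(z: np.ndarray, lam: float) -> np.ndarray:
    """Proximal operator of lam * ||z||_1 (soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def hard_threshold(z: np.ndarray, s: int) -> np.ndarray:
    """Projection onto {z : ||z||_0 <= s}: keep the s largest-magnitude entries."""
    out = np.zeros_like(z)
    if s > 0:
        idx = np.argpartition(np.abs(z), -s)[-s:]
        out[idx] = z[idx]
    return out

z = np.random.default_rng(0).standard_normal(64)   # a latent code
print(np.count_nonzero(soft_threshold(z, 0.8)))    # sparsity level depends on lam
print(np.count_nonzero(hard_threshold(z, 8)))      # exactly 8 nonzero entries
```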

Cited by 4 publications (4 citation statements) | References 16 publications
“…For example, Dhar et al. [43] proposed a framework that allows a sparse deviation from the range set of the generator. Killedar et al. [44] proposed to induce sparsity in the latent space when training the generator. In our case, the sparse latent representation of cracks in an image is automatically achieved by the existing generators taken from the literature.…”
Section: Robust Reconstruction by ℓ1-Norm Regularization
confidence: 99%
“…The sparsity constraint on the latent representation z causes the range-space of the generator network G_{θ_Q} to be composed of a union of manifolds [15, 16]. The optimization of z involves minimizing L_G while satisfying the feasibility condition ‖z‖₀ ≤ s. This approach is referred to as sparsity-driven latent-space sampling (SDLSS).…”
Section: Quantized Generative Models as an Image…
confidence: 99%
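The statement above summarizes the latent-space optimization: minimize the measurement loss L_G while keeping the latent code s-sparse. A minimal sketch of such a projected-gradient (iterative hard thresholding) update is given below, assuming a toy linear generator G(z) = W z, a Gaussian sensing matrix A, and L_G(z) = ‖A G(z) − y‖²; it illustrates the constraint only and is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m, s = 128, 64, 40, 5              # signal dim, latent dim, measurements, sparsity
W = rng.standard_normal((n, k))          # stand-in linear "generator": G(z) = W @ z
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sensing matrix
z_true = np.zeros(k)
z_true[rng.choice(k, s, replace=False)] = rng.standard_normal(s)
y = A @ W @ z_true                       # compressed measurements of x = G(z_true)

def project_sparse(z: np.ndarray, s: int) -> np.ndarray:
    """Keep the s largest-magnitude entries of z (enforces ||z||_0 <= s)."""
    out = np.zeros_like(z)
    idx = np.argpartition(np.abs(z), -s)[-s:]
    out[idx] = z[idx]
    return out

M = A @ W                                # composite operator acting on the latent code
lr = 1.0 / np.linalg.norm(M, 2) ** 2     # step size from the spectral norm of M
z = np.zeros(k)
for _ in range(1000):
    grad = M.T @ (M @ z - y)             # gradient of (1/2) * ||M z - y||^2
    z = project_sparse(z - lr * grad, s) # projected (proximal) gradient step

print("relative measurement residual:", np.linalg.norm(M @ z - y) / np.linalg.norm(y))
```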
“…Generative models for solving inverse problems: Generative models have been used to solve inverse problems [2, 4, 6, 12-16, 33] and have been shown to outperform traditional optimization methods using a small number of measurements. Bora et al. enforced the constraint that the signal x lies in the range-space of the pre-trained generator network G_θ [4].…”
Section: Introduction
confidence: 99%
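As a point of reference for the baseline described in the statement above (Bora et al. [4]), the sketch below constrains the estimate to the range of a generator G_θ and minimizes ‖A G(z) − y‖² over the latent code z by gradient descent. The small untrained two-layer generator and all dimensions are stand-ins chosen here for illustration; a real application would use a pre-trained G_θ.

```python
import torch

torch.manual_seed(0)
n, k, m = 256, 32, 64                   # signal dim, latent dim, measurements
G = torch.nn.Sequential(                # stand-in for a pre-trained generator G_theta
    torch.nn.Linear(k, 128), torch.nn.ReLU(), torch.nn.Linear(128, n)
)
A = torch.randn(m, n) / m ** 0.5        # Gaussian sensing matrix
x_true = G(torch.randn(k)).detach()     # a signal lying in the generator's range
y = A @ x_true                          # compressed measurements

z = torch.zeros(k, requires_grad=True)  # latent code to be optimized
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = torch.sum((A @ G(z) - y) ** 2)   # measurement loss ||A G(z) - y||^2
    loss.backward()
    opt.step()

x_hat = G(z).detach()
print("relative reconstruction error:",
      (torch.norm(x_hat - x_true) / torch.norm(x_true)).item())
```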