2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2018.8462233
Solving Linear Inverse Problems Using GAN Priors: An Algorithm with Provable Guarantees

Abstract: In recent works, both sparsity-based methods and learning-based methods have proven to be successful in solving several challenging linear inverse problems. However, sparsity priors for natural signals and images suffer from poor discriminative capability, while learning-based methods seldom provide concrete theoretical guarantees. In this work, we advocate the idea of replacing hand-crafted priors, such as sparsity, with a Generative Adversarial Network (GAN) to solve linear inverse problems such as com…

Cited by 119 publications (175 citation statements). References 34 publications.
“…A variety of follow-up works of [14] provided additional theoretical guarantees for compressive sensing with generative models. For example, instead of using a gradient descent algorithm as in [14], the authors of [16], [17] provide recovery guarantees for a projected gradient descent algorithm under various assumptions, where now the gradient steps are taken in the ambient space (i.e., on output vectors from the generative model)…”
Section: A. Related Work
Citation type: mentioning
confidence: 99%
“…In practice, such a minimization problem may be hard, so it was proposed to use gradient descent in the latent space (i.e., on input vectors to the generative model). A variety of follow-up works of [14] provided additional theoretical guarantees for compressive sensing with generative models. For example, instead of using a gradient descent algorithm as in [14], the authors of [16], [17] provide recovery guarantees for a projected gradient descent algorithm under various assumptions, where now the gradient steps are taken in the ambient space (i.e., on output vectors from the generative model).

In [14], the recovered signal is assumed to lie in (or close to) the range of the generative model G(·), which poses limitations for cases where the true signal is further from the range of G(·). To overcome this problem, a more general model allowing sparse deviations from the range of G(·) was introduced and analyzed in [18].

In the case that the generative model is a ReLU network, under certain assumptions on the layer sizes and randomness assumptions on the weights, the authors of [19] show that the non-convex objective function given by empirical risk minimization does not have any spurious stationary points, and accordingly establish a simple gradient-based algorithm that is guaranteed to find the global minimum.…”
Citation type: mentioning
confidence: 99%
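The two algorithms contrasted in this excerpt can be made concrete. Below is a minimal PyTorch sketch of the latent-space gradient descent attributed to [14]: it assumes a pretrained generator G mapping a k-dimensional latent vector to an n-dimensional signal and a measurement pair (A, y); the optimizer, step counts, and all names are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of latent-space gradient descent for compressive sensing
# with a generative prior, in the spirit of [14] as described above.
# Assumed setup: G is a pretrained generator R^k -> R^n, A is an m x n
# measurement matrix, and y = A @ x_true (possibly plus noise).
import torch

def recover(G, A, y, k, steps=1000, lr=1e-2):
    z = torch.randn(k, requires_grad=True)      # random latent initialization
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        x_hat = G(z)                            # candidate signal in R^n
        loss = torch.sum((A @ x_hat - y) ** 2)  # measurement misfit
        loss.backward()                         # gradient flows to z only
        opt.step()
    return G(z).detach()                        # recovered signal G(z*)
```

Because the gradient steps are taken on the input z, the iterates always stay in the range of G; this is exactly the sense in which the projected variant of [16], [17] differs, taking its gradient steps on the output side instead.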
“…We fixed the learning rate for projected gradient descent at 5e-3. Note that PGD has been successfully used in several applications involving GANs [8, 13, 52, 57]. The optimal projection is given by G(z*), where z* = arg min_z ‖x − G(z)‖. [Figure caption: each image is from a different run with the specified corruption setting, and we see that MimicGAN provides significantly better reprojection error in all cases above.]…”
Section: Experiments Setup
Citation type: mentioning
confidence: 75%
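The projection step in that quote, z* = arg min_z ‖x − G(z)‖, has no closed form for a deep generator and is typically approximated by an inner latent-space descent. Below is a minimal sketch of the full ambient-space PGD loop under that assumption; the 5e-3 learning rate mirrors the quoted setup, while the optimizer, iteration counts, and the operator A are illustrative.

```python
# Sketch of projected gradient descent (PGD) in the ambient space: take a
# gradient step on ||A x - y||^2 in x, then project back onto the range
# of G. The inner projection is approximated by latent-space descent; the
# 5e-3 rate follows the quoted setup, all other choices are assumptions.
import torch

def project(G, w, k, inner_steps=200, lr=5e-3):
    """Approximate G(z*) with z* = arg min_z ||w - G(z)||^2."""
    z = torch.randn(k, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(inner_steps):
        opt.zero_grad()
        loss = torch.sum((G(z) - w) ** 2)
        loss.backward()
        opt.step()
    return G(z).detach()

def pgd(G, A, y, k, n, outer_steps=30, eta=0.5):
    x = torch.zeros(n)
    for _ in range(outer_steps):
        w = x + eta * (A.T @ (y - A @ x))  # ambient-space gradient step
        x = project(G, w, k)               # project onto the range of G
    return x
```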
“…More specifically, this process involves sampling from the latent space of the GAN in order to find an image that resembles the seed image. Though this has conventionally been carried out using projected gradient descent (PGD) [5, 6], as we demonstrate in our results, this approach performs poorly when the initial estimate is too noisy or has too many artifacts, which is common under extremely limited-angle scenarios.…”
Section: Introduction
Citation type: mentioning
confidence: 88%
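Stripped of any measurement operator, the seed-image search described here is plain GAN inversion: find a latent vector whose decoded image matches the seed. A minimal sketch follows, where the random restarts are an assumed, commonly used hedge against the bad-initialization failure mode the quote points out, not a detail taken from this paper.

```python
# Sketch of GAN inversion to a seed image: search the latent space for z
# such that G(z) resembles x_seed. Random restarts are an assumed
# mitigation for sensitivity to initialization, not the paper's method.
import torch

def invert(G, x_seed, k, restarts=5, steps=500, lr=1e-2):
    best_img, best_loss = None, float("inf")
    for _ in range(restarts):
        z = torch.randn(k, requires_grad=True)
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = torch.sum((G(z) - x_seed) ** 2)  # pixel-space misfit
            loss.backward()
            opt.step()
        if loss.item() < best_loss:                 # keep the best restart
            best_img, best_loss = G(z).detach(), loss.item()
    return best_img
```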
“…and finally, X* = G(z*), which can be solved using stochastic gradient descent. This has been referred to as a GAN prior [6] or a manifold prior [3] for inverse imaging. However, solving an equation of the form in (1) is not always possible, since one may not have access to the measurement model A.…”
Section: Proposed Approach
Citation type: mentioning
confidence: 99%