2019
DOI: 10.48550/arxiv.1908.05368
Preprint

Robust One-Bit Recovery via ReLU Generative Networks: Near-Optimal Statistical Rate and Global Landscape Analysis

Shuang Qiu,
Xiaohan Wei,
Zhuoran Yang

Abstract: We study the robust one-bit compressed sensing problem, whose goal is to design an algorithm that faithfully recovers any sparse target vector θ_0 ∈ ℝ^d uniformly from m quantized noisy measurements. Under the assumption that the measurements are sub-Gaussian random vectors, to recover any k-sparse θ_0 (k ≪ d) uniformly up to an error ε with high probability, the best known computationally tractable algorithm requires m ≥ O(k log d/ε^4) measurements. In this paper, we consider a new framework for the one-bit sensi…
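To make the setup in the abstract concrete, here is a minimal sketch of robust one-bit compressed sensing: sign observations y_i = sign(⟨a_i, θ_0⟩ + noise) of a k-sparse unit-norm vector, recovered with a simple hard-thresholded correlation baseline. The dimensions, noise level, and the baseline estimator are illustrative assumptions, not the algorithm proposed in this paper.

```python
import numpy as np

# One-bit compressed sensing sketch: observe only the signs of noisy linear
# measurements of a k-sparse unit-norm target, then estimate its direction.
rng = np.random.default_rng(0)
d, k, m = 200, 5, 2000              # ambient dim, sparsity, number of measurements

# k-sparse unit-norm target theta_0
theta0 = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
theta0[support] = rng.standard_normal(k)
theta0 /= np.linalg.norm(theta0)

# sub-Gaussian (here Gaussian) measurement vectors and noisy one-bit observations
A = rng.standard_normal((m, d))
noise = 0.1 * rng.standard_normal(m)
y = np.sign(A @ theta0 + noise)

# baseline estimator (not the paper's method): correlate, keep the k largest
# entries in magnitude, renormalize to the unit sphere
corr = A.T @ y / m
est = np.zeros(d)
top_k = np.argsort(np.abs(corr))[-k:]
est[top_k] = corr[top_k]
est /= np.linalg.norm(est)

print("estimation error:", np.linalg.norm(est - theta0))
```

Since one-bit measurements discard the scale of θ_0, only its direction (a unit-norm vector) is recoverable, which is why both the target and the estimate are normalized.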

Cited by 6 publications (9 citation statements); References 24 publications.
“…Recently, in the wake of successes of deep learning, deep generative networks have gained popularity as a novel approach to encoding and enforcing priors. They have been successfully used as a prior for various statistical estimation problems such as compressed sensing [13], blind deconvolution [5], inpainting [61], and many more [54,60,51,59], etc.…”
Section: Recovery With a Generative Network Prior
confidence: 99%
“…Parallel to these empirical successes, a recent line of works has investigated theoretical guarantees for various statistical estimation tasks with generative network priors. Following the work of [13], [29] have given global guarantees for compressed sensing, followed then by many others for various inverse problems [53,44,25,6,51]. In particular, [27] have shown that m = Ω(k log n) measurements are sufficient to recover a signal from random phaseless observations, assuming that the signal is the output of a trained generative network with latent space of dimension k. Note that, contrary to the sparse phase retrieval problem, generative priors for phase retrieval allow optimal sample complexity, up to logarithmic factors, with respect to the intrinsic dimension of the signal.…”
Section: Recovery With a Generative Network Prior
confidence: 99%
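The passage above describes phase retrieval with a generative prior: recover z* = G(x*) from magnitude-only measurements b = |A z*| by searching over the latent space of G. Below is a minimal sketch of that formulation under illustrative assumptions (a small random two-layer ReLU generator and a generic quasi-Newton solver); it is not the estimator analyzed in [27].

```python
import numpy as np
from scipy.optimize import minimize

# Phase retrieval with a generative prior (sketch): minimize || |A G(x)| - b ||^2
# over the latent x of a small random ReLU generator G.
rng = np.random.default_rng(1)
k, n1, n, m = 4, 32, 64, 200        # latent dim, hidden width, signal dim, #measurements

W1 = rng.standard_normal((n1, k)) / np.sqrt(n1)
W2 = rng.standard_normal((n, n1)) / np.sqrt(n)

def G(x):
    """Two-layer ReLU generative network G(x) = ReLU(W2 ReLU(W1 x))."""
    return np.maximum(W2 @ np.maximum(W1 @ x, 0.0), 0.0)

A = rng.standard_normal((m, n))
x_star = rng.standard_normal(k)
b = np.abs(A @ G(x_star))           # phaseless (magnitude-only) measurements

def objective(x):
    return np.sum((np.abs(A @ G(x)) - b) ** 2) / m

# multi-start search over the latent space; BFGS uses numerical gradients here
best = min((minimize(objective, rng.standard_normal(k), method="BFGS")
            for _ in range(5)), key=lambda r: r.fun)
print("relative signal error:",
      np.linalg.norm(G(best.x) - G(x_star)) / np.linalg.norm(G(x_star)))
```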
“…In addition, there has been follow-up research on variations of the CS-DGP problem described above. The WDC is a critical assumption that enables efficient recovery in the setting of Gaussian noise [10], as well as global landscape analysis in the settings of phaseless measurements [8], one-bit (sign) measurements [17], two-layer convolutional neural networks [15], and low-rank matrix recovery [5]. Moreover, there are currently no known theoretical results in this area (compressed sensing with generative priors) that avoid the WDC: hence, until now, logarithmic expansion was necessary to achieve any provable guarantees.…”
Section: Technical Contributions
confidence: 99%
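The logarithmic expansion referred to above is the requirement, common in this line of work, that each layer be wider than the previous one by a logarithmic factor, under which the Weight Distribution Condition (WDC) holds with high probability for Gaussian weights. The sketch below encodes the commonly stated form n_i ≥ c · n_{i-1} · log n_{i-1} and the 1/n_i variance scaling; the constant c is an illustrative placeholder for an unspecified universal constant.

```python
import numpy as np

def is_log_expansive(widths, c=1.0):
    """Check the logarithmic expansion condition n_i >= c * n_{i-1} * log(n_{i-1})
    on a list of layer widths [k, n_1, ..., n_d].  The constant c here is
    illustrative; the cited analyses require a sufficiently large universal constant."""
    return all(n_i >= c * n_prev * np.log(n_prev)
               for n_prev, n_i in zip(widths[:-1], widths[1:]))

def sample_gaussian_layers(widths, seed=0):
    """Sample weight matrices W_i with i.i.d. N(0, 1/n_i) entries, the scaling
    under which the WDC is shown to hold with high probability in these works."""
    rng = np.random.default_rng(seed)
    return [rng.standard_normal((n_i, n_prev)) / np.sqrt(n_i)
            for n_prev, n_i in zip(widths[:-1], widths[1:])]

widths = [10, 50, 400]              # latent dim k, then logarithmically expanding widths
print(is_log_expansive(widths))     # True for this (illustrative) choice
```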
“…One-bit recovery with a neural network prior, introduced in [17], has the following formal statement. Let G : ℝ^k → ℝ^n be a neural network of the form G(x) = ReLU(W^(d)(…
Section: D2 One-bit Recovery
confidence: 99%
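To illustrate the one-bit-recovery-with-generative-prior setup quoted above, here is a minimal sketch: G is a small random ReLU network and we observe only y = sign(A G(x*)). The two-layer generator, the Gaussian sensing matrix, and the correlation-maximization surrogate loss below are illustrative assumptions, not the estimator analyzed in [17].

```python
import numpy as np
from scipy.optimize import minimize

# One-bit recovery with a ReLU generative prior (sketch):
# G : R^k -> R^n, observations y = sign(A G(x*)), recovery by searching the latent space.
rng = np.random.default_rng(2)
k, n1, n, m = 4, 32, 64, 400

W1 = rng.standard_normal((n1, k)) / np.sqrt(n1)
W2 = rng.standard_normal((n, n1)) / np.sqrt(n)

def G(x):
    """ReLU generative network G(x) = ReLU(W2 ReLU(W1 x))."""
    return np.maximum(W2 @ np.maximum(W1 @ x, 0.0), 0.0)

A = rng.standard_normal((m, n))
x_star = rng.standard_normal(k)
y = np.sign(A @ G(x_star))                      # one-bit (sign) measurements

def loss(x):
    # surrogate objective: maximize correlation between the observed signs and
    # A G(x), with a norm penalty since one-bit data discards the signal scale
    return -(y @ (A @ G(x))) / m + 0.5 * np.sum(x ** 2)

res = min((minimize(loss, rng.standard_normal(k), method="BFGS")
           for _ in range(5)), key=lambda r: r.fun)
z_hat, z_star = G(res.x), G(x_star)
print("cosine similarity:",
      (z_hat @ z_star) / (np.linalg.norm(z_hat) * np.linalg.norm(z_star) + 1e-12))
```

Because the sign observations determine G(x*) only up to scale, recovery quality is reported as the cosine similarity between the estimated and true signals.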