2014 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2014.6875045

Information-theoretic bounds for adaptive sparse recovery

Abstract: We derive an information-theoretic lower bound on the sample complexity of sparse recovery problems in which inputs can be chosen sequentially and adaptively. The lower bound is expressed in terms of a simple mutual information quantity and unifies many different linear and nonlinear observation models. Using this formula we derive bounds for adaptive compressive sensing (CS), group testing, and 1-bit CS problems. We show that adaptivity cannot decrease sample complexity in group testing, 1-bit CS, and CS with linea…
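
The bound the abstract refers to is stated only in summary form here. As a hedged sketch in generic notation (not the paper's exact statement), Fano-type lower bounds for sparse recovery typically take the following shape, with beta the unknown k-sparse vector in dimension N, X_i the (possibly adaptively chosen) inputs, Y^n the n observations, and P_e the error probability:

```latex
% Hedged sketch in generic notation, not the paper's exact statement.
\begin{align*}
(1 - P_e)\log\binom{N}{k} - 1
  &\le I(\beta; Y^n)
   \le \sum_{i=1}^{n} I(\beta; Y_i \mid X_i)
   \le n \max_{x} I(\beta; Y \mid X = x) \\
\Longrightarrow\quad
n &\ge \frac{(1 - P_e)\log\binom{N}{k} - 1}{\max_{x} I(\beta; Y \mid X = x)}.
\end{align*}
```

The chain-rule step is where adaptivity enters: when choosing X_i based on past observations cannot raise the per-sample mutual information, adaptivity cannot reduce the sample complexity, consistent with the abstract's claim for group testing and 1-bit CS.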

Cited by 15 publications (34 citation statements). References 19 publications.
“…The idea of outer approximation is to generate a sequence of cutting planes that approximate the cost function via its subgradient, and to iteratively include these cutting planes as constraints in the original optimization problem. In particular, we initialize by solving the following optimization problem (14), where auxiliary variables are introduced and a user-specified upper bound bounds the cost function over the feasible region. The constraint of the above optimization problem can be cast into matrix-vector form as follows:…”
Section: Theorem IV.5 (Upper Bound on … of Greedy Heuristic Algorithm F…)
confidence: 99%
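
The cutting-plane loop described in this excerpt can be sketched compactly. The following is a minimal, self-contained illustration of Kelley-style outer approximation (the cost function, bounds, and tolerance are placeholders, not values from the citing paper): each iteration adds the linearization t >= f(x_k) + g(x_k)(x - x_k) as a constraint and re-solves the resulting LP.

```python
# Minimal sketch of a Kelley-style cutting-plane / outer-approximation loop.
# The cost f, its subgradient g, and the bounds are illustrative placeholders.
import numpy as np
from scipy.optimize import linprog

f = lambda x: (x - 1.0) ** 2 + abs(x)       # convex cost function
g = lambda x: 2.0 * (x - 1.0) + np.sign(x)  # a subgradient of f

lo, hi, U = -5.0, 5.0, 1e3   # feasible interval and cost upper bound
cuts = []                    # each cut: t >= a*x + b (a linearization of f)
xk = 0.5 * (lo + hi)

for _ in range(30):
    cuts.append((g(xk), f(xk) - g(xk) * xk))
    # Variables (x, t): minimize t subject to a*x - t <= -b for every cut.
    A_ub = np.array([[a, -1.0] for a, b in cuts])
    b_ub = np.array([-b for a, b in cuts])
    res = linprog(c=[0.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(lo, hi), (-U, U)])
    xk, t = res.x
    if f(xk) - t < 1e-8:     # gap between true cost and LP model closed
        break

print(f"minimizer ~ {xk:.4f}, value ~ {f(xk):.4f}")
```

Because each cut underestimates the convex cost, the LP value t is a certified lower bound, so the loop stops with a provable optimality gap.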
“…Although in the seminal work of [11] it was shown under fairly general assumptions that "adaptivity does not help much", i.e., that sequential adaptive compressed sensing does not improve the order of the min-max bounds obtained by algorithms, these limitations are restricted to certain performance metrics. It has also been recognized (see, e.g., [12]-[14]) that adaptive compressed sensing offers several benefits with respect to other performance metrics, such as a reduction in the signal-to-noise ratio (SNR) required to recover the signal. Moreover, a larger performance gain can be achieved by adaptive compressed sensing if we aim to recover a "family" of signals with known statistical prior information (incorporating statistical priors in compressed sensing has been considered in [15] for the non-sequential setting and in [16] for the sequential setting, using Bayesian methods).…”
Section: Introduction
confidence: 99%
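
The SNR benefit of adaptivity mentioned in the passage above can be seen in a toy experiment. The sketch below assumes direct noisy coordinate observations under a total sensing-energy budget (an illustrative model, not the one used in [11]-[14]): a two-stage scheme spends half the budget shortlisting candidate coordinates, then concentrates the remainder on them.

```python
# Toy two-stage adaptive sensing experiment; the observation model and all
# constants are illustrative assumptions, not taken from the cited works.
import numpy as np

rng = np.random.default_rng(0)
N, k, E = 1000, 10, 4000.0          # dimension, sparsity, total energy budget
x = np.zeros(N)
support = rng.choice(N, k, replace=False)
x[support] = 3.0

def observe(energy):
    # y_i = sqrt(e_i) * x_i + unit-variance Gaussian noise
    return np.sqrt(energy) * x + rng.standard_normal(N)

# Non-adaptive benchmark: the full budget spread uniformly over N coordinates.
snr_nonadaptive = (E / N) * x[support] ** 2

# Two-stage adaptive: half the budget to shortlist, half concentrated on it.
e1 = np.full(N, (E / 2) / N)
y1 = observe(e1)
shortlist = np.argsort(-np.abs(y1))[: 4 * k]
e2 = np.zeros(N)
e2[shortlist] = (E / 2) / len(shortlist)
snr_adaptive = (e1[support] + e2[support]) * x[support] ** 2

print("mean per-coordinate SNR, non-adaptive:", snr_nonadaptive.mean())
print("mean per-coordinate SNR, adaptive    :", snr_adaptive.mean())
```

Concentrating the second-stage energy on roughly 4k of N coordinates boosts the per-coordinate SNR on the true support by about N/(8k) in this setup, even though the total measurement budget is unchanged.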
“…To use (25) and (27) in a way similar to that of (20) and (26), we can make a similar assumption as above (i.e., zero correlation and constant ℓ2 norm for each row of A) and derive an approximate analytical expression for (27):…”
Section: S(C)NR Represents Per-Sample Signal (Plus Clutter) to Noise Ratio
confidence: 99%
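
The assumption invoked in this excerpt (zero correlation and constant ℓ2 norm for each row of A) is easy to verify numerically. A small sketch with placeholder dimensions:

```python
# Check the two properties assumed of the sensing matrix A: constant l2 norm
# per row and near-zero cross-correlation. Dimensions are placeholders.
import numpy as np

rng = np.random.default_rng(1)
m, n = 64, 256
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=1, keepdims=True)   # force unit l2 norm per row

row_norms = np.linalg.norm(A, axis=1)
gram = A @ A.T                                  # pairwise row correlations
off_diag = gram[~np.eye(m, dtype=bool)]

print("row norms: min=%.3f max=%.3f" % (row_norms.min(), row_norms.max()))
print("mean |off-diagonal correlation| = %.4f" % np.abs(off_diag).mean())
# Off-diagonal entries concentrate near 0 at rate ~ 1/sqrt(n), the regime in
# which approximate analytical SNR expressions of this kind are derived.
```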
“…As mentioned previously, informational analysis was performed based on simulated radar scenes X and echo measurements Y. We used the simulated scene and echo data to perform information-theoretic graphing of R_X(D) using Equation (14), and of the trans-information I_ub(X; Y | A) using (20) and I_ub1(X; Y | A) using (25). We applied the inequalities in (26) and (27) to determine minimal undersampling ratios for signal reconstruction given certain values of sparsity, TBRs, noise, and distortion D. These are explained one by one in the following sub-sections.…”
Section: Simulation of Sparse Scenes and Noisy Echo Data
confidence: 99%
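
Equations (14), (20), and (25)-(27) of the citing paper are not reproduced here. As a hedged stand-in that mirrors the logic of the excerpt, the sketch below pairs the classic Gaussian rate-distortion function with a per-measurement information cap to lower-bound the undersampling ratio m/N at a given distortion D:

```python
# Generic rate-distortion vs. per-measurement-information bound; this is an
# illustrative stand-in, not the citing paper's equations (14), (20), (25)-(27).
import numpy as np

def rate_distortion_gaussian(var, D):
    # R(D) = 0.5 * log2(var / D) bits per source symbol, valid for D < var
    return 0.5 * np.log2(var / D)

def capacity_awgn(snr):
    # C = 0.5 * log2(1 + SNR) bits per real-valued measurement
    return 0.5 * np.log2(1.0 + snr)

var, snr = 1.0, 100.0
for D in (0.5, 0.1, 0.01):
    # Each measurement conveys at most C bits about the scene, so
    # reconstruction at distortion D needs m/N >= R(D) / C.
    ratio = rate_distortion_gaussian(var, D) / capacity_awgn(snr)
    print(f"D={D:5.2f}: m/N >= {ratio:.3f}")
```

Tightening D (or lowering the SNR) raises the required undersampling ratio, which is the qualitative trade-off the excerpt's sub-sections examine against sparsity, TBR, and noise.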