2018
DOI: 10.1016/j.acha.2016.08.004

Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all

Abstract: Many inverse problems in signal processing deal with the robust estimation of unknown data from underdetermined linear observations. Low-dimensional models, when combined with appropriate regularizers, have been shown to be efficient at performing this task. Sparse models with the ℓ1-norm or low-rank models with the nuclear norm are examples of such successful combinations. Stable recovery guarantees in these settings have been established using a common tool adapted to each case: the notion of restricted isometry property. […]
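
To make the setting described in the abstract concrete, here is a minimal sketch (not from the paper; the problem sizes, step size and regularization weight are illustrative assumptions) of sparse recovery from underdetermined Gaussian measurements via ℓ1-regularized least squares, solved with the classical ISTA iteration:

# Minimal sketch (not from the paper): recover a sparse x0 from
# underdetermined Gaussian measurements y = A x0 by l1-regularized
# least squares, solved with ISTA (proximal gradient descent).
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 80, 5                 # ambient dim, measurements, sparsity (illustrative)

x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # scaling typical for RIP-type results
y = A @ x0

lam = 0.01                           # regularization weight (illustrative)
step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1/L, L = Lipschitz constant of the gradient

x = np.zeros(n)
for _ in range(2000):
    z = x - step * (A.T @ (A @ x - y))         # gradient step on 0.5 * ||Ax - y||^2
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold (prox of l1)

print("relative error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))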

Cited by 84 publications (91 citation statements)
References 40 publications
“…There have also been several theoretical extensions. First, generalizations of the RIP for the setting of asymptotic sparsity, asymptotic incoherence and multilevel random subsampling have been introduced and analysed in [9,60,80]. These complement the results proved in this paper by establishing uniform recovery guarantees.…”
Section: Relation To Other Work (mentioning)
confidence: 50%
“…(Structured sparsity, especially multiscale-type sparsity, also predates CS by some years, as in the work of Donoho and Huo [33], and finds use outside of CS, as in the work of Donoho and Kutyniok on geometric separation [34].) These include group, block, weighted and tree sparsity, amongst others (see [8,11,38,72,80] and references therein). In most of these works, structured sparsity is exploited by the design of the recovery algorithm (for example, by replacing the thresholding step in an iterative algorithm or the regularization functional in an optimization approach), with the sensing being carried out by a standard, incoherent operator (for example, a Gaussian random matrix).…”
Section: Relation To Other Work (mentioning)
confidence: 99%
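
The pattern this citation describes, where structure is exploited inside the recovery algorithm while the sensing operator remains a generic Gaussian matrix, can be illustrated as follows. This is a hypothetical sketch, not code from any cited work: the group layout, problem sizes and iteration count are assumptions, and a group-wise hard-thresholding projection plays the role of the "replaced thresholding step" in iterative hard thresholding (IHT):

# Hypothetical sketch: IHT with its thresholding step swapped for a
# group-sparse projection; the sensing matrix stays a plain Gaussian operator.
import numpy as np

rng = np.random.default_rng(1)
n, m, g, s = 120, 60, 12, 3          # dim, measurements, groups, active groups (illustrative)
groups = np.split(np.arange(n), g)   # 12 contiguous groups of 10 coordinates

x0 = np.zeros(n)
for gi in rng.choice(g, s, replace=False):
    x0[groups[gi]] = rng.standard_normal(len(groups[gi]))
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x0

def project_group_sparse(z, s):
    """Keep the s groups with the largest l2 energy, zero out the rest."""
    energies = np.array([np.linalg.norm(z[idx]) for idx in groups])
    out = np.zeros_like(z)
    for gi in np.argsort(energies)[-s:]:
        out[groups[gi]] = z[groups[gi]]
    return out

x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(500):
    x = project_group_sparse(x + step * (A.T @ (y - A @ x)), s)

print("relative error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))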
“…the $(n-1)$-dimensional Hausdorff measure of $T_R(\Sigma) \cap S(1)$). When maximizing this compliance measure over convex regularizers (proper, coercive and continuous), it has been shown that we can limit ourselves to atomic norms with atoms included in the model [11, Lemma 2.1]. When looking at non-uniform recovery with random Gaussian measurements, the quantity $\frac{\mathrm{vol}(T_R(x_0) \cap S(1))}{\mathrm{vol}(S(1))}$ represents the probability that a randomly oriented kernel of dimension 1 intersects (non-trivially) $T_R(x_0)$.”
Section: Definition 1.1 (Descent Vectors) (mentioning)
confidence: 99%
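
The volume ratio in this citation lends itself to a quick numerical illustration. In the hedged sketch below (the low dimension, the sparsity, the step ε and the sample count are all illustrative assumptions, and $T_R$ is taken to be the descent cone of the ℓ1 norm at a 1-sparse point), uniform random unit directions are sampled and those along which a small step does not increase the ℓ1 norm are counted, giving a Monte Carlo estimate of $\frac{\mathrm{vol}(T_R(x_0) \cap S(1))}{\mathrm{vol}(S(1))}$:

# Monte Carlo estimate (illustrative assumptions throughout) of the
# spherical volume fraction of the l1 descent cone at a sparse point x0.
import numpy as np

rng = np.random.default_rng(2)
n, k = 4, 1                          # kept deliberately small so the fraction is visible
x0 = np.zeros(n)
x0[:k] = 1.0                         # a simple k-sparse point

eps, trials = 1e-6, 500_000
d = rng.standard_normal((trials, n))
d /= np.linalg.norm(d, axis=1, keepdims=True)    # uniform directions on the sphere

# d lies in the descent cone iff a small step along d does not increase ||.||_1
descent = np.abs(x0 + eps * d).sum(axis=1) <= np.abs(x0).sum()
print("estimated descent-cone fraction of the sphere:", descent.mean())

In higher dimensions and for sparser models this fraction collapses towards zero, which is exactly why a randomly oriented low-dimensional kernel avoids the descent cone with high probability.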
“…In [11], an explicit constant $\delta_\Sigma^{\mathrm{suff}}(R)$ is given such that $\delta(M) < \delta_\Sigma^{\mathrm{suff}}(R)$ guarantees exact recovery of elements of $\Sigma$ by minimization (2). This constant is only sufficient (and sharp in some sense for sparse and low-rank recovery).”
Section: Compliance Measures Based On the RIP (mentioning)
confidence: 99%
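
For intuition on checking a condition of the form $\delta(M) < \delta_\Sigma^{\mathrm{suff}}(R)$, the sketch below draws random k-sparse unit vectors to obtain a Monte Carlo lower bound on the RIP constant $\delta(M)$ of a Gaussian matrix (computing it exactly is intractable in general, so sampling only bounds it from below) and compares it with $1/\sqrt{2}$, the sharp sufficient threshold reported in this line of work for sparse recovery by ℓ1 minimization; matrix sizes, sparsity and the trial count are illustrative assumptions:

# Illustrative Monte Carlo lower bound on the RIP constant of a Gaussian
# matrix over random k-sparse unit vectors (random sampling cannot certify
# an upper bound; this only illustrates the quantity being compared).
import numpy as np

rng = np.random.default_rng(3)
n, m, k, trials = 200, 120, 4, 20_000
M = rng.standard_normal((m, n)) / np.sqrt(m)

worst = 0.0
for _ in range(trials):
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    x /= np.linalg.norm(x)                        # unit-norm k-sparse test vector
    worst = max(worst, abs(np.linalg.norm(M @ x) ** 2 - 1.0))

print(f"empirical lower bound on delta(M): {worst:.3f}")
print(f"sufficient threshold 1/sqrt(2):    {1 / np.sqrt(2):.3f}")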