2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)
DOI: 10.1109/allerton.2013.6736635
The squared-error of generalized LASSO: A precise analysis

Abstract: We consider the problem of estimating an unknown but structured signal $x_0$ from its noisy linear observations $y = Ax_0 + z \in \mathbb{R}^m$. To the structure of $x_0$ is associated a structure-inducing convex function $f(\cdot)$. We assume that the entries of $A$ are i.i.d. standard normal $\mathcal{N}(0,1)$ and $z \sim \mathcal{N}(0, \sigma^2 I_m)$. As a measure of performance of an estimate $x^*$ of $x_0$ we consider the "Normalized Square Error" (NSE) $\|x^* - x_0\|_2^2 / \sigma^2$. For sufficiently small $\sigma$, we characterize the exact performance of two different versions of …
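To make the setup concrete, here is a minimal sketch (illustrative, not the authors' code) of one version of the generalized LASSO in the Gaussian model above, taking $f(\cdot) = \|\cdot\|_1$ as the structure-inducing function for a sparse $x_0$; the problem sizes, the solver (cvxpy), and the penalty weight lam are assumptions for illustration.

# Minimal sketch (illustrative, not the paper's code) of the model in the
# abstract, with f(.) = l1-norm as the structure-inducing convex function.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k, sigma = 200, 100, 10, 0.05    # assumed problem sizes / noise level

# k-sparse ground truth x0, i.i.d. standard normal A, noise z ~ N(0, sigma^2 I_m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
y = A @ x0 + sigma * rng.standard_normal(m)

# One version of the generalized LASSO: minimize ||y - Ax||_2 + lam * f(x)
x = cp.Variable(n)
lam = 3.0                              # assumed penalty weight
cp.Problem(cp.Minimize(cp.norm(y - A @ x, 2) + lam * cp.norm(x, 1))).solve()

nse = np.sum((x.value - x0) ** 2) / sigma**2   # Normalized Square Error
print(f"NSE = {nse:.2f}")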

Cited by 80 publications (123 citation statements); references 72 publications (234 reference statements). Citing publications span 2014–2024.
“…On the other hand, when there is noise, as the restricted eigenvalue $\min_{a \in \mathcal{D}_1(x_0)} \|Aa\|_2$ gets larger it is known that the recovery is more robust and one suffers less error, [4], [12], [14]. Applying Theorem 2 on the set $\mathcal{D}_1(x_0)$ ensures that restricted eigenvalues of URS's are as good as i.i.d.…”
Section: IEEE International Symposium on Information Theory (mentioning)
confidence: 97%
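To see why a larger restricted eigenvalue means less error, a standard one-line argument (along the lines of [4], [12], [14]; a sketch, not a claim from the quoted paper) for the constrained estimator $x^* = \arg\min \|y - Ax\|_2$ subject to $f(x) \le f(x_0)$ goes as follows: the error $h = x^* - x_0$ is a descent direction of $f$ at $x_0$, and optimality of $x^*$ gives $\|Ah\|_2 \le \|y - Ax^*\|_2 + \|y - Ax_0\|_2 \le 2\|z\|_2$, so

% Sketch: error bound from the restricted eigenvalue
% kappa := min of ||Aa||_2 over unit-norm descent directions a in D_1(x_0).
\[
  \|x^* - x_0\|_2 \;\le\; \frac{\|A(x^* - x_0)\|_2}{\kappa}
                  \;\le\; \frac{2\|z\|_2}{\kappa},
\]
% which shrinks as the restricted eigenvalue kappa grows.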
“…This showed that previous results were tight. A line of work by Thrampoulidis, Oymak, and Hassibi [33,34,42] concentrated on the precise reconstruction error from noisy observations, and also considered unconstrained versions of the K-Lasso. Our theoretical results in the non-linear case can be seen to mirror [33, Theorem 1] in the linear case.…”
Section: Related Literature (mentioning)
confidence: 99%
“…Predating, but especially following, the works in compressed sensing, there have also been several works which tackle the general case, giving results for arbitrary T [11,25,17,16,1,3,20,21]. The deviation inequalities of this paper allow for a general treatment as well.…”
Section: 6 (mentioning)
confidence: 99%