2009 IEEE International Conference on Acoustics, Speech and Signal Processing
DOI: 10.1109/icassp.2009.4960261

Strong thresholds for ℓ2/ℓ1-optimization in block-sparse compressed sensing

Abstract: It has been known for a while that ℓ1-norm relaxation can in certain cases solve an under-determined system of linear equations. Recently, [5,10] proved (in a large dimensional and statistical context) that if the number of equations (measurements in the compressed sensing terminology) in the system is proportional to the length of the unknown vector, then there is a sparsity (number of non-zero elements of the unknown vector), also proportional to the length of the unknown vector, such that ℓ1-norm relaxation su…
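As a concrete illustration of the ℓ1-norm relaxation the abstract refers to, here is a minimal sketch (not the paper's code; the use of cvxpy, the Gaussian measurement matrix, and the problem sizes are all assumptions for illustration):

```python
# Minimal sketch of l1-norm relaxation (basis pursuit) for an
# under-determined linear system Ax = y. Illustrative assumptions:
# Gaussian A, exact measurements, cvxpy as the solver front end.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10                 # signal length, measurements, sparsity

x_true = np.zeros(n)                  # k-sparse ground truth
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # m < n: under-determined
y = A @ x_true

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm(x, 1)), [A @ x == y])
prob.solve()

print("recovery error:", np.linalg.norm(x.value - x_true))
```

When the number of measurements m is large enough relative to the sparsity k (the proportional regime the abstract describes), the ℓ1 minimizer typically coincides with x_true.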

Cited by 14 publications (38 citation statements)
References: 29 publications
“…3.13] Our applications focus on these four instances, but a dizzying variety of other structure-promoting atomic gauges are available. For example, there are atomic gauges for vectors that are sparse in a dictionary (also known as analysis-sparsity) [34], [11], block- and group-sparse vectors [20], [72], and low-rank tensors and probability measures [15, Sec. 2].…”
Section: Structured Signals and Atomic Gauges (mentioning, confidence: 99%)
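For context, the "atomic gauge" in this excerpt is usually defined as follows (the standard definition from the atomic-norm literature, not text quoted from the cited sources):

```latex
% Atomic gauge (atomic norm) induced by an atom set \mathcal{A}:
% the smallest scaling t that places x inside t * conv(A).
\|\mathbf{x}\|_{\mathcal{A}} \;=\; \inf\bigl\{\, t > 0 \;:\; \mathbf{x} \in t \cdot \operatorname{conv}(\mathcal{A}) \,\bigr\}
```

Choosing the atoms as signed 1-sparse vectors recovers the ℓ1 norm; choosing them as unit-norm vectors supported on a single block yields the ℓ2/ℓ1 mixed norm studied in this paper.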
“…Dynamic extensions of these ideas should prove to be useful in applications where the structure is present and changes slowly. Two common examples of structured sparsity are block sparsity [114], [115] and tree-structured sparsity (for wavelet coefficients) [116]. A length-m vector is block sparse if it can be partitioned into length-k blocks such that many of the blocks are entirely zero.…”
Section: A. Signal Models, 1) Structured Sparsity (mentioning, confidence: 99%)
“…A length-m vector is block sparse if it can be partitioned into length-k blocks such that many of the blocks are entirely zero. One way to recover block sparse signals is by solving the ℓ2/ℓ1 minimization problem [114], [115]. Block sparsity is valid for many applications, e.g., for the foreground image sequence of a video consisting of one or a few moving objects, or for the activation regions in brain fMRI.…”
Section: A. Signal Models, 1) Structured Sparsity (mentioning, confidence: 99%)
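The ℓ2/ℓ1 minimization mentioned in this excerpt replaces the ℓ1 norm with a sum of per-block ℓ2 norms. A minimal sketch, assuming cvxpy and an illustrative block layout (neither taken from the cited papers):

```python
# Sketch of l2/l1 (mixed-norm) recovery of a block-sparse signal.
# Assumptions for illustration: equal-length blocks, Gaussian A,
# noiseless measurements, cvxpy as the solver front end.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n_blocks, blk = 40, 5                 # 40 blocks of length 5
n = n_blocks * blk                    # total signal length
m = 90                                # number of measurements

x_true = np.zeros(n)
for b in rng.choice(n_blocks, size=4, replace=False):  # 4 active blocks
    x_true[b * blk:(b + 1) * blk] = rng.standard_normal(blk)

A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

x = cp.Variable(n)
# Objective: sum over blocks of each block's l2 norm (the l2/l1 norm).
objective = sum(cp.norm(x[b * blk:(b + 1) * blk], 2) for b in range(n_blocks))
prob = cp.Problem(cp.Minimize(objective), [A @ x == y])
prob.solve()

print("recovery error:", np.linalg.norm(x.value - x_true))
```

The block-wise ℓ2 terms encourage entire blocks to be zero, which is why this program outperforms plain ℓ1 minimization when the nonzeros genuinely cluster into blocks.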
“…To exploit the block structure of ideally block-sparse signals, ℓ2/ℓ1 optimization was proposed. The standard block sparse constraint (SBSC) in the form of ℓ2/ℓ1 optimization can be formulated as [35], [36]:…”
Section: A. Wideband Spectrum Sensing for Fixed Spectrum Allocation (mentioning, confidence: 99%)
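The excerpt is truncated before the formula it introduces; that elision is left in place. For reference, the ℓ2/ℓ1 program named in this paper's title is conventionally written as below (standard textbook form; the block notation X_i is an assumption, and this is not necessarily the exact statement of [35], [36]):

```latex
% l2/l1 optimization over a vector x partitioned into n blocks
% X_1, ..., X_n, with measurement matrix A and observations y:
\min_{\mathbf{x}} \; \sum_{i=1}^{n} \|\mathbf{X}_i\|_2
\quad \text{subject to} \quad A\mathbf{x} = \mathbf{y}
```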