2012
DOI: 10.1109/tit.2012.2189196

Subspace Methods for Joint Sparse Recovery

Abstract: We propose robust and efficient algorithms for the joint sparse recovery problem in compressed sensing, which simultaneously recover the supports of jointly sparse signals from their multiple measurement vectors obtained through a common sensing matrix. In a favorable situation, the unknown matrix, which consists of the jointly sparse signals, has linearly independent nonzero rows. In this case, the MUSIC (MUltiple SIgnal Classification) algorithm, originally proposed by Schmidt for the direction of arrival prob…
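The favorable full-rank case described in the abstract admits a very compact MUSIC-style recovery rule: estimate the signal subspace from the measurements and keep the k columns of the sensing matrix that best align with it. The sketch below is an illustrative reconstruction under that assumption, not the algorithm as specified in the paper; all function and variable names are ours.

```python
# A minimal sketch of MUSIC-style support identification for the
# multiple-measurement-vector (MMV) model Y = A X, assuming the k nonzero
# rows of X are linearly independent (the "favorable" full-rank case).
import numpy as np

def music_support(Y, A, k):
    """Return the k column indices of A closest to the signal subspace of Y."""
    # Estimate the k-dimensional signal subspace from the left singular
    # vectors of the measurement matrix Y (m x N).
    U, _, _ = np.linalg.svd(Y, full_matrices=False)
    Us = U[:, :k]                      # orthonormal basis of the signal subspace
    # MUSIC criterion: fraction of each (normalized) column's energy that
    # lies inside the signal subspace; equals 1 for true support atoms in
    # the noiseless full-rank case.
    cols = A / np.linalg.norm(A, axis=0, keepdims=True)
    score = np.linalg.norm(Us.T @ cols, axis=0) ** 2
    return np.sort(np.argsort(score)[-k:])    # indices of the k largest scores
```

In the noiseless full-rank case the range of Y equals the range of the support columns of A, so the true support columns attain the maximal score and are selected exactly.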

Citations: cited by 203 publications (211 citation statements)
References: 80 publications
“…The elements of a sensing matrix A were generated from a Gaussian distribution having zero mean and variance of 1/m, and then each column of A was normalized to have a unit norm. An unknown signal X with rank(X) = r ≤ k was generated using the same procedure as in [12]. Specifically, we randomly generated a support I, and then the corresponding nonzero signal components were obtained by…”
Section: A. Dependency on Snapshot Number (mentioning)
confidence: 99%
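The quoted setup can be reproduced in a few lines. The sketch below follows the stated construction of A (i.i.d. Gaussian with variance 1/m, unit-norm columns) and the random support of size k; since the quote is truncated before the construction of the nonzero components, the low-rank Gaussian factorization used for the nonzero rows is an assumption, and all names are illustrative.

```python
# A hedged sketch of the test-data generation described in the quoted passage.
import numpy as np

def generate_problem(m, n, N, k, r, rng=np.random.default_rng()):
    # Sensing matrix: i.i.d. N(0, 1/m) entries, then unit-norm columns.
    A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
    A /= np.linalg.norm(A, axis=0, keepdims=True)
    # Random support of size k; the k nonzero rows of X have rank r <= k.
    support = rng.choice(n, size=k, replace=False)
    X = np.zeros((n, N))
    # Rank-r nonzero block via a Gaussian factorization (assumed, not quoted).
    X[support, :] = rng.standard_normal((k, r)) @ rng.standard_normal((r, N))
    return A, X, np.sort(support)
```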
“…In order to identify the contribution of the forward and backward greedy steps to the performance improvement, we perform additional experiments using the same simulation setup. [...] estimated using an identical subspace S-OMP algorithm in [7], [12], so that performance differences came only from the sequential subspace estimation step. For larger N, where the signal subspace error is small, the performance improvement due to the sequential subspace estimation was not remarkable.…”
Section: A. Dependency on Snapshot Number (mentioning)
confidence: 99%
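For context, a minimal sketch of plain simultaneous OMP (S-OMP), the greedy baseline underlying the algorithms mentioned in the quote, is given below. The subspace S-OMP of [7], [12] and the sequential subspace-estimation step discussed above are not reproduced here; this is only the standard greedy loop, with illustrative names.

```python
# A minimal sketch of standard simultaneous OMP (S-OMP) for the MMV model.
import numpy as np

def somp(Y, A, k):
    """Greedily pick k columns of A that best match the residual across snapshots."""
    support, R = [], Y.copy()
    for _ in range(k):
        # Correlation of every column with the residual, aggregated over
        # snapshots via the row-wise l2 norm.
        corr = np.linalg.norm(A.T @ R, axis=1)
        corr[support] = -np.inf            # do not reselect chosen atoms
        support.append(int(np.argmax(corr)))
        # Least-squares fit on the current support, then residual update.
        sol, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ sol
    return sorted(support)
```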