2013
DOI: 10.1109/tit.2013.2238605

Support Recovery of Sparse Signals in the Presence of Multiple Measurement Vectors

Cited by 45 publications (29 citation statements)
References 65 publications
“…In other words, DCS cannot accurately characterize the relationship between L and the performance of solvers such as SOMP. [9,10,11] focus on the performance analysis based on Eq. (2) and show that the performance is proportional to rank(Y) with noiseless measurements.…”
Section: Background and Related Work (mentioning)
confidence: 99%
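
The excerpt above refers to SOMP (Simultaneous Orthogonal Matching Pursuit), a greedy solver for the MMV model Y = A X with row-sparse X, whose support-recovery behaviour the cited analyses relate to rank(Y). The following is a minimal sketch of SOMP under that model; the function name, the fixed-iteration stopping rule, and the toy problem sizes are illustrative choices and not taken from the cited papers.

```python
# Minimal sketch of Simultaneous OMP (SOMP) for the MMV model Y ≈ A @ X,
# where X is row-sparse. Illustrative only; not the exact algorithm of any cited work.
import numpy as np

def somp(A, Y, k):
    """Greedy joint support recovery: pick k columns of A that jointly explain Y."""
    support = []
    R = Y.copy()                             # residual, shape (m, L)
    for _ in range(k):
        # Aggregate correlation of every atom with all residual columns.
        scores = np.linalg.norm(A.T @ R, axis=1)
        scores[support] = -np.inf            # never reselect an atom
        support.append(int(np.argmax(scores)))
        # Least-squares fit on the current support, then update the residual.
        As = A[:, support]
        X_s, *_ = np.linalg.lstsq(As, Y, rcond=None)
        R = Y - As @ X_s
    return sorted(support)

# Toy usage: a row-sparse X observed through L = 4 measurement vectors.
rng = np.random.default_rng(0)
m, n, k, L = 30, 60, 4, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
true_support = np.sort(rng.choice(n, size=k, replace=False))
X = np.zeros((n, L))
X[true_support] = rng.standard_normal((k, L))
Y = A @ X
print(true_support.tolist(), somp(A, Y, k))
```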
“…Common choices of mixed norms are the ℓ2,1 norm [20], [37] and the ℓ∞,1 norm [21], [22]. Similar to the SMV case, recovery guarantees for the MMV-based joint SSR problem have been derived [26]-[28], providing conditions for the noiseless case under which the sparse signal matrix X can be perfectly reconstructed. Moreover, it has been shown that rank-awareness in the signal reconstruction can additionally improve the reconstruction performance [29].…”
Section: Sparse Representation and Mixed-norm Minimization (mentioning)
confidence: 99%
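
The mixed-norm relaxation mentioned in the excerpt can be illustrated with the ℓ2,1-regularized least-squares problem: minimize over X the objective 0.5‖Y − A X‖_F² + λ Σ_i ‖X_i,:‖₂, whose proximal operator is row-wise group soft-thresholding. Below is a minimal proximal-gradient (ISTA-style) sketch; the step size, regularization weight, and iteration budget are illustrative and not taken from the cited references.

```python
# Minimal proximal-gradient (ISTA-style) sketch for l2,1-regularized MMV recovery:
#   minimize_X  0.5 * ||Y - A X||_F^2 + lam * sum_i ||row_i(X)||_2
# Step size, lam, and the iteration budget are illustrative choices.
import numpy as np

def row_soft_threshold(X, tau):
    """Prox of tau * sum_i ||row_i(X)||_2 (row-wise group soft-thresholding)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return scale * X

def l21_ista(A, Y, lam=0.1, n_iter=500):
    """Proximal gradient for the l2,1-regularized data-fit objective above."""
    X = np.zeros((A.shape[1], Y.shape[1]))
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ X - Y)                  # gradient of the smooth data-fit term
        X = row_soft_threshold(X - step * grad, step * lam)
    return X

# Rows of the returned X with non-negligible l2 norm give the estimated joint support.
```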
“…Similar to the SMV case, heuristics for the MMV-based SSR problem include convex relaxation by means of mixed-norm minimization [20]-[23], and greedy methods [24], [25]. Recovery guarantees for the MMV case have been established in [26]-[28], and it has been shown that rank awareness in MMV-based SSR can further enhance the recovery performance as compared to the SMV case [29]. An extension to the infinite-dimensional vector space for MMV-based SSR, using atomic norm minimization, has been proposed in [30]-[32].…”
Section: Introduction (mentioning)
confidence: 99%
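
The rank awareness referred to in the excerpt can be made concrete by changing the greedy selection rule to score atoms against an orthonormal basis of the residual's column space rather than against the residual itself, in the spirit of rank-aware OMP. The sketch below is an illustrative variant of that idea under these assumptions, not the exact algorithm of the cited reference.

```python
# Sketch of a rank-aware greedy selection loop: atoms are scored against an
# orthonormal basis of the residual's range, which exploits rank(Y).
# Illustrative variant only; tolerances and stopping rule are arbitrary choices.
import numpy as np

def rank_aware_omp(A, Y, k, tol=1e-10):
    """Greedy selection scored against an orthonormal basis of the residual's range."""
    support = []
    R = Y.copy()
    for _ in range(k):
        if np.linalg.norm(R) < tol:               # residual already explained
            break
        # Orthonormal basis of the residual's column space (drop near-null directions).
        U, s, _ = np.linalg.svd(R, full_matrices=False)
        U = U[:, s > tol * s[0]]
        scores = np.linalg.norm(A.T @ U, axis=1)  # correlation with the whole subspace
        scores[support] = -np.inf
        support.append(int(np.argmax(scores)))
        # Refit on the current support and update the residual.
        As = A[:, support]
        X_s, *_ = np.linalg.lstsq(As, Y, rcond=None)
        R = Y - As @ X_s
    return sorted(support)
```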
“…In particular, [22] showed a lower bound on sample complexity of support recovery of roughly (k/m), much weaker than our (k/m)² lower bound. Another related line of works [27], [17] studies this problem considering the same measurement matrix for all samples, under the assumption that the data vectors are deterministic. In [17], the authors connect the support recovery problem to communication over a single-input multiple-output MAC channel. However, the performance guarantees are asymptotic in nature (d → ∞ and k, n fixed).…”
Section: Introduction (mentioning)
confidence: 99%