2016
DOI: 10.1007/978-3-319-46128-1_48

Stochastic CoSaMP: Randomizing Greedy Pursuit for Sparse Signal Recovery

Cited by 8 publications (3 citation statements)
References 15 publications
“…23 Our approach will be in the spirit of random subdictionary selection.24 There have been approaches that subsample dictionaries over rows and columns in order to speed up the convergence of greedy methods,25,26 but using such subsampling methods for coherent dictionaries is still unexplored. The authors of those papers named one of these methods StoCoSaMP (Stochastic CoSaMP).…”
Section: The Recovery of a Signal's Support in a Highly Coherent Dictionary
confidence: 99%
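The excerpt describes scoring only a random subset of dictionary columns at each greedy step. The sketch below illustrates that column-subsampling idea inside a plain orthogonal-matching-pursuit loop; it is not the StoCoSaMP algorithm from the cited paper, whose details differ, and the function name and `frac` parameter are illustrative assumptions.

```python
import numpy as np

def subsampled_omp(Phi, u, s, frac=0.5, rng=None):
    """Illustrative sketch of random subdictionary selection: each greedy
    step correlates the residual against only a random fraction of the
    dictionary columns. NOT the published StoCoSaMP algorithm."""
    rng = np.random.default_rng(rng)
    n, p = Phi.shape
    residual = u.copy()
    support = []
    c = np.zeros(p)
    for _ in range(s):
        # Draw a random subdictionary (subset of column indices).
        cols = rng.choice(p, size=max(1, int(frac * p)), replace=False)
        # Pick the best-correlated atom within the subset only.
        k = int(cols[np.argmax(np.abs(Phi[:, cols].T @ residual))])
        if k not in support:
            support.append(k)
        # Least-squares fit on the current support, then update residual.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], u, rcond=None)
        residual = u - Phi[:, support] @ coeffs
    c[support] = coeffs
    return c
```

With `frac=1.0` every column is scored, so the sketch reduces to ordinary OMP; smaller `frac` trades per-iteration cost against the chance of missing the right atom in a given step.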
“…where ‖c‖₀ = #{k : c_k ≠ 0} measures the sparsity of c. In (6), δ is a tolerance for the solution inaccuracy due to truncation of the expansion. While problem (6) is NP-hard to solve, approximate solutions may be obtained in polynomial time using a variety of greedy algorithms, including orthogonal matching pursuit (OMP) [29,30,31,32], compressive sampling matching pursuit (CoSaMP) [33,34], and subspace pursuit (SP) [35], or by convex relaxation via ℓ₁-minimization [5,6]. The key advantage of an approximation via compressed sensing is that, if the QoI is approximately sparse, stable and convergent approximations of c can be obtained using N < P random samples of u(Ξ), as long as Φ satisfies certain conditions [5,6,10,36,21,27].…”
Section: Introduction
confidence: 99%
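The excerpt names OMP as one of the greedy, polynomial-time stand-ins for the NP-hard ℓ₀ problem. A minimal sketch of the standard OMP loop, assuming a dictionary `Phi` with unit-norm columns and an s-sparse target (function name and parameters are illustrative):

```python
import numpy as np

def omp(Phi, u, s, tol=1e-10):
    """Minimal orthogonal matching pursuit sketch: greedily select the
    column most correlated with the residual, refit by least squares."""
    n, p = Phi.shape
    residual = u.copy()
    support = []
    c = np.zeros(p)
    for _ in range(s):
        # Greedy selection: column most correlated with the residual.
        k = int(np.argmax(np.abs(Phi.T @ residual)))
        if k not in support:
            support.append(k)
        # Orthogonal projection: least squares on the chosen support.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], u, rcond=None)
        residual = u - Phi[:, support] @ coeffs
        if np.linalg.norm(residual) < tol:
            break
    c[support] = coeffs
    return c
```

Each iteration costs one matrix-vector product plus a small least-squares solve, which is what makes the greedy route polynomial-time where exact ℓ₀ minimization is not.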
“…where ‖c‖₀ = #{k : c_k ≠ 0} indicates the sparsity of c. In (12), δ is a tolerance for the solution inaccuracy due to truncation of the expansion. While problem (12) is NP-hard to solve, approximate solutions may be obtained in polynomial time using a variety of greedy algorithms, including orthogonal matching pursuit (OMP) (Tropp and Gilbert 2005; Tropp and Gilbert 2007; Needell and Vershynin 2010; Davenport and Wakin 2010), compressive sampling matching pursuit (CoSaMP) (Needell and Tropp 2009; Pal and Mengshoel 2016), and subspace pursuit (SP) (Dai and Milenkovic 2009). A convex relaxation of (12) can also be solved via ℓ₁-minimization (Candès and Wakin 2008; Donoho 2006).…”
confidence: 99%
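The excerpt cites CoSaMP (Needell and Tropp 2009) alongside OMP. A sketch of the characteristic CoSaMP iteration, assuming unit-norm columns and a known sparsity level s (names and stopping rule are illustrative): form a signal proxy, merge the 2s strongest indices with the current support, solve least squares, then prune back to s entries.

```python
import numpy as np

def cosamp(Phi, u, s, n_iter=20):
    """Sketch of a CoSaMP-style iteration: identify, merge, estimate,
    prune, update residual. Illustrative only."""
    n, p = Phi.shape
    c = np.zeros(p)
    residual = u.copy()
    for _ in range(n_iter):
        proxy = Phi.T @ residual
        omega = np.argsort(np.abs(proxy))[-2 * s:]          # identify 2s indices
        T = np.union1d(omega, np.flatnonzero(c))            # merge with support
        b, *_ = np.linalg.lstsq(Phi[:, T], u, rcond=None)   # estimate on T
        c[:] = 0.0
        keep = np.argsort(np.abs(b))[-s:]                   # prune to s largest
        c[T[keep]] = b[keep]
        residual = u - Phi @ c                              # update residual
        if np.linalg.norm(residual) < 1e-10 * np.linalg.norm(u):
            break
    return c
```

Unlike OMP, which commits to one atom per iteration, this merge-then-prune step can discard earlier wrong choices, which is the feature the stochastic variants build on.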