2017
DOI: 10.1016/j.acha.2016.10.001
On the number of iterations for convergence of CoSaMP and Subspace Pursuit algorithms

Abstract: In compressive sensing, one important parameter that characterizes the various greedy recovery algorithms is the iteration bound, which provides the maximum number of iterations by which the algorithm is guaranteed to converge. In this letter, we present a new iteration bound for CoSaMP by certain mathematical manipulations, including formulation of appropriate sufficient conditions that ensure passage of a chosen support through the two selection stages of CoSaMP, "Augment" and "Update". Subsequently, we extend…

Cited by 14 publications (10 citation statements). References 11 publications.
“…In Figure 2, we compared the reconstruction percentage of the proposed method for different step sizes, sparsities, and isometry constants. We set the step-size set to s ∈ {1, 5, 10, 15} and the sparsity range to K ∈ [10, 100], respectively. The isometry-constant parameter set was δ_K ∈ {0.1, 0.2, 0.3, 0.4, 0.5, 0.6}.…”
Section: Discussion
confidence: 99%
“…The computational complexity of these algorithms is significantly lower than that of the convex optimization methods; however, they require more measurements for exact recovery and exhibit poorer reconstruction performance in noisy environments. To date, subspace pursuit (SP) [14] and compressive sampling matching pursuit (CoSaMP) [15,16] algorithms have been proposed by incorporating a backtracking strategy. These algorithms offer strong theoretical guarantees and provide robustness to noise.…”
Section: Introduction
confidence: 99%
“…(1) orthogonal matching pursuit (OMP) [17] and OMP-like algorithms, such as compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP) [18], regularized orthogonal matching pursuit (ROMP) [19], generalized orthogonal matching pursuit (GOMP) [20], sparsity adaptive matching pursuit (SAMP) [21], stabilized orthogonal matching pursuit (SOMP) [22], perturbed block orthogonal matching pursuit (PBOMP) [23] and forward backward pursuit (FBP) [24]. A common feature of these algorithms is that in each iteration, a support set approximating the correct support set is determined from the correlation values between the measurement vector y and the columns of A.…”
Section: Literature Review
confidence: 99%
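The correlation-based selection step that this citation statement describes is common to the whole OMP family. As a hedged illustration (a minimal sketch, not taken from any of the cited papers), an OMP iteration in Python might look like the following, assuming a measurement matrix A with unit-norm columns and a known sparsity level K:

```python
import numpy as np

def omp(A, y, K, tol=1e-8):
    """Orthogonal Matching Pursuit (illustrative sketch).

    Each iteration adds to the support the column of A most correlated
    with the current residual, then re-estimates the signal by least
    squares on that support.
    """
    m, n = A.shape
    support = []
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(K):
        # Correlation of the residual with every column of A
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0  # never pick the same column twice
        support.append(int(np.argmax(correlations)))
        # Least-squares fit restricted to the current support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
        if np.linalg.norm(residual) < tol:
            break
    return x, sorted(support)
```

Unlike the backtracking algorithms discussed below, once an atom enters the support here it is never removed.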
“…Once an atom is selected, it is not deleted until the iterations end. The other class comprises the compressive sampling matching pursuit (CoSaMP) algorithm [24,25] and the subspace pursuit (SP) algorithm [26,27], which, after selecting the matched atoms, add a backtracking step that deletes unstable atoms to better guarantee the quality of the reconstructed signal.…”
Section: Introduction
confidence: 99%
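The backtracking strategy described above, which corresponds to the "Augment" and "Update" stages analyzed in the abstract, can be sketched as follows. This is a minimal, illustrative CoSaMP loop under the same assumptions as before (unit-norm columns, known sparsity K), not the exact variant whose iteration bound the paper derives: the Augment step merges the 2K strongest correlations into the support, and the Update step prunes back to the best K entries.

```python
import numpy as np

def cosamp(A, y, K, max_iter=30, tol=1e-8):
    """CoSaMP (illustrative sketch).

    Augment: merge the indices of the 2K columns most correlated with
    the residual into the current support.  Update: solve least squares
    on the merged support, then backtrack by keeping only the K largest
    entries, discarding unstable atoms.
    """
    m, n = A.shape
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(max_iter):
        # Augment: 2K strongest correlations, merged with current support
        proxy = np.abs(A.T @ residual)
        omega = np.argsort(proxy)[-2 * K:]
        merged = np.union1d(omega, np.flatnonzero(x)).astype(int)
        # Least-squares estimate on the merged support
        coef, *_ = np.linalg.lstsq(A[:, merged], y, rcond=None)
        b = np.zeros(n)
        b[merged] = coef
        # Update (backtracking): prune to the K largest entries
        x = np.zeros(n)
        top_k = np.argsort(np.abs(b))[-K:]
        x[top_k] = b[top_k]
        residual = y - A @ x
        if np.linalg.norm(residual) < tol:
            break
    return x
```

The pruning step is what distinguishes this family from plain OMP: a wrongly selected atom can be discarded in a later iteration rather than persisting to the end.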