2008
DOI: 10.1007/s00041-008-9035-z

Iterative Thresholding for Sparse Approximations

Abstract: Sparse signal expansions represent or approximate a signal using a small number of elements from a large collection of elementary waveforms. Finding the optimal sparse expansion is known to be NP-hard in general, and non-optimal strategies such as Matching Pursuit, Orthogonal Matching Pursuit, Basis Pursuit and Basis Pursuit De-noising are often called upon. These methods show good performance in practical situations; however, they do not operate on the ℓ0 penalised cost functions that are often at the…
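The iterative thresholding scheme the title refers to alternates a gradient step on the residual with a hard-thresholding step that enforces sparsity. The following is a minimal sketch of that idea in pure Python; the names (`A`, `y`, `s`, `step`) and the fixed-sparsity variant shown are illustrative, not taken verbatim from the paper.

```python
# Minimal sketch of Iterative Hard Thresholding for y ≈ A x with an
# s-sparse x. A is a list of rows; step must be small enough for the
# iteration to be stable (e.g. when the operator norm of A is below 1).

def matvec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def matvec_t(A, r):
    # A^T r, computed column by column.
    n = len(A[0])
    return [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries, zero out the rest.
    keep = sorted(range(len(x)), key=lambda j: abs(x[j]), reverse=True)[:s]
    return [x[j] if j in keep else 0.0 for j in range(len(x))]

def iht(A, y, s, step=1.0, iters=100):
    x = [0.0] * len(A[0])
    for _ in range(iters):
        r = [yi - zi for yi, zi in zip(y, matvec(A, x))]  # residual y - Ax
        g = matvec_t(A, r)                                # gradient direction
        x = hard_threshold([xi + step * gi for xi, gi in zip(x, g)], s)
    return x
```

With orthonormal rows in `A` the iteration recovers the sparse coefficients in one step; in general, convergence depends on the step size and on properties of `A` such as those discussed in the citation statements below.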

Cited by 1,112 publications (990 citation statements)
References 32 publications
“…As mentioned in the introduction, these results offer a complementary view to the theoretical developments of [11,12]. They also provide at the same time a very general convergence result which can be immediately generalized to compressive sensing problems involving semialgebraic or real-analytic nonlinear measurements.…”
Section: Examples
confidence: 71%
“…The convergence results the authors obtained involve different assumptions on the linear operator A: they either assume that ‖A‖ < 1 [11, Theorem 3] or that A satisfies the restricted isometry property [12, Theorem 4]. Our results show that convergence actually occurs for any linear map so long as the sequence (x_k)_{k∈N} is bounded.…”
Section: Introduction
confidence: 86%
“…The sparse approximation step consists in estimating S with a fixed Â (Â being the current estimate of A) and is implemented via Iterative Hard Thresholding (IHT) [9], where each of the constraints of (4) is imposed at each iteration just after the gradient descent step. The dictionary update step consists in estimating A with a fixed Ŝ (Ŝ being the estimate of S obtained from the previous step), and is obtained by right-multiplying Y by the pseudo-inverse of ŜΦ^T and then setting the non-positive values to zero so as to impose the non-negativity constraint on A.…”
Section: BSS-IHT Algorithm
confidence: 99%
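The dictionary-update step quoted above can be sketched as a least-squares fit followed by projection onto the non-negative orthant. This is a hedged illustration, not the cited implementation: for simplicity the matrix B = ŜΦ^T is assumed to be 2×2 and invertible here, whereas the general case requires a true pseudo-inverse; all names are illustrative.

```python
# Sketch of the BSS-IHT dictionary update: A = max(0, Y B^+), with B
# assumed square and invertible so that B^+ reduces to B^{-1}.

def inv2x2(B):
    (a, b), (c, d) = B
    det = a * d - b * c          # assumed nonzero (B invertible)
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def dictionary_update(Y, B):
    A = matmul(Y, inv2x2(B))     # least-squares fit Y ≈ A B
    # Project onto the non-negative orthant, as the constraint requires.
    return [[max(v, 0.0) for v in row] for row in A]
```

In the general rectangular case one would replace `inv2x2` with a Moore–Penrose pseudo-inverse (e.g. via an SVD); the clipping step is unchanged.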