2016
DOI: 10.1016/j.sigpro.2016.02.008

Cooperative greedy pursuit strategies for sparse signal representation by partitioning

Abstract: Cooperative Greedy Pursuit Strategies are considered for approximating a signal partition subjected to a global constraint on sparsity. The approach aims at producing a high quality sparse approximation of the whole signal, using highly coherent redundant dictionaries. The cooperation takes place by ranking the partition units for their sequential stepwise approximation, and is realized by means of i) forward steps for the upgrading of an approximation and/or ii) backward steps for the corresponding downgrading…
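The forward step named in the abstract is the basic move of any greedy pursuit: select the dictionary atom most correlated with the current residual and subtract its contribution. The following is a minimal sketch of plain matching pursuit under that reading; the function name and the simple update rule are illustrative assumptions, not the paper's cooperative algorithm.

```python
import numpy as np

def matching_pursuit(f, D, k):
    """Greedily approximate signal f with k atoms from dictionary D
    (columns assumed unit-norm). Returns (approximation, residual)."""
    residual = f.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        inner = D.T @ residual            # inner products with every atom
        j = int(np.argmax(np.abs(inner))) # forward step: pick the best atom
        coeffs[j] += inner[j]
        residual -= inner[j] * D[:, j]    # downgrade the residual
    return D @ coeffs, residual
```

With an orthonormal dictionary each forward step removes one component exactly; with the highly coherent dictionaries the paper targets, atoms overlap and the selection order matters, which is what motivates the cooperative ranking discussed below.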

Cited by 16 publications (29 citation statements) · References 32 publications
“…This entails that, in addition to selecting the dictionary atoms for the approximation of each block, the blocks are ranked for their sequential stepwise approximation. As a consequence, the approach is optimized in the sense of minimizing, at each iteration step, the norm of the total residual error f − f K [25]. As will be illustrated in Sec.…”
Section: The Methods (mentioning)
confidence: 99%
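The ranking described in the quote above can be pictured as follows: at every iteration, each block proposes its single best atom, and the block offering the largest reduction of the total residual norm is served first. This is a hedged simplification of the cooperative strategy; the block layout and shared dictionary are illustrative assumptions.

```python
import numpy as np

def cooperative_step(residuals, D):
    """One cooperative iteration: rank blocks by their best attainable
    correlation, then apply a forward step in the winning block.
    residuals: list of per-block residual vectors; D: unit-norm dictionary."""
    best = max(range(len(residuals)),
               key=lambda q: np.max(np.abs(D.T @ residuals[q])))
    inner = D.T @ residuals[best]
    j = int(np.argmax(np.abs(inner)))
    residuals[best] = residuals[best] - inner[j] * D[:, j]  # update in place
    return best, j
```

Because the winning block is the one whose best atom has the largest correlation, each step greedily maximizes the decrease of the norm of the total residual, matching the optimization criterion cited above.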
“…From a computational viewpoint the particularity of the sub-dictionaries D_C and D_S is that the inner products with all their elements can be evaluated via FFT. This possibility reduces the complexity of the numerical calculations when the partition unit N_b is large [25,26]. Also, the inner products with the atoms of the dictionaries D_{P2} and D_{P3} can be effectively implemented, all at once, via a convolution operation.…”
Section: The Dictionary (mentioning)
confidence: 99%
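The FFT shortcut mentioned in the quote rests on a standard identity: the inner products of a signal with every atom of a cosine sub-dictionary are the real parts of its FFT, so one transform replaces a full matrix-vector product. A small sketch (atom normalization omitted; the dictionary here is a generic cosine family standing in for D_C):

```python
import numpy as np

N = 64
rng = np.random.default_rng(0)
f = rng.standard_normal(N)
n = np.arange(N)

# Direct inner products of f with every cosine atom cos(2*pi*k*n/N)
direct = np.array([f @ np.cos(2 * np.pi * k * n / N)
                   for k in range(N // 2 + 1)])

# The same values, all at once, from a single real FFT
via_fft = np.fft.rfft(f).real

assert np.allclose(direct, via_fft)
```

The direct loop costs O(N^2) operations per block, while the FFT route costs O(N log N), which is why the saving grows with the partition unit size N_b.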
“…The optimized way of downgrading an approximation in an HBW manner is termed HBW Backwards Optimized Orthogonal Matching Pursuit [21]. For large images, this approach is demanding in terms of storage. A method with less memory requirements, though not optimized, is termed HBW Backwards Self-Projected Matching Pursuit (HBW-BSPMP).…”
Section: HBW Pruning (mentioning)
confidence: 99%
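The backward (pruning) step referenced above can be sketched generically: from the current set of selected atoms, remove the one whose removal least increases the approximation error, refitting the remaining atoms by least squares. This is plain exhaustive pruning, not the HBW-optimized or self-projected variants named in the quote.

```python
import numpy as np

def backward_step(f, D, support):
    """Drop one index from `support` so that the least-squares residual
    over the remaining atoms is as small as possible."""
    best_err, best_sup = np.inf, None
    for i in range(len(support)):
        sup = support[:i] + support[i + 1:]
        A = D[:, sup]                                  # remaining atoms
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)   # refit without atom i
        err = np.linalg.norm(f - A @ coef)
        if err < best_err:
            best_err, best_sup = err, sup
    return best_sup, best_err
```

Refitting from scratch for every candidate is what makes naive pruning expensive; the storage-versus-optimality trade-off between the two HBW variants in the quote arises from how much of this recomputation is cached.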
“…Therefore, the use of proper signal processing techniques enables obtaining a sparse approximation of the signal, reducing the size of the data to be stored. The sparse representation must accurately reproduce all the important information in the signal with a smaller dimension than the original one …”
Section: Introduction (mentioning)
confidence: 99%
“…The sparse representation must accurately reproduce all the important information in the signal with a smaller dimension than the original one [7]. Thus, lossy techniques are applied to achieve higher CRs, in which useless information is discarded. The most common techniques used to obtain a sparse representation are orthogonal transforms, such as the Fourier transform, wavelet transform (WT), wavelet packet transform, and cosine transform, followed by thresholding in order to reduce the signal dimension.…”
Section: Introduction (mentioning)
confidence: 99%
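The transform-plus-thresholding route described in the quote can be sketched in a few lines: transform the signal, zero all but the largest-magnitude coefficients, and invert. The FFT stands in here for the orthogonal transforms listed; the function name and the hard-thresholding rule are illustrative assumptions.

```python
import numpy as np

def sparsify(signal, keep):
    """Keep only the `keep` largest-magnitude transform coefficients
    (hard thresholding in the Fourier domain) and reconstruct."""
    c = np.fft.fft(signal)
    idx = np.argsort(np.abs(c))[:-keep]  # indices of all but the largest
    c[idx] = 0.0                         # discard: lossy compression step
    return np.fft.ifft(c).real

# A signal that is exactly sparse in the Fourier basis: two sinusoids
t = np.linspace(0, 1, 128, endpoint=False)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)
xr = sparsify(x, keep=4)  # 4 nonzero coefficients out of 128
```

For this synthetic example the reconstruction is exact, since the signal truly occupies four Fourier coefficients; for real signals the discarded coefficients carry energy, which is the lossy trade-off behind higher compression ratios (CRs).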