2013
DOI: 10.48550/arxiv.1309.5749
Preprint

Adaptive Variable Step Algorithm for Missing Samples Recovery in Sparse Signals

Cited by 2 publications (8 citation statements)
References 0 publications

Citation statements (ordered by relevance):
“…This minimization problem, under the conditions defined within the restricted isometry property (RIP) [3], [4], can produce the same result as (2). Note that other norms ℓp, between the ℓ0-norm and the ℓ1-norm, with values 0 < p < 1, are also used in the minimization in attempts to combine good properties of these two norms [1, 26, 35].…”
Section: Gradient-based Reconstruction
Citation type: mentioning
Confidence: 99%
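
For reference, a minimal restatement of the minimization problems this statement refers to, in standard compressive-sensing notation; the measurement matrix A, the observation vector y, and the sparse vector x are assumed names here, since equation (2) of the citing paper is not reproduced on this page:

    \min_{x} \|x\|_{0} \quad \text{subject to} \quad y = A x
    \min_{x} \|x\|_{1} \quad \text{subject to} \quad y = A x
    \min_{x} \|x\|_{p}^{p}, \quad \|x\|_{p}^{p} = \sum_{i} |x_{i}|^{p}, \quad 0 < p < 1, \quad \text{subject to} \quad y = A x

Under RIP-type conditions on A, the convex ℓ1 problem yields the same solution as the combinatorial ℓ0 problem, while the non-convex ℓp problems with 0 < p < 1 sit between the two.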
“…Some signals that cover the whole considered interval in one domain can be sparse in a transformation domain. Compressive sensing theory, in general, deals with a lower-dimensional set of linear observations of a sparse signal in order to recover all signal values [1]-[26]. This area has developed intensively over the last decade.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
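
A minimal numerical sketch of that setting, with the sparsity domain taken to be the DFT and the observations taken to be a random subset of time samples. The signal model, parameter values, and variable names below are illustrative assumptions, not something specified in the cited papers:

    import numpy as np

    rng = np.random.default_rng(0)
    N, K, M = 128, 3, 96   # signal length, number of tones, available samples

    # A signal that covers the whole time interval but is sparse in the DFT
    # domain (2*K nonzero coefficients, by conjugate symmetry).
    n = np.arange(N)
    freqs = rng.choice(N // 2 - 1, size=K, replace=False) + 1
    amps = rng.uniform(1.0, 2.0, size=K)
    x = sum(a * np.cos(2 * np.pi * f * n / N) for a, f in zip(amps, freqs))

    # Lower-dimensional set of linear observations: M randomly chosen samples.
    available = np.sort(rng.choice(N, size=M, replace=False))
    y = x[available]
    print(f"observed {M} of {N} samples; {2 * K} nonzero DFT coefficients")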
“…A large number of algorithms are based on approximate solutions using thresholding or greedy methods [8]-[14]. In this paper, we have analyzed the performance of the iterative algorithm for the reconstruction of sparse signals, based on gradient calculation and signal concentration as a measure of sparsity [6], [7]. When the iterations approach the optimal point, the gradient value oscillates around the true value.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
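
A rough sketch of a gradient-based reconstruction of this kind, continuing the toy setup above (it reuses x, y, available, and N from that block): the missing samples are treated as free variables, the ℓ1 norm of the DFT serves as the concentration (sparsity) measure, the gradient is estimated by finite differences, and the step is reduced whenever successive gradient directions reverse, i.e. when the estimate starts to oscillate around the optimum. The parameter values, the oscillation test, and the function names are assumptions for illustration, not the exact algorithm of the paper:

    def concentration(x):
        # Sparsity (concentration) measure: l1 norm of the DFT coefficients.
        return np.sum(np.abs(np.fft.fft(x)))

    def reconstruct(y, available, N, n_iter=400, shrink=2.0):
        # Recover missing samples by gradient descent on the concentration measure.
        x_hat = np.zeros(N)
        x_hat[available] = y
        missing = np.setdiff1d(np.arange(N), available)
        delta = np.max(np.abs(y))                 # initial (large) step
        prev_grad = None
        for _ in range(n_iter):
            grad = np.zeros(N)
            for k in missing:
                # Finite-difference estimate of the gradient at one missing sample.
                xp, xm = x_hat.copy(), x_hat.copy()
                xp[k] += delta
                xm[k] -= delta
                grad[k] = (concentration(xp) - concentration(xm)) / (2 * delta)
            x_hat[missing] -= (delta / N) * grad[missing]
            # Near the optimum the gradient direction starts to flip back and
            # forth; reduce the step when that happens.
            if prev_grad is not None and np.dot(grad, prev_grad) < 0:
                delta /= shrink
            prev_grad = grad
        return x_hat

    x_rec = reconstruct(y, available, N)
    print("max reconstruction error:", np.max(np.abs(x_rec - x)))

Halving the step only when the gradient direction reverses keeps the early iterations fast while letting the later ones settle, which is the variable-step idea these excerpts describe.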