2017
DOI: 10.3390/e19120667
L1-Minimization Algorithm for Bayesian Online Compressed Sensing

Abstract: In this work, we propose a Bayesian online reconstruction algorithm for sparse signals based on Compressed Sensing and inspired by L1-regularization schemes. A previous work has introduced a mean-field approximation for the Bayesian online algorithm and has shown that it is possible to saturate the offline performance in the presence of Gaussian measurement noise when the signal-generating distribution is known. Here, we build on these results and show that reconstruction is possible even if prior knowledge…
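As context for the L1-regularization idea the paper builds on (an L1 penalty is the MAP counterpart of a Laplace prior on the signal), the sketch below shows plain offline L1-regularized recovery via iterative soft thresholding (ISTA) on synthetic Gaussian measurements. This is a minimal illustration, not the paper's Bayesian online algorithm; the function names, parameter values, and problem sizes are assumptions made for the example.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator, the proximal map of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """L1-regularized least squares (LASSO) via ISTA:
    minimize 0.5*||A x - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)             # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Synthetic compressed-sensing instance: sparse x0, Gaussian measurement matrix.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 10                        # signal length, measurements, sparsity
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x0 + 0.01 * rng.standard_normal(m)   # noisy measurements
x_hat = ista(A, y, lam=0.02)
print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```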

Cited by 5 publications (6 citation statements)
References 34 publications (52 reference statements)
“…L1-minimization Algorithm for Bayesian Online Compressed Sensing [3] proposed a Bayesian online reconstruction algorithm for sparse signals based on Compressed Sensing and inspired by L1-regularization schemes. It showed that, even when prior knowledge about the signal’s generation is limited, reconstruction is possible by the introduction of a Laplace prior and an extra Kullback–Leibler divergence minimization step for hyper-parameter learning.…”
Section: Entropy Special Issue and Conference Proceedings
confidence: 99%
“…The second one is known as relaxation, which replaces the zero-norm function with a smoother one and uses an optimization technique to find the solution [10]. Algorithms such as orthogonal matching pursuit (OMP), smoothed l0 (SL0), and l1-minimization [11]–[13] are some examples of the proposed algorithms for finding the sparse solution.…”
Section: Introduction
confidence: 99%
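For orientation, the sketch below is a generic, textbook-style implementation of orthogonal matching pursuit (OMP), one of the greedy alternatives to l1-minimization named in the statement above. It assumes a known sparsity level k and is not the specific method of [11]–[13]; all names and parameters are illustrative.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily select k columns of A,
    refitting by least squares on the selected support at each step."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the current support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Usage on the same kind of synthetic instance as above:
# x_hat = omp(A, y, k=10)
```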
“…They achieve better performance than the method in [10] with the same number of nonzero taps. Furthermore, in pursuit of both sparsity promotion and improved accuracy, sparse signal reconstruction algorithms inspired by l1 and weighted l1 regularization schemes are proposed in [17,18]; the joint smoothed l0 norm algorithm for direction-of-arrival estimation in MIMO radar is proposed in [19].…”
Section: Introduction
confidence: 99%
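To make the "weighted l1" idea concrete, here is a minimal sketch of iteratively reweighted l1 minimization in the spirit of the Candès–Wakin–Boyd reweighting heuristic, solving each weighted LASSO subproblem with proximal gradient steps. It is an assumption about how such a scheme can look, not the estimators proposed in [17,18].

```python
import numpy as np

def reweighted_l1(A, y, lam=0.02, eps=1e-3, n_outer=5, n_inner=200):
    """Iteratively reweighted L1: solve a sequence of weighted LASSO problems,
    with weights w_i = 1 / (|x_i| + eps) taken from the previous iterate."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2           # step-size constant for the gradient step
    x = np.zeros(n)
    w = np.ones(n)
    for _ in range(n_outer):
        for _ in range(n_inner):
            grad = A.T @ (A @ x - y)
            z = x - grad / L
            # Coordinate-wise soft thresholding with per-coordinate weights.
            x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)
        w = 1.0 / (np.abs(x) + eps)         # down-weight large entries, promote sparsity
    return x
```

The reweighting step makes the penalty act more like an l0 count than a plain l1 norm, which is why such schemes are often reported to recover sparser solutions for the same measurement budget.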