2012
DOI: 10.1109/tsp.2012.2192436

Boltzmann Machine and Mean-Field Approximation for Structured Sparse Decompositions

Authors: Angélique Drémeau, Cédric Herzet, Laurent Daudet. Accepted to IEEE Trans. on Signal Processing, 2012.

Abstract: Taking advantage of the structures inherent in many sparse decompositions constitutes a promising research axis. In this paper, we address this problem from a Bayesian point of view. We exploit a Boltzmann machine, which allows a large variety of structures to be taken into account, and foc… [truncated]

Cited by 70 publications (73 citation statements); references 57 publications.
“…This model recently appeared in dictionary-based processing setups. Drémeau et al. [15] show that it generalizes many structured sparsity models. Under this model, we can evaluate the probability of a state using the difference of energy for atom i:…”
Section: III-A Structured Sparsity Model
confidence: 99%
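The energy-difference rule quoted above can be made concrete with a minimal sketch. Assuming the common parametrization E(s) = -b·s - ½ sᵀWs over a binary support vector s ∈ {0, 1}ⁿ (with W symmetric and zero diagonal), the conditional probability that atom i is active reduces to a sigmoid of the energy gap between its two states. The function name and variable shapes here are illustrative, not taken from the paper:

```python
import numpy as np

def p_active(i, s, b, W):
    # Conditional probability p(s_i = 1 | s_{-i}) under a Boltzmann machine
    # prior E(s) = -b.s - 0.5 * s'Ws over a binary support s in {0, 1}^n
    # (W symmetric, zero diagonal). It depends only on the energy difference
    #   Delta_i = E(s with s_i = 0) - E(s with s_i = 1)
    #           = b_i + sum_{j != i} W_ij * s_j,
    # since p(s_i = 1) / p(s_i = 0) = exp(Delta_i).
    delta = b[i] + W[i] @ s - W[i, i] * s[i]  # drop any self-coupling term
    return 1.0 / (1.0 + np.exp(-delta))
```

With W = 0 (no coupling) this collapses to an independent Bernoulli prior with activation probability σ(b_i); nonzero off-diagonal entries of W are what encode the structured dependencies between atoms.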
“…Future work will investigate smarter optimization schemes, such as Bayesian versions of MP (e.g. as in [15] with Boltzmann machines) or convex relaxation methods.…”
Section: IV-B Robustness and Recognition Performances
confidence: 99%
“…Subsequently, in the reconstruction of a new signal of the same type, the generative model can serve as a Bayesian prior. In particular, the idea of exploiting RBMs in CS applications was pioneered by [DHD12] and [TDK16], who trained binary RBMs using Contrastive Divergence (to locate the support of the non-zero entries of sparse signals) and combined them with an AMP reconstruction. They demonstrated drastic improvements in reconstruction with structured learned priors compared to the usual unstructured sparse priors.…”
Section: Structured Bayesian Priors
confidence: 99%
“…• The atom selection step does not take into account the new update step (13). This means that there is no guarantee that the selected atom will lead to the steepest descent of the residual error, as it does for OMP.…”
Section: Analysis of CAMP
confidence: 99%
“…"Spike-and-slab" prior models have been introduced [9][10][11], but they usually assume that the coefficients are independent and identically distributed, so they do not make use of the statistical dependencies between coefficients. Graphical models have been proposed to model these dependencies [12][13][14]. However, this approach involves the design of specialised, and often computationally expensive, optimization methods.…”
Section: Introduction
confidence: 99%
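The independence limitation noted in this last excerpt is easy to see in a sketch of the standard i.i.d. spike-and-slab generative model: each coefficient is exactly zero (spike) with probability 1 − p, or Gaussian (slab) otherwise, with no coupling between coefficients. The function below is an illustrative assumption, not code from any of the cited works:

```python
import numpy as np

def sample_spike_and_slab(n, p=0.1, sigma=1.0, rng=None):
    # i.i.d. spike-and-slab draw: each coefficient is exactly zero with
    # probability 1 - p (the "spike") and N(0, sigma^2) with probability p
    # (the "slab"). Independence across coefficients is precisely the
    # limitation the passage above points out: no statistical dependencies
    # between coefficients are modelled, unlike Boltzmann-machine priors.
    rng = np.random.default_rng(0) if rng is None else rng
    support = rng.random(n) < p          # Bernoulli(p) support, elementwise
    slab = rng.normal(0.0, sigma, size=n)
    return np.where(support, slab, 0.0)
```

A Boltzmann-machine prior replaces the independent Bernoulli support with a joint distribution over the whole support vector, which is exactly the generalization the paper under review exploits.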