2011
DOI: 10.1162/neco_a_00158

Quickly Generating Representative Samples from an RBM-Derived Process

Abstract: Two recently proposed learning algorithms, herding and fast persistent contrastive divergence (FPCD), share the following interesting characteristic: they exploit changes in the model parameters while sampling in order to escape modes and mix better during the sampling process that is part of the learning algorithm. We justify such approaches as ways to escape modes while keeping approximately the same asymptotic distribution of the Markov chain. In that spirit, we extend FPCD using an idea borrowed from herding …
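The abstract's core idea, perturbing the parameters during sampling so the chain escapes modes, can be illustrated with a small numerical sketch. The Python snippet below is not the paper's algorithm; it is a minimal, hypothetical illustration of an FPCD-style Gibbs chain in which "fast" weights are nudged away from recently visited configurations and then decayed toward zero. All names, shapes, and hyperparameters (fast_lr, fast_decay) are assumptions.

# Hedged sketch (not the paper's code) of an FPCD-style sampling chain:
# run Gibbs sampling on "regular + fast" parameters, push the fast parameters
# away from the configuration just visited, and decay them toward zero so the
# chain mixes better while staying close to the original distribution.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 20, 10

W = 0.1 * rng.standard_normal((n_visible, n_hidden))   # regular RBM weights
b_v = np.zeros(n_visible)                               # visible biases
b_h = np.zeros(n_hidden)                                # hidden biases
W_fast = np.zeros_like(W)                               # fast (temporary) weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v, W_eff):
    """One Gibbs sweep v -> h -> v using the effective weights."""
    p_h = sigmoid(v @ W_eff + b_h)
    h = (rng.random(n_hidden) < p_h).astype(float)
    p_v = sigmoid(h @ W_eff.T + b_v)
    v = (rng.random(n_visible) < p_v).astype(float)
    return v, h

fast_lr, fast_decay = 0.05, 0.95   # illustrative values only
v = rng.integers(0, 2, n_visible).astype(float)
samples = []
for t in range(1000):
    v, h = gibbs_step(v, W + W_fast)   # sample with regular + fast weights
    # Push the fast weights away from the configuration just visited,
    # then decay them toward zero (the mode-escaping perturbation).
    W_fast = fast_decay * W_fast - fast_lr * np.outer(v, h)
    samples.append(v.copy())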

Cited by 59 publications (47 citation statements). References 7 publications.
“…The main application of this network is to identify the best values for initializing the learning procedure that fine-tunes the weights and biases of an ANN. The commonly used method for constructing each layer of a DBN is the restricted Boltzmann machine (RBM)…”
Section: Introduction (mentioning)
confidence: 99%
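The quoted introduction describes RBMs as the building block of each DBN layer, used to find good initial weights and biases that a later fine-tuning stage adjusts. Below is a hedged sketch of that greedy layer-wise scheme; train_rbm is a hypothetical stand-in for any RBM trainer (for example, the CD-1 sketch shown after the next excerpt), and all names and shapes are illustrative.

# Hedged sketch of greedy layer-wise pretraining: train one RBM per layer,
# feed its hidden activations to the next RBM, and reuse the learned weights
# and hidden biases to initialize a feed-forward ANN before fine-tuning.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pretrain_dbn(data, layer_sizes, train_rbm):
    """Greedily train one RBM per layer; return (W, b_h) pairs for ANN init."""
    layers, x = [], data
    for n_hidden in layer_sizes:
        W, b_v, b_h = train_rbm(x, n_hidden)   # train an RBM on the current input
        layers.append((W, b_h))                # keep weights + hidden biases
        x = sigmoid(x @ W + b_h)               # propagate to the next layer's input
    return layers

def forward(x, layers):
    """Forward pass of the ANN initialized from the pretrained RBM stack."""
    for W, b in layers:
        x = sigmoid(x @ W + b)
    return x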
“…This iterative approach is a randomized algorithm and guarantees obtaining the best results from the model. However, running this algorithm for many steps is too time-consuming to be practical. Hinton et al. (2006) introduced a fast greedy algorithm that quickly produces a fairly good set of parameters, even in deep networks with millions of parameters and many hidden layers, using the contrastive divergence (CD) method.…”
Section: Introduction (mentioning)
confidence: 99%
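The excerpt contrasts exact, many-step sampling-based training with the contrastive divergence shortcut attributed to Hinton et al. (2006). As a rough illustration only, the snippet below sketches a CD-1 update for a binary RBM; variable names, the learning rate, and the batch handling are assumptions, not taken from the cited papers.

# Hedged sketch of a CD-1 (one-step contrastive divergence) parameter update.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_v, b_h, lr=0.01):
    """One CD-1 step on a batch v0 of binary visible vectors (batch x n_visible)."""
    # Positive phase: hidden probabilities given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step back to the visibles (reconstruction).
    p_v1 = sigmoid(h0 @ W.T + b_v)
    p_h1 = sigmoid(p_v1 @ W + b_h)
    # Update from the difference of positive and negative statistics.
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(v0)
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_v, b_h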
“…They demonstrate that the Boltzmann machine implemented using the ANCAN generated a wide variety of images compared to the other Boltzmann machines. Following previous works on generative models [29][30][31], we evaluated the generation abilities of the Boltzmann machines by the log-likelihood of the test subset. The probability of the test subset was estimated by fitting a Gaussian Parzen window to the samples from the Boltzmann machines.…”
Section: Classification and Generation of the MNIST Database (mentioning)
confidence: 99%
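The evaluation described in this excerpt, fitting a Gaussian Parzen window to model samples and scoring test data by its log-likelihood, can be sketched as follows. This is a generic illustration, not the cited authors' code; the bandwidth sigma would normally be chosen on a validation set, and the default of 0.2 here is only a placeholder.

# Hedged sketch of a Gaussian Parzen-window log-likelihood estimate:
# fit an isotropic Gaussian kernel density to model samples and report the
# mean log-likelihood of the test points under it.
import numpy as np
from scipy.special import logsumexp

def parzen_log_likelihood(test_x, samples, sigma=0.2):
    """Mean log-likelihood of test_x under a Gaussian Parzen window on samples."""
    n, d = samples.shape
    # Squared distances between every test point and every sample: (n_test, n).
    diffs = test_x[:, None, :] - samples[None, :, :]
    sq_dist = np.sum(diffs ** 2, axis=-1)
    # log p(x) = logsumexp over kernels minus the normalization terms.
    log_kernel = -sq_dist / (2.0 * sigma ** 2)
    log_norm = np.log(n) + 0.5 * d * np.log(2.0 * np.pi * sigma ** 2)
    return np.mean(logsumexp(log_kernel, axis=1) - log_norm)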