1998
DOI: 10.1162/089976698300017386

Efficient Learning in Boltzmann Machines Using Linear Response Theory

Abstract: The learning process in Boltzmann Machines is computationally very expensive. The computational complexity of the exact algorithm is exponential in the number of neurons. We present a new approximate learning algorithm for Boltzmann Machines, which is based on mean field theory and the linear response theorem. The computational complexity of the algorithm is cubic in the n…
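
The abstract only names the ingredients, so here is a minimal sketch of how a mean-field plus linear-response learning step for a Boltzmann machine could look. It assumes ±1 units, symmetric weights W with zero diagonal, and biases theta; the function names, the fixed-point solver, and the learning-rate handling are my own choices, not taken from the paper:

```python
import numpy as np

def mean_field_magnetizations(W, theta, n_iter=200, tol=1e-10):
    """Solve the naive mean-field equations m_i = tanh(sum_j W_ij m_j + theta_i)
    by damped fixed-point iteration (one standard choice of solver)."""
    m = np.zeros_like(theta, dtype=float)
    for _ in range(n_iter):
        m_new = 0.5 * m + 0.5 * np.tanh(W @ m + theta)
        if np.max(np.abs(m_new - m)) < tol:
            return m_new
        m = m_new
    return m

def linear_response_statistics(W, theta):
    """Approximate <s_i> and <s_i s_j> under the model using the linear
    response theorem: chi = A^{-1} with A_ij = delta_ij/(1 - m_i^2) - W_ij.
    The matrix inversion is the O(N^3) step the abstract refers to."""
    m = mean_field_magnetizations(W, theta)
    A = np.diag(1.0 / (1.0 - m**2)) - W
    chi = np.linalg.inv(A)              # connected correlations <s_i s_j> - m_i m_j
    return m, chi + np.outer(m, m)

def learning_step(W, theta, data_means, data_correlations, lr=0.05):
    """One gradient step of Boltzmann-machine learning, with the intractable
    model statistics replaced by their mean-field/linear-response estimates."""
    m, ss = linear_response_statistics(W, theta)
    W_new = W + lr * (data_correlations - ss)
    np.fill_diagonal(W_new, 0.0)
    theta_new = theta + lr * (data_means - m)
    return W_new, theta_new
```

The standard learning rule Δw_ij ∝ ⟨s_i s_j⟩_data − ⟨s_i s_j⟩_model is unchanged; the approximation only replaces the intractable model statistics, and everything else is ordinary gradient ascent on the log-likelihood.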

Cited by 170 publications (248 citation statements)
References 22 publications

“…Jaakkola and Jordan [121] explored the use of mixture distributions in improving the mean field approximation. A large class of techniques, including linear response theory and the TAP method [e.g., 127,155,182,195,255,267], seek to improve the mean field approximation by introducing higher-order correction terms. Typically, the lower bound on the log partition function is not preserved by these higher-order methods.…”
Section: Example 5.6 (Structured MF for Factorial HMMs); mentioning
confidence: 99%
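
For reference, in standard notation (spins s_i ∈ {−1, +1}, fields θ_i, couplings J_ij; the notation is mine, not the excerpt's), the linear response theorem and the TAP correction mentioned above read roughly as follows:

```latex
% Linear response theorem: field-derivatives of the magnetizations
% recover the connected correlations.
\[
  \chi_{ij} \;=\; \frac{\partial \langle s_i \rangle}{\partial \theta_j}
           \;=\; \langle s_i s_j \rangle - \langle s_i \rangle \langle s_j \rangle .
\]
% TAP equations: naive mean field plus the Onsager reaction term,
% a typical example of a higher-order correction.
\[
  m_i \;=\; \tanh\!\Big( \theta_i + \sum_j J_{ij} m_j
        \;-\; m_i \sum_j J_{ij}^{2}\,\big(1 - m_j^{2}\big) \Big).
\]
```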
“…Good convergence is observed, in accordance with the theoretical expectations. Introduction.-Inferring interactions between the elements of a network can be posed as an inverse problem in statistical physics, in terms of either equilibrium models [1][2][3] or nonequilibrium ones. The latter have recently gained a lot of attention because of their wider generality and their relevance to systems where one has data on the system over time [4,5].…”
mentioning
confidence: 99%
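
Concretely, the equilibrium formulation referred to here is the inverse Ising problem: instead of computing moments from given parameters, one goes from observed moments to the parameters of the pairwise model below (notation is mine, not the excerpt's):

```latex
% Pairwise (Ising / Boltzmann) equilibrium model over spins s_i = ±1.
\[
  P(\mathbf{s}) \;=\; \frac{1}{Z}\,
    \exp\!\Big( \sum_i \theta_i s_i + \sum_{i<j} J_{ij} s_i s_j \Big).
\]
% Inverse problem: given empirical moments <s_i> and <s_i s_j> estimated
% from data, infer the fields \theta_i and the couplings J_{ij}.
```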
“…A different type of mean-field approximation is the Bethe approximation, which reduces to the TAP approximation at lowest order and which consists in assuming that the relevant couplings J_ij locally have a tree-like structure. The Bethe approximation [16] can then be used and leads to approaches referred to as pseudo-moment matching methods [9,14,10,15]. This basically leads to two, possibly different, mean-field solutions to the inverse Ising problem (IIP): the direct one, using the relation, valid on a tree, between the joint probability and the single and pairwise marginal distributions; and the indirect one, also called susceptibility propagation [10], relying on the relation between the inverse susceptibility matrix and the set of susceptibility coefficients attached to the links of the tree [11].…”
Section: The Inverse Problem and An Heuristic Approximate Solution; mentioning
confidence: 99%
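
As a point of comparison, the simpler naive mean-field inversion that the Bethe and susceptibility-propagation methods refine can be written in a few lines. The sketch below uses my own conventions (samples in {−1, +1}, one configuration per row) and is not the scheme from the excerpt itself:

```python
import numpy as np

def inverse_ising_mean_field(samples):
    """Naive mean-field reconstruction of an Ising model from samples.

    samples: array of shape (n_samples, n_spins) with entries in {-1, +1}.
    Uses the linear-response identity (C^{-1})_ij = delta_ij/(1 - m_i^2) - J_ij,
    i.e. off-diagonal couplings are read off the inverse correlation matrix.
    """
    m = samples.mean(axis=0)                  # magnetizations <s_i>
    C = np.cov(samples, rowvar=False)         # connected correlation matrix
    J = -np.linalg.inv(C)                     # J_ij = -(C^{-1})_ij for i != j
    np.fill_diagonal(J, 0.0)                  # drop the (meaningless) diagonal
    # Fields from the mean-field equations: theta_i = atanh(m_i) - sum_j J_ij m_j
    theta = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m
    return J, theta
```

Susceptibility propagation replaces this naive relation between the inverse correlation matrix and the couplings with the corresponding Bethe-lattice relation, but the overall pattern is the same: empirical moments in, fields and couplings out.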