2021
DOI: 10.1007/s10955-021-02841-y

Learning and Retrieval Operational Modes for Three-Layer Restricted Boltzmann Machines

Abstract: We consider a three-layer restricted Boltzmann machine, where the two visible layers (encoding for input and output, respectively) are made of binary neurons while the hidden layer is made of Gaussian neurons, and we show a formal equivalence with a Hopfield model. The machine architecture allows for different learning and operational modes: when all neurons are free to evolve we recover a standard Hopfield model whose size corresponds to the overall size of visible neurons; when input neurons are clamped we r…
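The equivalence stated in the abstract rests on a standard Gaussian integral: marginalizing Gaussian hidden units out of an RBM with binary visible units produces Hebbian (Hopfield-type) pairwise couplings among the visible units. The sketch below is illustrative only — it assumes the simplest such setup (±1 visible units, zero-mean unit-variance hidden units, no biases) and uses hypothetical variable names, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 6, 3                                   # visible units, stored patterns (illustrative sizes)
xi = rng.choice([-1.0, 1.0], size=(P, N))     # hypothetical stored patterns xi[mu, i]
s = rng.choice([-1.0, 1.0], size=N)           # one binary visible configuration

# RBM energy: E(s, z) = -sum_{i,mu} xi[mu,i] * s[i] * z[mu] + sum_mu z[mu]^2 / 2.
# Integrating the Gaussian hidden units z_mu out gives, up to an additive constant,
# log ∫ exp(-E) dz = (1/2) * sum_mu (xi @ s)_mu^2
overlaps = xi @ s
log_marginal = 0.5 * np.sum(overlaps ** 2)

# Hopfield energy with Hebbian couplings J_ij = sum_mu xi[mu,i] * xi[mu,j]:
# -E_Hopfield(s) = (1/2) * sum_{i,j} J_ij s_i s_j, which matches the marginal above.
J = xi.T @ xi
hopfield = 0.5 * s @ J @ s

assert np.isclose(log_marginal, hopfield)
```

The check confirms that, in this minimal variant, the log-marginal of the RBM over any visible configuration coincides with (minus) a Hopfield energy built from the same patterns, which is the kind of formal equivalence the abstract refers to.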

Cited by 4 publications (2 citation statements) · References 24 publications
“…For example, the Hopfield model has been shown to be equivalent to a restricted Boltzmann machine [21], an archetypical model for machine learning [22], and sparse restricted Boltzmann machines have been mapped to Hopfield models with diluted patterns [23,24]. Furthermore, restricted Boltzmann machines with generic priors have led to the definition of generalized Hopfield models [25][26][27] and neural networks with multi-node Hebbian interactions have recently been shown to be equivalent to higher-order Boltzmann machines [28,29] and deep Boltzmann machines [30,31]. As a result, multi-node Hebbian learning is receiving a second wave of interest since its foundation in the eighties [13,14] as a paradigm to understand deep learning [17,32].…”
Section: Introduction
confidence: 99%
“…Also motivated by the impressive successes obtained in artificial intelligence via deep learning methods (which, beyond DBMs, include a number of other different neural networks), this kind of structure has recently attracted wide interest (see e.g. [12][13][14][15][16][17][18][19]). Conceptually, training algorithms for Boltzmann machines as well as their deep versions are based on the minimization of a fixed objective function (e.g.…”
Section: Introduction
confidence: 99%