We consider a three-layer restricted Boltzmann machine, in which the two visible layers (encoding input and output, respectively) consist of binary neurons while the hidden layer consists of Gaussian neurons, and we show a formal equivalence with a Hopfield model. The architecture allows for different learning and operational modes: when all neurons are free to evolve, we recover a standard Hopfield model whose size equals the total number of visible neurons; when the input neurons are clamped, we recover a Hopfield model, whose size equals that of the output layer, endowed with an external field as well as additional slow noise. The former stems from the signal provided by the input layer and tends to favour retrieval; the latter can be related to the statistical properties of the training set and tends to impair the network's retrieval performance. We address this model by rigorous techniques, finding an explicit expression for its free energy, from which we derive a phase diagram showing the performance of the system as the parameters are tuned.
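The stated equivalence in the free mode can be sketched as follows (the notation here is illustrative and not taken from the paper): with binary visible units $\sigma_i \in \{-1,+1\}$, Gaussian hidden units $z_\mu$, and visible-hidden couplings $\xi_i^\mu$, integrating out the Gaussian hidden layer yields a Hopfield Hamiltonian with Hebbian-like interactions:

```latex
% Joint energy of the RBM (binary visible units, Gaussian hidden units):
\[
E(\sigma, z) \;=\; \frac{1}{2}\sum_{\mu} z_\mu^{2} \;-\; \sum_{i,\mu} \xi_i^{\mu}\,\sigma_i\, z_\mu .
\]
% Gaussian integration over the hidden units gives the marginal weight
\[
\int \prod_{\mu} \mathrm{d}z_\mu \; e^{-\beta E(\sigma, z)}
\;\propto\; e^{-\beta H(\sigma)} ,
\]
% with an effective Hopfield Hamiltonian whose couplings are Hebbian:
\[
H(\sigma) \;=\; -\frac{1}{2}\sum_{i,j}\Big(\sum_{\mu}\xi_i^{\mu}\xi_j^{\mu}\Big)\,\sigma_i\sigma_j .
\]
```

Clamping the input neurons fixes part of the $\sigma_i$, which splits the quadratic term into an effective field on the free (output) spins plus a constant, consistent with the external-field picture described above.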