2017
DOI: 10.3389/fncom.2016.00144

On the Maximum Storage Capacity of the Hopfield Model

Abstract: Recurrent neural networks (RNN) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNN, especially for the case of the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored me…

Cited by 36 publications (37 citation statements) · References 20 publications

“…The authors in [19] give an analytical expression for the number of retrieval errors and, when the analysis is restricted to the region P < N, it recovers the bound P < α_c N analytically predicted in [11,12]. On the contrary, when autapses are allowed, i.e. when the diagonal elements of the connectivity matrix J may be different from zero, a new region is found, with P ≫ N, where the number of retrieval errors decreases on increasing P. In this new region the number of retrieval errors reaches values lower than one, thus providing new favorable conditions for an effective and efficient storage of memory patterns in an RNN.…”
Section: Introduction (citation classification: mentioning; confidence: 99%)
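
A standard signal-to-noise sketch may help explain where the two regions described in the statement above come from. It assumes P independent random ±1 patterns stored by the Hebbian rule and a Gaussian approximation for the crosstalk; this is a heuristic illustration, not necessarily the exact expression derived in [19]. Writing α = P/N, the local field aligned with a stored pattern ξ^ν is approximately

$$
h_i\,\xi_i^{\nu} \;\simeq\;
\begin{cases}
1 + \alpha + z, & J_{ii}=\alpha \ \text{(autapses kept)}\\[2pt]
1 + z, & J_{ii}=0,
\end{cases}
\qquad z \sim \mathcal{N}(0,\alpha),
$$

so the per-bit retrieval error is roughly $\tfrac{1}{2}\operatorname{erfc}\!\big((1+\alpha)/\sqrt{2\alpha}\big)$ with autapses and $\tfrac{1}{2}\operatorname{erfc}\!\big(1/\sqrt{2\alpha}\big)$ without them. With the diagonal kept, the error probability vanishes both for α → 0 and for α → ∞ (it peaks near α ≈ 1), which matches the second favorable region with P ≫ N; with J_ii = 0 it only vanishes for small α, recovering the classical bound P < α_c N.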
“…Given any initial neural state, or input, a discrete-time recurrent neural network dynamically falls into an attractor. In this framework, the attractor is the retrieved limit behavior (Hebb, 1949; Amit et al., 1985b; McEliece et al., 1987; Folli et al., 2017; Gutfreund et al., 1988; Bastolla & Parisi, 1998; Sompolinsky et al., 1988; Wainrib & Touboul, 2013). Finally, it is important to consider that a recurrent network associates a limit behavior to each input from the set of all possible N-bit inputs; since the number of limit behaviors C satisfies C ≪ 2^N, it performs a many-to-few mapping.…”
Section: Introduction (citation classification: mentioning; confidence: 99%)
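
A minimal sketch of this retrieval dynamics may help fix ideas. It uses synchronous ±1 updates under a Hebbian coupling matrix; the network size, pattern count, and cycle-detection scheme below are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                              # network size and pattern count (illustrative)
xi = rng.choice([-1, 1], size=(P, N))       # random +/-1 patterns
J = xi.T @ xi / N                           # Hebbian couplings (diagonal kept here)

def step(s, J):
    """One synchronous discrete-time update; sign(0) is mapped to +1."""
    return np.where(J @ s >= 0, 1, -1)

def run_to_attractor(s, J, max_steps=1000):
    """Iterate until a state repeats: the attractor is a fixed point or a cycle."""
    seen = {}
    for t in range(max_steps):
        key = s.tobytes()
        if key in seen:
            return s, t - seen[key]         # cycle length 1 means a fixed point
        seen[key] = t
        s = step(s, J)
    return s, None                          # no recurrence found within max_steps

s0 = rng.choice([-1, 1], size=N)            # an arbitrary N-bit input
s_star, cycle_len = run_to_attractor(s0, J)
print("attractor cycle length:", cycle_len)
```

Running this loop over many random inputs makes the many-to-few mapping explicit: far fewer than 2^N distinct attractors are ever reached.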
“…Diagonal interaction terms were not considered in the early works on the Hopfield model for a physical reason: in the corresponding spin models the field acting on a variable is induced by the states of its neighbours, not by its own state; thus self-interactions do not exist and J_ii = 0, ∀i. Neural networks with diagonal terms have been studied in [9], and a very interesting regime has been found for α ≫ 1. The probability p_V that a given pattern ξ^µ is not a fixed point of the dynamics has been computed and shown to be very small for very low α = P/N, as expected, but surprisingly another region has been identified in the very large α regime, where p_V is also very small.…”
Section: Dynamics of a Neural Network (citation classification: mentioning; confidence: 99%)
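
A rough numerical illustration of the two regimes is sketched below. Here p_V is estimated as the fraction of stored patterns that are not fixed points of a single synchronous update, and the sizes and α values are arbitrary choices for the sketch, not those used in [9].

```python
import numpy as np

rng = np.random.default_rng(1)

def p_not_fixed(N, P, zero_diagonal, trials=20):
    """Monte Carlo estimate of p_V: the chance that a stored pattern is not a fixed point."""
    failures = 0
    for _ in range(trials):
        xi = rng.choice([-1, 1], size=(P, N))
        J = xi.T @ xi / N                   # Hebbian couplings; J_ii = P/N when kept
        if zero_diagonal:
            np.fill_diagonal(J, 0.0)        # the classical choice J_ii = 0
        s = np.where(xi @ J >= 0, 1, -1)    # one synchronous update applied to every pattern
        failures += np.sum(np.any(s != xi, axis=1))
    return failures / (trials * P)

N = 100
for alpha in (0.05, 0.5, 1.0, 5.0, 20.0):   # alpha = P/N
    P = max(1, int(alpha * N))
    with_diag = p_not_fixed(N, P, zero_diagonal=False)
    no_diag = p_not_fixed(N, P, zero_diagonal=True)
    print(f"alpha={alpha:5.2f}   p_V (autapses)={with_diag:.3f}   p_V (J_ii=0)={no_diag:.3f}")
```

In this sketch one expects the diagonal-free estimate to approach one once α exceeds the classical capacity, while with autapses p_V should be small both at small and at large α, with a maximum around α ≈ 1.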
“…Even if this result seems to invalidate the usefulness of this new regime, it was shown that the ratio p̄_V/p_V tends to a finite number, e, in the large-N limit and that real patterns have a higher probability of being fixed points of the model. In the next section we address the stability of these patterns, making use of the same strategy used in [9].…”
Section: Dynamics of a Neural Network (citation classification: mentioning; confidence: 99%)
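
A short check of the stored-versus-random comparison can be built on the same estimator. Here p̄_V is taken to mean the analogous probability for a freshly drawn random pattern that was never stored, which is an assumption based on the context above; the sketch only illustrates that p̄_V > p_V at finite N, not the limiting ratio e.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P, trials = 100, 500, 20                 # alpha = P/N = 5, illustrative sizes

def not_fixed_fraction(patterns, J):
    """Fraction of the given +/-1 patterns that are not fixed points of one synchronous update."""
    s = np.where(patterns @ J >= 0, 1, -1)
    return np.mean(np.any(s != patterns, axis=1))

p_stored, p_random = [], []
for _ in range(trials):
    xi = rng.choice([-1, 1], size=(P, N))
    J = xi.T @ xi / N                       # diagonal kept, as in the large-alpha regime above
    eta = rng.choice([-1, 1], size=(P, N))  # random probe patterns that were never stored
    p_stored.append(not_fixed_fraction(xi, J))
    p_random.append(not_fixed_fraction(eta, J))

print("p_V    (stored patterns):", round(float(np.mean(p_stored)), 3))
print("pbar_V (random patterns):", round(float(np.mean(p_random)), 3))
```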