2017
DOI: 10.1088/1751-8121/aa8fd7
High storage capacity in the Hopfield model with auto-interactions—stability analysis

Abstract: Recent studies point to the potential storage of a large number of patterns in the celebrated Hopfield associative memory model, well beyond the limits obtained previously. We investigate the properties of new fixed points to discover that they exhibit instabilities for small perturbations and are therefore of limited value as associative memories. Moreover, a large deviations approach also shows that errors introduced to the original patterns induce additional errors and increased corruption with respect to t…

Cited by 9 publications (4 citation statements) · References 13 publications
“…On the contrary, when autapses are allowed, which implies that the diagonal elements of the connectivity matrix J may differ from zero, a new region exists, with P ≫ N, where the number of retrieval errors decreases as P increases. In this new region, the number of retrieval errors reaches values lower than one, which provides new favorable conditions for an effective and efficient storage of memory patterns in an RNN. Moreover, in this region, the number of storable patterns grows exponentially with N. In response to these results, Rocchi et al [20] found that, in the thermodynamic limit at P ≫ N, the basin of attraction of the stationary states vanishes. Thus, asymptotically, the basin of attraction of a stationary state coincides with the stationary state itself.…”
Section: Introduction
confidence: 88%
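The mechanism quoted above, retaining the diagonal of the Hebbian coupling matrix J (autapses, i.e. self-interactions), can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the sizes and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 20  # neurons and patterns; illustrative sizes only

# P random binary patterns with entries in {-1, +1}, one per column
xi = rng.choice([-1, 1], size=(N, P))

# Hebbian couplings: keeping the diagonal (J_ii = P/N) retains the
# auto-interactions (autapses); zeroing it recovers the classical model.
J = xi @ xi.T / N
J_no_auto = J - np.diag(np.diag(J))

def is_fixed_point(J, s):
    """True if state s is unchanged by a synchronous sign update."""
    return np.array_equal(np.sign(J @ s), s)

with_auto = sum(is_fixed_point(J, xi[:, mu]) for mu in range(P))
without = sum(is_fixed_point(J_no_auto, xi[:, mu]) for mu in range(P))
print(f"patterns that are fixed points, with autapses: {with_auto}/{P}, "
      f"without: {without}/{P}")
```

With the diagonal kept, each unit receives an extra self-reinforcing term (P/N)·s_i, which tends to stabilize patterns as fixed points; the stability analysis in the paper concerns precisely whether such fixed points survive small perturbations.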
“…We thus bound the maximum number of storable points with Ω_TE [13]. Unfortunately, as pointed out by Rocchi et al [20], the basin of attraction of these points shrinks to the point that it contains only the stable states. In the lower panel, we show how, with our approach, we overcome this limit.…”
Section: Storing Patterns
confidence: 93%
“…For example, the pseudo-inverse rule (also called the projection rule) can increase HNN storage capacity and improve accuracy (Wu et al, 2012; Sahoo et al, 2016), but it is neither local nor incremental. Moreover, recent works have explored learning rules with self-feedback connections (non-zero diagonal), and have shown higher accuracy for a large number of stored patterns (Liou and Yuan, 1999; Folli et al, 2017; Rocchi et al, 2017; Gosti et al, 2019). In summary, despite the present limitations of the ONN, its features in terms of FPS, computation time, and training are encouraging toward the exploration of a wider range of applications.…”
Section: Limitations and Future Directions
confidence: 99%
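The pseudo-inverse (projection) rule mentioned in the quote above admits a compact sketch: the coupling matrix is the projector onto the span of the stored patterns, so every linearly independent pattern becomes an exact fixed point of the sign update. Sizes and seed are illustrative assumptions, not taken from any of the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 64, 16  # illustrative sizes; random ±1 patterns are almost surely independent
xi = rng.choice([-1, 1], size=(N, P)).astype(float)

# Projection (pseudo-inverse) rule: J = xi @ pinv(xi) projects onto the
# column span of the patterns, so J @ xi[:, mu] == xi[:, mu] exactly.
# The price is a non-local, non-incremental rule, as the quote notes.
J = xi @ np.linalg.pinv(xi)

recalled = all(np.array_equal(np.sign(J @ xi[:, mu]), xi[:, mu])
               for mu in range(P))
print("all patterns are fixed points:", recalled)  # expected: True
```

Contrast this with the purely Hebbian rule, where crosstalk between patterns limits the capacity; the projection rule removes the crosstalk at the cost of requiring all patterns at once to compute the pseudo-inverse.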
“…When a neural system is used as an associative memory, the stable equilibria of the neural network correspond to the static retrievable memories. So far, many scholars have devoted their efforts to establishing neural networks with coexisting multiple equilibria [12][13][14], which determine the storage capacity of a neural system [15,16]. For low-dimensional Hopfield neural network systems, the existing dynamical analysis has focused on the local stability and Hopf bifurcation of the trivial equilibrium [17][18][19].…”
Section: Introduction
confidence: 99%