1987
DOI: 10.1109/tit.1987.1057328
The capacity of the Hopfield associative memory

Abstract: Techniques from coding theory are applied to study rigorously the capacity of the Hopfield associative memory. Such a memory stores n-tuples of ±1's. The components change depending on a hard-limited version of linear functions of all other components. With symmetric connections between components, a stable state is ultimately reached. By building up the connection matrix as a sum of outer products of m fundamental memories, one hopes to be able to recover a certain one of the m memories by using an initial …
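
As a concrete reading of the abstract, the following is a minimal sketch (Python/NumPy; all names are illustrative assumptions, not from the paper) of the model it describes: the connection matrix is built as a sum of outer products of m fundamental ±1 memories, and each component is updated through a hard-limited linear function of the others. The paper's convergence guarantee applies to asynchronous updates with symmetric connections; a synchronous update with an iteration cap is used here only for brevity.

```python
import numpy as np

def store(memories):
    """Build the connection matrix as a sum of outer products of the
    m fundamental memories (rows of `memories`, entries in {-1, +1})."""
    W = memories.T @ memories        # sum of outer products
    np.fill_diagonal(W, 0)           # no self-connections
    return W

def recall(W, probe, max_iters=100):
    """Iterate the hard-limited update x_i <- sgn(sum_j W_ij x_j) until a
    stable state is reached (synchronous variant, capped for brevity)."""
    x = probe.copy()
    for _ in range(max_iters):
        x_new = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break                    # fixed point reached
        x = x_new
    return x

# Illustrative use: store 5 random memories of length 100 and try to
# recover the first one from a probe with 10 flipped components.
rng = np.random.default_rng(0)
memories = rng.choice([-1, 1], size=(5, 100))
W = store(memories)
probe = memories[0].copy()
probe[:10] *= -1
print(np.array_equal(recall(W, probe), memories[0]))
```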

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1
1

Citation Types

7
316
0
5

Year Published

1998
1998
2022
2022

Publication Types

Select...
5
4

Relationship

0
9

Authors

Journals

Cited by 798 publications (328 citation statements) | References 14 publications
“…However, for these, the asymptotic capacity always goes to zero for an error criterion for retrieval that demands vanishing errors. For instance, for the Hopfield model with linear learning, the number of patterns per neuron is , and for clipped learning it is [19]. If a general nonlinear function is used in place of the Heaviside function for weight saturation, then, in general, the information capacity stays below that of linear learning [20].…”
Section: Asymptotic Capacity Results (mentioning)
confidence: 99%
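
For reference, the headline scaling established in the paper under review (McEliece et al., 1987) is the following; in either form the number of patterns per neuron, m/n, vanishes as n grows, which is the point made in the excerpt above.

```latex
% Asymptotic capacity of the sum-of-outer-products Hopfield memory
% (McEliece et al., 1987), exact retrieval with high probability:
m_{\max} \sim \frac{n}{2\log n} \quad \text{(most of the $m$ memories recoverable)},
\qquad
m_{\max} \sim \frac{n}{4\log n} \quad \text{(all $m$ memories recoverable)}.
```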
“…Again, arbitrary offsets depending on the initial pattern do not change the system dynamics. Comparing (7) with (19), the MAP Lyapunov function suggests the constraint coefficients (20), (21) with the offset . The resulting coefficients are positive if the initial pattern is closer to the training pattern than the inverted initial pattern is, i.e., . Modifications similar to this have previously been proposed for other memory models on a heuristic basis.…”
Section: The Influence Of the Initial Pattern (mentioning)
confidence: 99%
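
The citing paper's MAP Lyapunov function and its constraint coefficients (20), (21) are not reproduced in this excerpt; for orientation only, the standard Hopfield Lyapunov (energy) function that such constructions build on is:

```latex
% Standard Hopfield energy; it is non-increasing under asynchronous
% hard-limited updates when W is symmetric with zero diagonal, which is
% why a stable state is ultimately reached.
E(\mathbf{x}) \;=\; -\tfrac{1}{2}\,\mathbf{x}^{\mathsf T} W \mathbf{x}
              \;=\; -\tfrac{1}{2}\sum_{i \ne j} w_{ij}\, x_i x_j .
```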
“…One could use a random agglomerate of particles, all connected together, provided that the percolation is adequate (there are sufficient electrically connected paths through the network). By using proper encoding and data detection algorithms, information can be stored in an associative fashion [13,14].…”
Section: Three-dimensional Cross-point Structures (mentioning)
confidence: 99%
“…So, the first and most basic question for this model is whether the patterns are fixed points of the dynamics: this question was first considered for i.i.d. symmetric Bernoulli patterns by McEliece et al [30]. We will adopt their definition of storage capacity as the maximum number of patterns M that are stable under the dynamics described above, with a probability converging to one as N → +∞.…”
Section: Capacity Of the Hopfield Model With A Nonmonotonic Dynamics (mentioning)
confidence: 99%
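
A minimal Monte Carlo sketch (Python/NumPy; the function name and parameter choices are illustrative assumptions, not from either paper) of the storage-capacity criterion just described: draw M i.i.d. symmetric Bernoulli patterns and check whether all of them are fixed points of the hard-limited dynamics under sum-of-outer-products weights.

```python
import numpy as np

def all_patterns_stable(N, M, trials=20, rng=None):
    """Monte Carlo estimate of the probability that M i.i.d. symmetric
    Bernoulli (+/-1) patterns of length N are all fixed points of the
    hard-limited dynamics under sum-of-outer-products weights."""
    rng = rng or np.random.default_rng(0)
    hits = 0
    for _ in range(trials):
        X = rng.choice([-1, 1], size=(M, N))   # M patterns of length N
        W = X.T @ X                            # sum of outer products
        np.fill_diagonal(W, 0)
        field = X @ W.T                        # field[mu, i] = sum_j W_ij x^mu_j
        hits += np.all(np.where(field >= 0, 1, -1) == X)
    return hits / trials

# Illustrative use: the all-patterns-stable probability should fall off
# once M grows past roughly N / (4 log N).
for M in (5, 15, 40):
    print(M, all_patterns_stable(N=200, M=M))
```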