1986
DOI: 10.1209/0295-5075/1/10/008
Networks of Formal Neurons and Memory Palimpsests

Abstract: PACS 87.30 - Biophysics of neurophysiological processes; PACS 75.10.H - Ising and other classical spin models; PACS 64.60 - General studies of phase transitions.

Cited by 206 publications (121 citation statements). References 7 publications.
“…The slow plasticity dynamics of synapses are driven by competitive and cooperative interactions consequent on the fast dynamics of firing neurons. The model is analysed within a mean-field approximation, in common with many physics-based approaches to neuroscience, ranging from earlier work [11][12][13][14][15][16][17][18][19][20] to more recent developments [55]. Such a mean-field framework is of course appropriate given the lack of knowledge of microscopic details at the neural or synaptic level.…”
Section: Discussion
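The mean-field treatment referred to above reduces retrieval to a self-consistent equation for the overlap with a stored pattern. A minimal sketch, assuming the standard single-pattern fixed-point equation m = tanh(βm) rather than any model-specific form from the paper:

```python
import math

def mean_field_overlap(beta, m0=0.5, tol=1e-10, max_iter=10000):
    """Solve the mean-field fixed-point equation m = tanh(beta * m)
    by direct iteration; m is the overlap with a single stored
    pattern (finite-loading corrections are ignored in this sketch)."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta * m)
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

# Below the critical noise level (beta > 1) a nonzero retrieval
# solution appears; for beta < 1 the overlap vanishes.
print(mean_field_overlap(2.0))
print(mean_field_overlap(0.5))
```

The iteration converges because tanh is a contraction near the stable fixed point; the transition at β = 1 is where the slope of tanh(βm) at the origin crosses unity.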
“…An essential property of these models as well as real neural networks is that their capacity is finite. Forgetting is therefore an important aspect of continued learning [3,[15][16][17][18][19][20].…”
Section: Introduction
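The finite capacity and gradual forgetting described in this excerpt can be illustrated with bounded ("clipped") Hebbian synapses, a classic palimpsest mechanism; the network size, learning rate, and synaptic bound below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200        # number of +/-1 neurons (illustrative size)
EPS = 0.05     # learning rate per pattern (illustrative)
J_MAX = 0.4    # synaptic bound enforcing the palimpsest property

def learn(J, xi):
    """Hebbian increment clipped to [-J_MAX, J_MAX]: the bound makes
    old traces fade as new patterns are imprinted."""
    J = J + EPS * np.outer(xi, xi)
    np.fill_diagonal(J, 0.0)
    return np.clip(J, -J_MAX, J_MAX)

def overlap(J, xi, steps=20):
    """Zero-temperature recall starting from the pattern itself;
    overlap near 1 means the pattern is still an attractor."""
    s = xi.copy()
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1.0
    return float(s @ xi) / N

patterns = [rng.choice([-1.0, 1.0], size=N) for _ in range(60)]
J = np.zeros((N, N))
for xi in patterns:
    J = learn(J, xi)

# Recently stored patterns are retrieved; the earliest are forgotten,
# so the network never reaches a state of total confusion.
print(overlap(J, patterns[-1]), overlap(J, patterns[0]))
```

Because each synapse performs a bounded random walk, the memory trace of any one pattern is progressively overwritten by later ones instead of the whole network saturating at once.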
“…The main feature of the models [14,15,21] generalizing the Hebb rule for the learning process is that the memory has a finite capacity and that learning new patterns generally implies forgetting old ones. In this way, one avoids reaching a state of total confusion.…”
Section: A Stochastic Model for Memory Loading
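A simple generalization of the Hebb rule with exactly this property is an exponentially decaying Hebbian trace, in which imprinting each new pattern uniformly attenuates all earlier ones; the network size and forgetting factor below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200         # network size (illustrative)
LAM = 0.9       # forgetting factor (illustrative): old traces decay

# Each new pattern is imprinted while all earlier traces are
# uniformly attenuated, so learning new patterns implies forgetting
# old ones and the capacity remains finite.
patterns = [rng.choice([-1.0, 1.0], size=N) for _ in range(40)]
J = np.zeros((N, N))
for xi in patterns:
    J = LAM * J + np.outer(xi, xi) / N
np.fill_diagonal(J, 0.0)

def one_step_overlap(xi):
    """Overlap after a single deterministic update from the pattern."""
    s = np.sign(J @ xi)
    s[s == 0] = 1.0
    return float(s @ xi) / N

print(one_step_overlap(patterns[-1]))   # most recent: well retrieved
print(one_step_overlap(patterns[0]))    # oldest: effectively erased
```

The geometric decay keeps the total synaptic noise bounded, so the most recent patterns always remain retrievable regardless of how many have been presented.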
“…In recent years, after the seminal work of Hopfield [6], there has been great interest in associative memory networks exhibiting statistical behavior similar to the so-called spin glass model [1,10,13-16]. Probability laws were first introduced by Little [10], who interpreted the sigmoidal output of a cell to a given sequence of inputs in terms of random noise.…”
Section: Introduction
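Little's probabilistic reading of the sigmoidal response can be sketched as a synchronous stochastic update rule; the toy pattern and inverse temperature below are illustrative, not drawn from the cited works:

```python
import math
import random

def little_update(state, J, beta):
    """Synchronous stochastic update in the spirit of Little's model:
    each unit becomes +1 with a sigmoidal probability of its local
    field, so the noise level interpolates between deterministic
    dynamics (large beta) and pure coin flips (beta = 0)."""
    new_state = []
    for i in range(len(state)):
        h = sum(J[i][j] * state[j] for j in range(len(state)) if j != i)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * h))  # P(s_i = +1)
        new_state.append(1 if random.random() < p_up else -1)
    return new_state

random.seed(1)
# One pattern stored with the Hebb rule on a 4-unit toy network.
xi = [1, -1, 1, -1]
n = len(xi)
J = [[xi[i] * xi[j] / n if i != j else 0.0 for j in range(n)] for i in range(n)]

# At low noise (large beta) the stored pattern is reproduced
# with probability close to 1.
print(little_update(xi, J, beta=50.0))
```

At β = 0 every unit fires at random regardless of its input, which is the random-noise limit Little used to motivate the probabilistic description.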