1997
DOI: 10.1088/0305-4470/30/22/019

Neural networks with fast time-variation of synapses

Abstract. We study a kinetic neural network in which the intensity of the synaptic couplings varies on a timescale of order p(1 − p)^{-1} compared with that for the neuron variations. We describe some exact and mean-field results for p → 0. This includes, for example, the Hopfield model with random fluctuations of the synapse intensities such that the neurons couple to each other, on average, according to the Hebbian learning rule. The consequences of such fluctuations for the performance of the network are analysed in detail for s…
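The model sketched in the abstract can be illustrated numerically. The script below is not the paper's formulation; it is a minimal sketch, under assumed choices, of a Hopfield network whose Hebbian couplings fluctuate rapidly: the couplings are redrawn with Gaussian multiplicative noise of illustrative amplitude delta at every asynchronous neuron update, so that the neurons couple, on average, through the standard Hebbian matrix.

# Minimal sketch (not the paper's exact model): Hopfield retrieval with
# synaptic couplings that fluctuate much faster than the neuron dynamics.
# The noise form (multiplicative Gaussian) and amplitude `delta` are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 3                       # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def noisy_couplings(J, delta):
    """Couplings with fast multiplicative fluctuations around the Hebbian mean."""
    return J * (1.0 + delta * rng.standard_normal(J.shape))

def overlap(s, xi):
    """Overlap m = (1/N) sum_i xi_i s_i with a stored pattern."""
    return float(s @ xi) / len(s)

# Start from a corrupted copy of pattern 0 and run asynchronous zero-T dynamics.
s = patterns[0].copy()
flip = rng.random(N) < 0.2          # flip 20% of the bits
s[flip] *= -1

delta = 0.3                         # illustrative fluctuation strength
for _ in range(10 * N):             # asynchronous updates
    i = rng.integers(N)
    Jf = noisy_couplings(J, delta)  # synapses are redrawn at every update
    h_i = Jf[i] @ s                 # local field, zero threshold (theta_i = 0)
    s[i] = 1 if h_i >= 0 else -1

print("final overlap with stored pattern:", overlap(s, patterns[0]))

Varying delta gives only a qualitative feel for how the fast synaptic fluctuations affect retrieval; the paper's exact and mean-field analysis for p → 0 is the authoritative treatment.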

Cited by 21 publications (28 citation statements) · References 22 publications
“…Also remarkable is the result that we report in Fig. 3, which is also fully consistent with our Monte Carlo observations [16]. The solutions (10) correspond, in general, to a saddle point whose details strongly depend on , so that specific analyses are required.…”
supporting
confidence: 89%
“…It is likely that a similar conclusion, which one expects to hold beyond the Hopfield-model scenario, applies also to the behavior of biological systems. We report in this paper on the most general results from our study; further technical details and related numerical work are to be reported elsewhere [16].…”
mentioning
confidence: 99%
“…This was first studied in some detail in Refs. [17,18], and analysis of the intriguing behavior this may show concerning recognition was first reported in Refs. [19,20].…”
mentioning
confidence: 80%
“…This situation has been formally discussed in detail in Refs. (Torres et al., 1997; Marro and Dickman, 1999). It may be noticed that, consistently with the choice of a binary, ±1 code for the neurons' activity, we are assuming zero thresholds, θ_i = 0, ∀i, in the following; this is relevant when comparing this work with some related one, as discussed below.…”
Section: The Model
mentioning
confidence: 99%