2004
DOI: 10.1088/0305-4470/37/31/002
Slowly evolving connectivity in recurrent neural networks: I. The extreme dilution regime

Abstract: We study extremely diluted spin models of neural networks in which the connectivity evolves in time, although adiabatically slowly compared to the neurons, according to stochastic equations which on average aim to reduce frustration. The (fast) neurons and (slow) connectivity variables equilibrate separately, but at different temperatures. Our model is exactly solvable in equilibrium. We obtain phase diagrams upon making the condensed ansatz (i.e. recall of one pattern). These show that, as the connectivity te…


Cited by 8 publications (19 citation statements)
References 37 publications (71 reference statements)
“…To get a better understanding of this effect, we also compute the fraction of misaligned spins and the fraction of sites with vanishing local fields as functions of the ratio of the characteristic temperatures of the two dynamic processes. In the limit of extreme dilution our expressions reproduce the results of [26].…”
Section: Introduction (supporting)
confidence: 76%
“…The present model is an extension of the one presented in [26] where geometry was of an adaptive nature. There, one could make important analytic simplifications owing to the choice of 'extreme dilution' scaling whereby each node in the graph was connected on average to a vanishing fraction of other nodes although this fraction still contained an infinite number of nodes.…”
Section: Model Definitions (mentioning)
confidence: 99%
“…It depends on the replica dimension n via the scalar spin product and it reduces to c̄ = c in the limit n → 0. In the limit c → ∞ (scaling J as J/c to keep the local fields in the graph h_i^gr(σ) ≡ Σ_j c_ij J_ij σ_j of O(1)) we again recover c̄ = c to leading order as found in [15]. Similarly to the above, one also finds that taking H_s → H_s + λ Σ_{i<j} c_ij J_ij produces the average bond strength on the graph.…”
Section: B. Connectivity System Observables (supporting)
confidence: 68%
“…Unfortunately, to measure this directly in our system where bonds are mobile would require us to be able to measure correlations over long length scales within the system (in fact scaling like the average loop length ∼ log(N)), which is technically difficult. To try and finesse this problem, in [15,16], the fraction of misaligned spins was calculated, i.e. the fraction of spins that did not point in the direction of their local field.…”
Section: A. Spin System Observables (mentioning)
confidence: 99%
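The quantity described above — the fraction of spins not aligned with their local field h_i = Σ_j c_ij J_ij σ_j — is straightforward to estimate numerically. A minimal sketch (not from the paper; the sparse mask, Gaussian bonds, and dilution level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
sigma = rng.choice([-1, 1], size=N)            # Ising spins sigma_i
J = rng.standard_normal((N, N))                # illustrative bond strengths J_ij
J = (J + J.T) / 2                              # symmetrise
c = (rng.random((N, N)) < 0.1).astype(float)   # sparse connectivity mask c_ij
c = np.triu(c, 1)
c = c + c.T                                    # symmetric, no self-bonds

h = (c * J) @ sigma                            # local fields h_i = sum_j c_ij J_ij sigma_j
misaligned = np.mean(sigma * h < 0)            # fraction with sigma_i opposing its field
print(misaligned)
```

For random spins and zero-mean bonds as here, roughly half the spins oppose their field; in a relaxed (low-temperature) configuration the fraction would be much smaller.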
“…The analysis thus allows for a specific characterisation of the effects of such parameters on the stability or otherwise of the system, and on its dynamical behaviour. In subsequent work one can then build on this approach and add more realism by allowing the graph itself to evolve in time [50,51].…”
Section: Paradigm Of Random Network Models (mentioning)
confidence: 99%