2008
DOI: 10.1016/j.physa.2007.09.047
Transient dynamics of sparsely connected Hopfield neural networks with arbitrary degree distributions

Abstract: Using a probabilistic approach, the transient dynamics of sparsely connected Hopfield neural networks is studied for arbitrary degree distributions. A recursive scheme is developed to determine the time evolution of the overlap parameters. As illustrative examples, explicit calculations of the dynamics are performed for networks with binomial, power-law, and uniform degree distributions. The results are in good agreement with extensive numerical simulations. It indicates that with the same average degree, there is a…
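To make the abstract's central quantity concrete, here is a minimal simulation sketch (not the paper's recursive scheme): it builds a sparsely connected Hopfield network with Hebbian couplings and tracks the overlap parameter m(t) = (1/N) Σᵢ ξᵢ sᵢ(t) between the network state and a stored pattern. All sizes and parameter names (`N`, `P`, `K`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P, K = 400, 3, 40                      # neurons, stored patterns, average degree
xi = rng.choice([-1, 1], size=(P, N))     # random binary patterns

# Sparse connectivity: each directed link present with probability K/N
mask = rng.random((N, N)) < K / N
np.fill_diagonal(mask, False)

# Hebbian couplings restricted to existing links, scaled by the mean degree
J = (xi.T @ xi) / K * mask

def overlap(s, pattern):
    """Overlap parameter m = (1/N) * sum_i xi_i * s_i."""
    return float(s @ pattern) / len(s)

# Start near pattern 0: flip 10% of its bits
s = xi[0].copy()
s[rng.random(N) < 0.10] *= -1

# Parallel zero-temperature dynamics; the overlap relaxes toward the pattern
for t in range(10):
    s = np.sign(J @ s)
    s[s == 0] = 1                         # break ties deterministically

m_final = overlap(s, xi[0])
```

At this low storage load (P/K ≈ 0.075) the overlap should grow from roughly 0.8 toward 1, which is the retrieval behavior whose transient the paper computes analytically for arbitrary degree distributions.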

Cited by 9 publications (9 citation statements)
References 34 publications
“…In Refs. [27,28], models with asymmetric synapses and infinite connectivity are studied using SNA. The GFA method is applied in Refs.…”
Section: Discussion
“…Our method could be applied to extend the results in Refs. [27,28] to models with symmetric synapses. The path probabilities have an important role in the solutions of models with finite connectivity.…”
Section: Discussion
“…A large number of studies have been devoted to trying to increase the capacity and retrieval abilities of neural networks by optimizing the learning rule or by optimizing the topology of the network [23][24][25][26][27][28][29]. In the following text of the paper we are going to discuss optimizing the topology of the network. We know that a fully connected Hopfield network can store a number of patterns proportional to the number of neurons, but the fully connected topology is not biologically realistic.…”
Section: B Controlling the Hopfield Network Using the Non-backtracking
“…A large number of studies have been devoted to increasing the capacity and retrieval abilities of neural networks by optimizing the learning rule or by optimizing the topology of the network [30][31][32][33][34][35][36]. In the following text of the paper we are going to discuss more on optimizing the topology of the network.…”
Section: Non-backtracking Operator for Hopfield Model and Controlling...