2015
DOI: 10.1016/j.neunet.2015.02.010

Hierarchical neural networks perform both serial and parallel processing

Abstract: In this work we study a Hebbian neural network, where neurons are arranged according to a hierarchical architecture such that their couplings scale with their reciprocal distance. As a full statistical mechanics solution is not yet available, after a streamlined introduction to the state of the art via that route, the problem is consistently approached through signal-to-noise technique and extensive numerical simulations. Focusing on the low-storage regime, where the amount of stored patterns grows at most log…

Cited by 21 publications (20 citation statements)
References 46 publications (81 reference statements)
“…Here, no O(λ)-correction was needed, since the denominator of Eq. (22) is already finite at λ = 0. In contrast, the numerator cancels at λ = 0 and yields instead…”
Section: A Simple Example: One-Dimensional Lattices
confidence: 97%
“…that is, in the thermodynamic limit, the weighted degree diverges logarithmically with N (we recall that N = 2^K); consistently, the case σ = 1/2 is excluded from the statistical-mechanics investigations [2,3]. The last part of this section is devoted to the study of the network modularity and clustering.…”
Section: Graph Generation In the Hierarchical Ferromagnet
confidence: 99%
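The logarithmic divergence of the weighted degree at σ = 1/2 quoted above can be checked numerically. A minimal sketch, assuming the commonly used hierarchical coupling form J_ij = Σ_{l=d}^{K} 4^(−σl) for nodes at hierarchical distance d; this coupling form and the function name are illustrative assumptions, not taken verbatim from the paper:

```python
def weighted_degree(K, sigma):
    """Weighted degree of any node in a hierarchical network of N = 2**K spins.

    Assumes (illustratively) couplings J_ij = sum_{l=d}^{K} 4**(-sigma*l),
    where d is the hierarchical distance between i and j; exactly 2**(d-1)
    nodes sit at distance d from any given node.
    """
    w = 0.0
    for d in range(1, K + 1):
        J_d = sum(4.0 ** (-sigma * l) for l in range(d, K + 1))
        w += 2 ** (d - 1) * J_d
    return w

# sigma = 1/2: each doubling of N (K -> K+1) adds ~1 to the degree, i.e. w ~ log2 N
print([round(weighted_degree(K, 0.5), 3) for K in (4, 8, 12)])
# sigma = 1: the degree stays bounded as N grows
print([round(weighted_degree(K, 1.0), 3) for K in (4, 8, 12)])
```

At σ = 1/2 the sum telescopes to w(K) = K − 1 + 2^(−K), making the log-divergence with N = 2^K explicit; for σ > 1/2 the degree converges to a finite value.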
“…Therefore, although the network we are considering is fully connected, when noise is present the weaker weights, with J_ij < T, effectively stop contributing, as if they were missing [7]. Since in the statistical-mechanics analysis the noise level can be tuned arbitrarily [2,3], it is crucial to understand how the overall network connectivity and clustering are accordingly modified.…”
Section: A. The Hierarchical Ferromagnet With Noise: Deterministic DI
confidence: 99%
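The pruning of sub-threshold links described in the quote above can be sketched under the same illustrative assumption on the couplings (each level l contributes 4^(−σl) to every pair inside a block of size 2^l); the parameter values below are arbitrary choices, not taken from the paper:

```python
import numpy as np

def hierarchical_J(K, sigma):
    """Coupling matrix for N = 2**K spins: level l adds 4**(-sigma*l)
    to every pair inside the same block of size 2**l (illustrative form)."""
    N = 2 ** K
    J = np.zeros((N, N))
    for l in range(1, K + 1):
        block = 2 ** l
        for start in range(0, N, block):
            J[start:start + block, start:start + block] += 4.0 ** (-sigma * l)
    np.fill_diagonal(J, 0.0)
    return J

K, sigma, T = 6, 1.0, 0.1
J = hierarchical_J(K, sigma)
alive = J >= T                       # links with J_ij < T are silenced by noise
degrees = alive.sum(axis=1)          # effective degree of each node
print(degrees.min(), degrees.max())  # → 1 1
```

At this noise level only the nearest hierarchical partner of each node survives (J at distance 1 is ≈ 1/3 > T, at distance 2 ≈ 1/12 < T), so the fully connected network effectively fragments into dimers — the modularity change the quoted passage refers to.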
“…For instance, by taking P finite (or sublinear with respect to N) in the thermodynamic limit N → ∞, one has retrieval capabilities as long as T < 1. A modification of the Hebb rule results in a deformation of the basins of attraction: this can be done, for example, by overlaying the Hebb rule on another interaction structure, such as a diluted [45,46] or a hierarchical [47,48,49,50] one. In order to describe the overall state of the system, one introduces the macroscopic observable m, also called the Mattis magnetization: a vector of length P whose µ-th component represents the overlap between the spin configuration and the µ-th pattern:…”
Section: A Brief Review On the Hopfield Model
confidence: 99%
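The Hebb rule, the Mattis magnetization, and retrieval in the low-storage regime (P finite, N large) quoted above can be sketched in a few lines; the sizes, seed, flip fraction, and zero-temperature asynchronous update are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3                         # low-storage regime: P finite, N large
xi = rng.choice([-1, 1], size=(P, N))  # P random binary patterns xi^mu

# Hebb rule: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def mattis(s, xi):
    """Mattis magnetization: m_mu = (1/N) sum_i xi_i^mu s_i,
    the overlap of configuration s with each of the P patterns."""
    return xi @ s / xi.shape[1]

# start from pattern 0 with 10% of the spins flipped
s = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1

# zero-temperature asynchronous dynamics: align each spin with its local field
for _ in range(5):
    for i in rng.permutation(N):
        s[i] = 1 if J[i] @ s >= 0 else -1

print(mattis(s, xi))  # m_0 close to 1: pattern 0 is retrieved
```

Since P/N = 0.015 is far below the storage capacity and T = 0 here, the dynamics flows back into the basin of attraction of pattern 0, which is exactly what a Mattis magnetization component near 1 signals.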