2018
DOI: 10.1101/475319
Preprint

Weight statistics controls dynamics in recurrent neural networks

Abstract: Recurrent neural networks are complex non-linear systems, capable of ongoing activity in the absence of driving inputs. The dynamical properties of these systems, in particular their long-time attractor states, are determined on the microscopic level by the connection strengths w_ij between the individual neurons. However, little is known about the extent to which network dynamics is tunable on a more coarse-grained level by the statistical features of the weight matrix. In this work, we investigate the dynamical impact…


Cited by 9 publications (13 citation statements)
References 25 publications
“…for creating so-called embeddings of the raw data [122]. Moreover, as proposed by Kriegeskorte [123], our neural corpus can serve to test [124] computational models of brain function [125][126][127][128], in particular models based on neural networks [129][130][131] and machine learning architectures [132,133], in order to iteratively increase biological and cognitive fidelity [123].…”
Section: Discussion (mentioning)
confidence: 99%
“…In this study, we investigated RNNs with random weight matrices, their ability to import and process information, and how both abilities depend on the density of non-zero weights and on the balance of excitatory and inhibitory connections, as introduced in previous studies [20,21].…”
Section: Discussion (mentioning)
confidence: 99%
“…It is therefore not very surprising that biological neural networks are also highly recurrent in their connectivity [16,17], making RNNs versatile tools of neuroscience research [18,19]. Modelling natural RNNs in a realistic way requires the use of probabilistic, spiking neurons, but even simpler models with deterministic neurons already have highly complex dynamical properties and offer fascinating insights into how structure controls function in non-linear systems [20,21]. For example, we have demonstrated that by adjusting the density d of non-zero connections and the balance b between excitatory and inhibitory connections in the RNN's weight matrix, it is possible to control whether the system will predominantly end up in a periodic, chaotic, or fixed point attractor [21].…”
Section: Introduction (mentioning)
confidence: 99%
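The control mechanism described in this citation can be illustrated with a toy simulation. The sketch below is an assumption, not the cited model: it uses a discrete-time deterministic tanh RNN (the cited work's exact update rule and neuron model may differ), draws a random weight matrix with connection density d and excitatory/inhibitory balance b, and classifies the long-time behavior by the period of the recurring state (period 1 = fixed point, finite period = periodic, no repetition = candidate chaotic trajectory).

```python
import numpy as np

def random_weight_matrix(n, density, balance, rng):
    """Sparse random weights: each connection exists with probability
    `density`; a non-zero entry is +1 (excitatory) with probability
    `balance`, else -1 (inhibitory)."""
    mask = rng.random((n, n)) < density
    signs = np.where(rng.random((n, n)) < balance, 1.0, -1.0)
    return mask * signs

def classify_attractor(w, steps=2000, transient=1000, rng=None):
    """Iterate x -> tanh(W x) and report the period of the state
    sequence after a transient; None means no repetition was found
    within `steps` (a candidate chaotic trajectory)."""
    n = w.shape[0]
    x = rng.uniform(-1, 1, n)
    for _ in range(transient):
        x = np.tanh(w @ x)
    seen = {}
    for t in range(steps):
        key = tuple(np.round(x, 6))  # quantize states for comparison
        if key in seen:
            return t - seen[key]     # period of the recurring state
        seen[key] = t
        x = np.tanh(w @ x)
    return None

rng = np.random.default_rng(0)
n = 50
# Sweep the density d at fixed balance b and record the attractor type.
for d in (0.05, 0.5):
    w = random_weight_matrix(n, density=d, balance=0.5, rng=rng)
    print(d, classify_attractor(w, rng=rng))
```

Varying `density` and `balance` in this sketch shifts the observed period, qualitatively mirroring the coarse-grained control over attractor type that the citation attributes to d and b.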
“…In information processing, networks take on different functional tasks [37,38]. Thus, a better understanding of their structure and connectivity should shed more light on the dynamics of the phenomena occurring on them [39]. It is well-known that large recurrent networks can be decomposed into smaller building blocks, the so-called motifs [40], whereby three-neuron motifs (3NMs) are the most basic motifs, which frequently appear in neural circuits and can be seen as basic computational units [41], each uniquely contributing to a large-scale neural behavior [42,43].…”
Section: Introduction (mentioning)
confidence: 99%
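The triad decomposition this citation refers to can be reproduced by brute-force enumeration: counting the weakly connected directed graphs on three nodes (no self-loops) up to node relabeling recovers the 13 classical three-neuron motifs of the motif literature. A minimal, self-contained sketch:

```python
from itertools import product, permutations

def canonical(adj):
    """Canonical form of a 3-node directed graph under node relabeling:
    the lexicographically smallest sorted edge list over all permutations."""
    best = None
    for p in permutations(range(3)):
        key = tuple(sorted((p[i], p[j]) for i, j in adj))
        if best is None or key < best:
            best = key
    return best

def is_connected(adj):
    """Weak connectivity: every node reachable when edge direction is ignored."""
    und = {i: set() for i in range(3)}
    for i, j in adj:
        und[i].add(j)
        und[j].add(i)
    seen, stack = set(), [0]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(und[v])
    return len(seen) == 3

edges = [(i, j) for i in range(3) for j in range(3) if i != j]  # 6 possible arcs
motifs = set()
for bits in product([0, 1], repeat=6):      # all 64 labeled digraphs
    adj = {e for e, b in zip(edges, bits) if b}
    if adj and is_connected(adj):
        motifs.add(canonical(adj))
print(len(motifs))  # 13 connected three-node motifs
```

Enumerating all 64 labeled digraphs and collapsing isomorphic ones yields exactly the 13 connected triad classes that motif analyses of neural circuits count occurrences of.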