2019
DOI: 10.1371/journal.pone.0214541

Weight statistics controls dynamics in recurrent neural networks

Abstract: Recurrent neural networks are complex non-linear systems, capable of ongoing activity in the absence of driving inputs. The dynamical properties of these systems, in particular their long-time attractor states, are determined on the microscopic level by the connection strengths w_ij between the individual neurons. However, little is known to which extent network dynamics is tunable on a more coarse-grained level by the statistical features of the w…



Cited by 27 publications (24 citation statements)
References 37 publications
“…Consistently with these observations, there is a positive correlation (r = 0.11, p = 5.7 · 10⁻¹⁰) with the sum ∑_{i,j} w_{i,j} of all nine weights in a motif. This quantity is similar to a previously defined network weight statistics parameter called “balance” which has been identified to have the largest impact on the dynamical behavior of recurrent neural networks (Krauss et al, 2019).…”
Section: Results (supporting)
confidence: 58%
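The two quantities this citation compares — the sum of all weights in a motif and a coarse-grained "balance" statistic of the weight matrix — can both be computed directly from the matrix. A minimal NumPy sketch, assuming balance is taken as the mean signed weight (the exact definition in Krauss et al., 2019 may differ):

```python
import numpy as np

def weight_balance(W):
    """Coarse-grained balance statistic: mean signed weight.
    Positive values mean excitation dominates, negative values
    inhibition (illustrative definition only)."""
    return float(W.mean())

def motif_weight_sum(W, nodes):
    """Sum of all directed weights within a motif; for a 3-neuron
    motif this covers the nine entries of the 3x3 submatrix,
    self-connections included."""
    sub = W[np.ix_(nodes, nodes)]
    return float(sub.sum())

rng = np.random.default_rng(0)
W = rng.normal(0.0, 1.0, size=(100, 100))
print(weight_balance(W))             # near 0 for a zero-mean matrix
print(motif_weight_sum(W, [3, 17, 42]))
```

The motif-level sum and the network-level balance then differ only in scope: one averages over a 3-neuron submatrix, the other over the full matrix.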
“…Recurrent neural networks (RNN) with apparently random connections occur ubiquitously in the brain (Middleton and Strick, 2000; Song et al, 2005). They can be viewed as complex non-linear systems, capable of ongoing activity even in the absence of driving inputs, and they show rich dynamics, including oscillatory, chaotic, and stationary fixed point behavior (Krauss et al, 2019). Recently, RNNs have gained popularity in bio-inspired approaches to neural information processing, such as reservoir computing (Schrauwen et al, 2007; Verstraeten et al, 2007; Lukoševičius and Jaeger, 2009).…”
Section: Introduction (mentioning)
confidence: 99%
“…It is therefore not very surprising that biological neural networks are also highly recurrent in their connectivity (Binzegger et al, 2004; Squire et al, 2012), so that RNN models play an important role in neuroscience research as well (Barak, 2017; Maheswaranathan et al, 2019). Modeling natural RNNs in a realistic way requires the use of probabilistic, spiking neurons, but even simpler models with deterministic neurons already have highly complex dynamical properties and offer fascinating insights into how structure controls function in non-linear systems (Krauss et al, 2019b, c). For example, we have demonstrated that by adjusting the density d of non-zero connections and the balance b between excitatory and inhibitory connections in the RNN's weight matrix, it is possible to control whether the system will predominantly end up in a periodic, chaotic, or fixed point attractor (Krauss et al, 2019b).…”
Section: Introduction (mentioning)
confidence: 99%
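The claim quoted above — that density d and balance b of the weight matrix steer a deterministic RNN toward periodic, chaotic, or fixed point attractors — can be probed with a small tanh network. A sketch under a hypothetical parametrization (each connection present with probability d; a present weight is +1 with probability (1+b)/2, else −1), which need not match the cited papers' exact setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_weights(n, density, balance):
    """Random weight matrix: each connection exists with probability
    `density`; a non-zero weight is +1 with probability (1+balance)/2
    and -1 otherwise (hypothetical parametrization of d and b)."""
    mask = rng.random((n, n)) < density
    signs = np.where(rng.random((n, n)) < (1.0 + balance) / 2.0, 1.0, -1.0)
    return mask * signs

def simulate(W, steps=200):
    """Iterate the deterministic map x -> tanh(W x) from a random state."""
    x = rng.uniform(-1.0, 1.0, W.shape[0])
    traj = [x]
    for _ in range(steps):
        x = np.tanh(W @ x)
        traj.append(x)
    return np.asarray(traj)

def reached_fixed_point(traj, tol=1e-6):
    """Crude attractor probe: did the state stop changing?"""
    return bool(np.linalg.norm(traj[-1] - traj[-2]) < tol)

traj = simulate(make_weights(50, 0.1, 0.0))
print(reached_fixed_point(traj))
```

Sweeping `density` and `balance` over a grid and recording which probe fires for each cell reproduces the kind of phase diagram the citation refers to.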
“…Additionally, the results of this study have the potential to provide novel insights into the function of biological neural networks (Schemmel, Grubl, Meier, & Mueller, 2006; Jin et al, 2010). Thus, we think that the analysis techniques developed for untrained or randomly connected neural networks, such as stability analysis (Lyapunov exponent) or motif distribution analysis (see Bertschinger & Natschläger, 2004; Krauss, Prebeck, Schilling, & Metzner, 2019; Krauss, Schuster et al, 2019), can be applied to spiking neural networks trained with backpropagation to gain new insights into brain dynamics and function.…”
Section: Discussion (mentioning)
confidence: 99%
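The stability analysis this citation mentions — estimating the largest Lyapunov exponent — has a standard generic recipe: run a reference and a slightly perturbed trajectory side by side, renormalize their separation every step, and average the log expansion rates (a Benettin-style sketch for a tanh RNN, not the cited papers' exact pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)

def largest_lyapunov(W, steps=300, d0=1e-8):
    """Benettin-style estimate of the largest Lyapunov exponent of the
    map x -> tanh(W x): track a reference and a perturbed trajectory,
    rescale their separation back to d0 after every step, and average
    the per-step log expansion factors."""
    n = W.shape[0]
    f = lambda v: np.tanh(W @ v)
    x = rng.uniform(-1.0, 1.0, n)
    u = rng.normal(size=n)
    y = x + d0 * u / np.linalg.norm(u)   # perturbed copy at distance d0
    log_sum = 0.0
    for _ in range(steps):
        x, y = f(x), f(y)
        d = np.linalg.norm(y - x)
        log_sum += np.log(d / d0)
        y = x + d0 * (y - x) / d         # renormalize the separation
    return log_sum / steps

n = 100
W_weak = rng.normal(0.0, 0.5 / np.sqrt(n), (n, n))    # contracting regime
W_strong = rng.normal(0.0, 1.5 / np.sqrt(n), (n, n))  # strong coupling
print(largest_lyapunov(W_weak), largest_lyapunov(W_strong))
```

A negative estimate indicates convergence toward a fixed point, a positive one indicates chaotic divergence of nearby states — exactly the distinction used to classify attractor regimes above.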