2022
DOI: 10.3389/fncom.2022.876315
Dynamics and Information Import in Recurrent Neural Networks

Abstract: Recurrent neural networks (RNNs) are complex dynamical systems, capable of ongoing activity without any driving input. The long-term behavior of free-running RNNs, described by periodic, chaotic and fixed point attractors, is controlled by the statistics of the neural connection weights, such as the density d of non-zero connections, or the balance b between excitatory and inhibitory connections. However, for information processing purposes, RNNs need to receive external input signals, and it is not clear whic…
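The abstract describes free-running RNNs whose attractor dynamics are shaped by the connection density d and the excitatory/inhibitory balance b. A minimal illustrative sketch of such a network is given below; the tanh rate units, the function names, and the weight-drawing scheme are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def random_rnn_weights(n, d, b, seed=None):
    """Random recurrent weight matrix with connection density d
    (fraction of non-zero weights) and balance b (fraction of
    excitatory, i.e. positive, connections). Illustrative only."""
    rng = np.random.default_rng(seed)
    mask = rng.random((n, n)) < d                         # keep a fraction d of connections
    signs = np.where(rng.random((n, n)) < b, 1.0, -1.0)   # b excitatory, 1-b inhibitory
    return mask * signs * rng.random((n, n))

def run_free(W, steps=200, seed=0):
    """Iterate a free-running tanh rate network with no external input."""
    rng = np.random.default_rng(seed)
    x = np.tanh(rng.standard_normal(W.shape[0]))
    traj = [x]
    for _ in range(steps):
        x = np.tanh(W @ x)
        traj.append(x)
    return np.array(traj)

W = random_rnn_weights(100, d=0.1, b=0.5, seed=42)
traj = run_free(W)
print(traj.shape)  # (201, 100)
```

Varying d and b in such a sketch moves the long-term trajectory between fixed-point, periodic, and chaotic regimes, which is the kind of control the abstract refers to.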


Cited by 12 publications (23 citation statements) · References 65 publications
“…Our study provides evidence that an interplay of deep learning and neuroscience helps on the one hand to raise understanding of the function of biological neural networks and cognition in general (e.g., Schilling et al, 2018 , 2021b ; Krauss et al, 2019a , c , d , 2021 ; Gerum et al, 2020 ; Krauss and Maier, 2020 ; Bönsel et al, 2021 ; Metzner and Krauss, 2022 ), an emerging science strand referred to as cognitive computational neuroscience ( Kriegeskorte and Douglas, 2018 ). On the other hand, fundamental processing principles from nature—such as stochastic resonance—can be transferred to improve artificial neural systems, which is called neuroscience-inspired AI ( Hassabis et al, 2017 ; Gerum et al, 2020 ; Gerum and Schilling, 2021 ; Yang et al, 2021 ; Maier et al, 2022 ).…”
Section: Discussion
confidence: 89%
“…Unfortunately, numerically estimating the mutual information from samples of two distributions is only approximate, computationally expensive and becomes practical only when one makes additional assumptions about the distributions. In practice one will often take resort to correlational measures which sometimes provide qualitatively similar insights as mutual information (Metzner & Krauss, 2021).…”
Section: Timescales of Memory
confidence: 99%
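The quoted passage notes that sample-based mutual information estimates are expensive and approximate, while correlational measures are a cheap proxy. The sketch below illustrates the connection: for jointly Gaussian variables the mutual information has a closed form in terms of the Pearson correlation. The signals and coupling strength are illustrative assumptions.

```python
import numpy as np

# Two coupled signals: y is a noisy copy of x
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = 0.8 * x + 0.2 * rng.standard_normal(5000)

# Pearson correlation: cheap, computed directly from samples
r = np.corrcoef(x, y)[0, 1]

# For jointly Gaussian variables, mutual information reduces to
# I = -1/2 * log(1 - r^2), which is why correlation can serve as
# a qualitative stand-in for mutual information
mi_gauss = -0.5 * np.log(1 - r**2)
```

For non-Gaussian or nonlinearly coupled variables this identity no longer holds, which is exactly the caveat the quoted passage raises: correlation is only "sometimes" qualitatively similar to mutual information.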
“…By color-coding each projected data point of a data set according to its label, the representation of the data can be visualized as a set of point clusters. For instance, MDS has already been applied to visualize word class distributions of different linguistic corpora 23 , hidden layer representations (embeddings) of artificial neural networks 6,15,24 , structure and dynamics of recurrent neural networks 25–28 , or brain activity patterns assessed during e.g. pure tone or speech perception 14,23,29 , or even during sleep 5,30 .…”
Section: Control of Class Separation by Quantity S
confidence: 99%
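The quoted passage describes using multidimensional scaling (MDS) to project labeled high-dimensional representations into point clusters. A self-contained sketch of classical (Torgerson) MDS is shown below on toy "hidden layer activations"; the data, dimensions, and class shift are illustrative assumptions, not taken from any of the cited studies.

```python
import numpy as np

def classical_mds(X, dim=2):
    """Classical (Torgerson) MDS: embed points in `dim` coordinates
    that preserve pairwise Euclidean distances as well as possible."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distance matrix
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n                  # centering matrix
    B = -0.5 * J @ sq @ J                                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]                   # top `dim` eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Toy activations for two classes, separated by a mean shift
rng = np.random.default_rng(1)
a = rng.standard_normal((50, 20))
b = rng.standard_normal((50, 20)) + 3.0
proj = classical_mds(np.vstack([a, b]))
print(proj.shape)  # (100, 2)
```

Color-coding `proj` by class label, as the quote describes, would show the two groups as separated point clusters in the 2D projection.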