2022
DOI: 10.1101/2022.11.29.518360
Preprint

The functional role of oscillatory dynamics in neocortical circuits: a computational perspective

Abstract: Biological neuronal networks have the propensity to oscillate. However, it is unclear whether these oscillations are a mere byproduct of neuronal interactions or serve computational purposes. Therefore, we implemented hallmark features of the cerebral cortex in recurrent neuronal networks (RNNs) simulated in silico and examined their performance on common pattern recognition tasks after training with a gradient-based learning rule. We find that by configuring network nodes as damped harmonic oscillators (DHOs)…
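The abstract's central idea, recurrent network nodes modeled as damped harmonic oscillators driven by external and recurrent input, can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: the class name, parameter names (omega, gamma, W_in, W_rec), and the forward-Euler integration scheme are assumptions made for clarity.

```python
# Minimal sketch of an RNN layer whose units follow damped-harmonic-oscillator
# (DHO) dynamics. Illustrative only; not code from the preprint.
import numpy as np

class DHOLayer:
    def __init__(self, n_in, n_units, dt=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.dt = dt
        self.omega = rng.uniform(1.0, 10.0, n_units)   # natural frequencies (assumed range)
        self.gamma = rng.uniform(0.1, 1.0, n_units)    # damping coefficients (assumed range)
        self.W_in = rng.normal(0, 1 / np.sqrt(n_in), (n_units, n_in))
        self.W_rec = rng.normal(0, 1 / np.sqrt(n_units), (n_units, n_units))

    def run(self, inputs):
        """inputs: array of shape (T, n_in); returns unit positions over time."""
        n_units = self.omega.size
        x = np.zeros(n_units)   # position of each oscillator
        v = np.zeros(n_units)   # velocity of each oscillator
        states = []
        for u in inputs:
            # drive = external input plus recurrent coupling through a nonlinearity
            drive = self.W_in @ u + self.W_rec @ np.tanh(x)
            # damped harmonic oscillator: x'' + 2*gamma*x' + omega^2 * x = drive
            a = drive - 2 * self.gamma * v - self.omega**2 * x
            v = v + self.dt * a
            x = x + self.dt * v
            states.append(x.copy())
        return np.array(states)

# usage: feed a 784-step scalar sequence (e.g. a flattened MNIST image)
layer = DHOLayer(n_in=1, n_units=32)
out = layer.run(np.random.rand(784, 1))
print(out.shape)  # (784, 32)
```

In this sketch each unit carries a position and a velocity, so oscillation and damping arise from the node dynamics themselves rather than from the recurrent weights; in the paper these dynamics are combined with gradient-based training, which is omitted here.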

Cited by 8 publications (6 citation statements) | References 111 publications
“…Oscillatory activity in the theta, alpha, and gamma bands has been repeatedly shown to support learning and memory [for review see [33, 45]]. In line with this, Effenberger, Carvalho, Dubinin, and Singer [31] have shown that an RNN whose nodes reflect damped oscillators outperforms non-oscillatory RNNs when classifying the sequential MNIST data set. For future extensions of the presented algorithm, it would be interesting to explore if and how biologically plausible dynamics could be used to support the training process.…”
Section: Discussion
confidence: 84%
“…For instance, in the form of neural ODEs [20], liquid time-constant neural networks [47,48], and RNNs consisting of damped oscillators [31]. These networks have had great success in learning long-range dependencies in time series data [20,47,48], sequences of images [80], and image classification when the pixels of the input are transformed into a time series (sequential MNIST) [31,47]. The goal of our work, in comparison, was to convert spatially presented inputs into a time series.…”
Section: Discussion
confidence: 99%
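The sequential MNIST benchmark referred to in the statement above presents an image to the network one pixel at a time. A minimal sketch of that transformation, assuming row-major flattening and scaling to the unit interval (the exact preprocessing varies between the cited works):

```python
# Illustrative sketch of the sequential MNIST transformation: each 28x28 image
# becomes a 784-step scalar time series fed to the network one pixel per step.
# Function name and scaling are hypothetical, not taken from the cited papers.
import numpy as np

def to_pixel_sequence(images):
    """images: array of shape (N, 28, 28) -> sequences of shape (N, 784, 1)."""
    n = images.shape[0]
    return images.reshape(n, 28 * 28, 1).astype(np.float32) / 255.0

batch = np.random.randint(0, 256, size=(4, 28, 28))  # stand-in for MNIST digits
seqs = to_pixel_sequence(batch)
print(seqs.shape)  # (4, 784, 1)
```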
“…Recurrently coupled networks process signals in a highly parallel manner and represent computational results in complex dynamical landscapes to which all nodes of the network contribute continuously. Any local signal spreads immediately over the whole network (Effenberger et al. 2022).…”
Section: Discussion
confidence: 99%