2016
DOI: 10.1371/journal.pcbi.1004992
The Complexity of Dynamics in Small Neural Circuits

Abstract: Mean-field approximations are a powerful tool for studying large neural networks. However, they do not describe well the behavior of networks composed of a small number of neurons. In this case, major differences between the mean-field approximation and the real behavior of the network can arise. Yet, many interesting problems in neuroscience involve the study of mesoscopic networks composed of a few tens of neurons. Nonetheless, mathematical methods that correctly describe networks of small size are still rare…
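As a rough illustration of the abstract's central point, the following Python sketch compares the time-averaged activity of a small stochastic binary network with the prediction of a naive mean-field fixed point. The dynamics (Glauber-like asynchronous updates) and all parameter values (J0, theta, beta) are illustrative assumptions, not the model analyzed in the paper.

```python
import numpy as np

# Minimal sketch, not the authors' model: a homogeneous network of N binary
# neurons with stochastic (Glauber-like) asynchronous updates, compared against
# the naive mean-field self-consistency equation. J0, theta, beta are
# illustrative parameters.

rng = np.random.default_rng(0)

def simulated_mean_activity(N, J0, theta, beta, steps=100_000):
    """Time-averaged population activity of a fully connected binary network."""
    J = (J0 / N) * (np.ones((N, N)) - np.eye(N))   # uniform coupling, no self-connections
    s = rng.integers(0, 2, size=N).astype(float)
    total = 0.0
    for _ in range(steps):
        i = rng.integers(N)                         # pick one neuron to update
        h = J[i] @ s - theta                        # local field minus threshold
        p_on = 1.0 / (1.0 + np.exp(-beta * h))      # sigmoidal firing probability
        s[i] = float(rng.random() < p_on)
        total += s.mean()
    return total / steps

def mean_field_activity(J0, theta, beta, iters=500):
    """Fixed point of the naive mean-field map m -> sigma(beta * (J0 * m - theta))."""
    m = 0.5
    for _ in range(iters):
        m = 1.0 / (1.0 + np.exp(-beta * (J0 * m - theta)))
    return m

if __name__ == "__main__":
    J0, theta, beta = 1.0, 0.4, 4.0
    mf = mean_field_activity(J0, theta, beta)
    for N in (5, 20, 200):
        sim = simulated_mean_activity(N, J0, theta, beta)
        print(f"N={N:3d}  simulated={sim:.3f}  mean-field={mf:.3f}")
```

For small N the simulated average deviates from the mean-field value because fluctuations and correlations are no longer negligible, which is the regime the paper targets.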

Cited by 21 publications (69 citation statements)
References 104 publications (211 reference statements)
“…Note that N_E and N_I can be arbitrary, but for illustrative purposes in this section we consider the case N_E = N_I (rather than the N_E/N_I = 4 ratio experimentally observed in biological systems [25]), because we found that the network complexity increases with the size of the inhibitory population. Interestingly, the same phenomenon was found to occur also in multi-population networks with graded activation function [9]. Moreover, without further loss of generality, we index the neurons of the excitatory population as i = 0, …”
Section: Examples Of Network Topologies (supporting)
confidence: 56%
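The population structure and indexing convention mentioned in the snippet above can be made concrete with a small, hypothetical construction of an excitatory-inhibitory weight matrix. The specific weight values and the fully connected topology are assumptions for illustration only, not the connectivity used in the cited works.

```python
import numpy as np

# Illustrative sketch (weight values are hypothetical): a fully connected
# excitatory-inhibitory topology with excitatory neurons at indices 0..N_E-1
# and inhibitory neurons at indices N_E..N-1, respecting Dale's principle
# (presynaptic excitatory columns non-negative, inhibitory columns non-positive).

def build_ei_weights(N_E, N_I, J_EE=1.0, J_IE=1.0, J_EI=-2.0, J_II=-0.5):
    """Return an (N_E+N_I) x (N_E+N_I) synaptic matrix W, with W[i, j] the
    weight from presynaptic neuron j to postsynaptic neuron i."""
    N = N_E + N_I
    W = np.zeros((N, N))
    W[:N_E, :N_E] = J_EE          # E -> E
    W[N_E:, :N_E] = J_IE          # E -> I
    W[:N_E, N_E:] = J_EI          # I -> E
    W[N_E:, N_E:] = J_II          # I -> I
    np.fill_diagonal(W, 0.0)      # no autapses
    return W

W = build_ei_weights(N_E=3, N_I=3)   # the N_E = N_I case discussed above
print(W)
```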
“…(3) we provided examples that show how the network dynamics depends on the external stimuli, the network size and the topology of the synaptic connections. Similarly to the case of graded firing-rate network models [3-5, 9, 15, 27], this analysis revealed a complex bifurcation structure, encompassing several changes in the degree of multistability of the network, oscillations with stimulus-dependent frequency, and various forms of spontaneous symmetry-breaking of the neural activity.…”
Section: Discussion (mentioning)
confidence: 82%
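As a hedged illustration of how the degree of multistability can change with the external stimulus, one can brute-force the stationary states of a tiny deterministic binary network while sweeping a common input. The weight matrix and stimulus range below are arbitrary placeholders and are not taken from the cited analysis.

```python
import numpy as np
from itertools import product

# Brute-force sketch (not the bifurcation analysis of the cited works):
# count the stationary states of the synchronous threshold dynamics
# s(t+1) = H(W s(t) + I) of a tiny network as the external stimulus varies,
# as a crude proxy for stimulus-dependent changes in multistability.

def fixed_points(W, I):
    """All states s in {0,1}^N with s = H(W s + I), H = step function."""
    N = W.shape[0]
    fps = []
    for s in product((0.0, 1.0), repeat=N):
        s = np.array(s)
        s_next = ((W @ s + I) > 0).astype(float)
        if np.array_equal(s, s_next):
            fps.append(s)
    return fps

N_E, N_I = 2, 2
W = np.block([[ 1.0 * np.ones((N_E, N_E)), -2.0 * np.ones((N_E, N_I))],
              [ 1.0 * np.ones((N_I, N_E)), -0.5 * np.ones((N_I, N_I))]])
np.fill_diagonal(W, 0.0)                     # no self-connections

for stim in np.linspace(-1.0, 2.0, 7):
    I = stim * np.ones(N_E + N_I)            # same external input to every neuron
    print(f"stimulus {stim:+.2f}: {len(fixed_points(W, I))} stationary state(s)")
```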
“…For this reason, it is important to be able to study mathematically the dynamics of binary neural network models (or of any network model, see e.g. [13,28,63]) for a wide range of different network sizes. Large-scale networks composed of several thousands of neurons or more are typically studied by taking advantage of the powerful techniques of statistical mechanics, such as the law of large numbers and the central limit theorem, see e.g. [21,29,33,74].…”
mentioning
confidence: 99%
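The contrast drawn in this snippet, exact treatment of small networks versus statistical-mechanics limits for large ones, can be illustrated by building the full 2^N-state transition map of a small deterministic binary network and extracting its attractors. This exhaustive enumeration is only a sketch under assumed dynamics, and it quickly becomes infeasible as N grows, which is where mean-field methods take over. The connectivity used below is a random placeholder.

```python
import numpy as np

# Hedged illustration: with N binary neurons there are only 2**N states, so the
# synchronous deterministic dynamics s(t+1) = H(W s(t) + I) can be enumerated
# exactly and every attractor (fixed point or limit cycle) extracted.

def next_state_index(idx, W, I):
    """One synchronous update, with states encoded as integers (bit k = neuron k)."""
    N = W.shape[0]
    s = np.array([(idx >> k) & 1 for k in range(N)], dtype=float)
    s_next = ((W @ s + I) > 0).astype(int)
    return int(sum(int(b) << k for k, b in enumerate(s_next)))

def attractors(W, I):
    """All attractors (fixed points and limit cycles) of the deterministic dynamics."""
    N = W.shape[0]
    found = set()
    for start in range(2 ** N):
        seen = {}
        idx, t = start, 0
        while idx not in seen:                 # follow the trajectory until a state repeats
            seen[idx] = t
            idx = next_state_index(idx, W, I)
            t += 1
        cycle_len = t - seen[idx]
        cycle = []
        for _ in range(cycle_len):             # collect the states of the attractor
            cycle.append(idx)
            idx = next_state_index(idx, W, I)
        found.add(tuple(sorted(cycle)))        # canonical representation of the cycle
    return found

if __name__ == "__main__":
    N = 4
    rng = np.random.default_rng(1)
    W = rng.normal(0.0, 1.0, size=(N, N))      # random placeholder connectivity
    np.fill_diagonal(W, 0.0)
    I = 0.2 * np.ones(N)
    for a in sorted(attractors(W, I)):
        kind = "fixed point" if len(a) == 1 else f"limit cycle of length {len(a)}"
        print(kind, a)
```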