Training stochastic stabilized supralinear networks by dynamics-neutral growth
Preprint, 2022
DOI: 10.1101/2022.10.19.512820

Abstract: There continues to be a trade-off between the biological realism and performance of neural networks. Contemporary deep learning techniques allow neural networks to be trained to perform challenging computations at (near) human-level, but these networks typically violate key biological constraints. More detailed models of biological neural networks can incorporate many of these constraints but typically suffer from subpar performance and trainability. Here, we narrow this gap by developing an effective method f…

Cited by 3 publications (6 citation statements)
References 34 publications
“…= J*(w − w*), [21] where w* is zero, by definition, and J* is the Jacobian evaluated at the fixed point. The Jacobian is defined as…”
Section: Stability Analysis
Citation type: mentioning; confidence: 99%
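For context, a standard reconstruction of the linearization quoted above (the left-hand side is elided in the excerpt, so the form below is the usual one for a fixed-point analysis, not necessarily the paper's exact notation):

\[
\frac{d\,\delta w}{dt} = J^{*}\,\delta w,
\qquad
\delta w = w - w^{*},
\qquad
J^{*}_{ij} = \left.\frac{\partial \dot{w}_{i}}{\partial w_{j}}\right|_{w = w^{*}},
\]

so the fixed point w* is locally stable exactly when every eigenvalue of J* has negative real part.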
“…Theoretical work has highlighted the experimentally observed balance of stimulus-selective excitatory and inhibitory input currents as a critical requirement for many neural computations (11–16). For example, recent models based on balanced E-I networks can explain a wide range of cortical phenomena, such as cross-orientation and surround suppression (17, 18), as well as stimulus-induced neural variability (19–21). A major caveat of these models is that the network connectivity is usually static and designed by hand, albeit based on experimental measurements.…”
Citation type: mentioning; confidence: 99%
“…Other common choices include softmax, softplus, or linear, depending on the decoding objective. These models reflect the diversity of function across the brain areas they model, such as perceptual functions in early sensory areas like the visual cortex [9], maintenance of items in working memory, decision-making, and context- or rule-dependent responses in higher-order cortical areas such as the prefrontal cortex [14, 16, 19], and motor control of eye saccades or arm movements in motor cortical areas [10–13]. Consequently, the task structure, requisite computations, and RNN inputs and outputs vary widely according to the function.…”
Section: Biologically Plausible Recurrent Neural Network
Citation type: mentioning; confidence: 99%
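As a minimal illustration of the decoding choices that excerpt lists, the sketch below implements a readout layer whose output nonlinearity is selected per task. The function and argument names are hypothetical, written for this note only, and do not come from the paper or any cited work.

import numpy as np

def readout(rates, W_out, b_out, nonlinearity="linear"):
    """Map RNN firing rates to task outputs z = f(W_out @ rates + b_out)."""
    a = W_out @ rates + b_out
    if nonlinearity == "linear":    # continuous targets, e.g. arm kinematics
        return a
    if nonlinearity == "softplus":  # non-negative, rate-like outputs
        return np.log1p(np.exp(a))
    if nonlinearity == "softmax":   # categorical decisions, outputs sum to 1
        e = np.exp(a - a.max())
        return e / e.sum()
    raise ValueError(f"unknown nonlinearity: {nonlinearity}")

For example, a two-alternative decision task would use nonlinearity="softmax" with a two-row W_out, whereas a reaching task would read out continuous joint velocities with the linear option.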
“…First, the brain's connectivity is highly recurrent at multiple spatial scales, making recurrent neural networks (RNNs) [5] the modeling tool of choice in neuroscience [2, 4, 6]. The training and analysis of RNNs have advanced theory on the neural basis of perception [7–9], motor behavior [10–13], cognition [14–27], and memory and navigation [28, 29]. Second, the physiological properties of neurons place a greater demand on the emergent computational properties of neural populations.…”
Section: Introduction
Citation type: mentioning; confidence: 99%