2019
DOI: 10.1371/journal.pone.0220547

Training dynamically balanced excitatory-inhibitory networks

Abstract: The construction of biologically plausible models of neural circuits is crucial for understanding the computational properties of the nervous system. Constructing functional networks composed of separate excitatory and inhibitory neurons obeying Dale's law presents a number of challenges. We show how a target-based approach, when combined with a fast online constrained optimization technique, is capable of building functional models of rate and spiking recurrent neural networks in which excitation and inhibiti…


Cited by 47 publications (72 citation statements)
References 37 publications
“…Instead, some of them were relying on control theory to train a chaotic reservoir of spiking neurons [32][33][34] . Others used the FORCE algorithm 35,36 or variants of it 35,[37][38][39] . However, the FORCE algorithm was not argued to be biologically realistic, as the plasticity rule for each synaptic weight requires knowledge of the current values of all other synaptic weights.…”
Section: Discussion (mentioning)
confidence: 99%
“…Despite some fundamental limitations common to most computational models of brain activity, our approach was designed with several key features of living neuronal networks, including spiking neurons, Dale's principle, balanced excitation/inhibition, a heterogeneity of neuronal and synaptic parameters, propagation delays, and conductance-based synapses (van Vreeswijk and Sompolinsky, 1996 ; Sussillo and Abbott, 2009 ; Ingrosso and Abbott, 2019 ). We used a learning algorithm that isn't biologically plausible to train the readout unit (recursive least-square).…”
Section: Discussion (mentioning)
confidence: 99%
“…We used the recursive least square algorithm (Haykin, 2002) to train the readout units to produce the target functions. The W out weight matrix was updated based on the following equations:…”
Section: Learning Algorithm for the Readout Unit (mentioning)
confidence: 99%
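The recursive least-squares (RLS) update quoted above, which is also the core of FORCE-style training, can be sketched as follows. This is a minimal illustration with synthetic rates and a synthetic linear target, not the cited paper's implementation; the variable names (`w_out`, `P`, `target_w`) and parameter values are assumptions. Note how every weight's update uses the shared gain vector `k = P r / (1 + rᵀ P r)`, which is why the rule is considered non-local.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of recurrent units feeding the readout
T = 200        # number of training steps
alpha = 0.01   # RLS regularizer; P starts as I / alpha

w_out = np.zeros(N)       # readout weights (W_out for a scalar output)
P = np.eye(N) / alpha     # running estimate of the inverse rate-correlation matrix

# Synthetic target: a fixed linear readout of the rates.
target_w = rng.standard_normal(N) / np.sqrt(N)

for t in range(T):
    r = np.tanh(rng.standard_normal(N))   # surrogate firing-rate vector
    f = target_w @ r                      # target output at this step
    z = w_out @ r                         # current readout output
    e = z - f                             # readout error

    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)               # RLS gain vector (shared across all weights)
    P -= np.outer(k, Pr)                  # rank-1 update: P <- P - P r rT P / (1 + rT P r)
    w_out -= e * k                        # weight update proportional to error times gain
```

After enough samples this converges to the ridge-regularized least-squares readout; the matrix `P` is what couples the updates of all readout weights together.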
“…Despite some fundamental limitations common to most computational models of brain activity, our approach was designed with several key features of living neuronal networks, including spiking neurons, Dale’s principle, balanced excitation/inhibition, a heterogeneity of neuronal and synaptic parameters, propagation delays, and conductance-based synapses (7, 11, 42).…”
Section: Discussion (mentioning)
confidence: 99%