2022
DOI: 10.48550/arxiv.2210.05961
Preprint
The computational and learning benefits of Daleian neural networks

Abstract: Dale's principle implies that biological neural networks are composed of neurons that are either excitatory or inhibitory. While the number of possible architectures of such Daleian networks is exponentially smaller than non-Daleian ones, the computational and functional implications of using Daleian networks by the brain are mostly unknown. Here, we use models of recurrent spiking neural networks and rate-based networks to show, surprisingly, that despite the structural limitations on Daleian networks, they c…
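The sign constraint the abstract describes can be made concrete with a small sketch. This is not code from the paper; it is a minimal, hypothetical illustration of a Daleian weight matrix in a rate-based network, where every outgoing weight of a given neuron shares that neuron's fixed sign (excitatory or inhibitory). The 80/20 excitatory/inhibitory split and the network size are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of Dale's principle as a sign constraint on a weight matrix.
# Convention: W[i, j] is the weight from neuron j onto neuron i, so
# column j holds all outgoing weights of neuron j and must be
# single-signed under Dale's principle.

rng = np.random.default_rng(0)
n = 8                                             # network size (illustrative)
signs = np.where(rng.random(n) < 0.8, 1.0, -1.0)  # ~80% excitatory (assumed split)

raw = rng.normal(size=(n, n))                     # unconstrained weights
W = np.abs(raw) * signs[np.newaxis, :]            # column j carries sign signs[j]

# Each column is now entirely non-negative (excitatory neuron) or
# entirely non-positive (inhibitory neuron).
assert all((W[:, j] * signs[j] >= 0).all() for j in range(n))
```

During learning, the same constraint can be preserved by updating weight magnitudes and re-applying the fixed signs, which is one common way such sign restrictions are handled in practice.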

Cited by 2 publications (2 citation statements)
References 26 publications
“…Moreover, learning such network models would be of key interest, both as a potential way to improve learning in artificial neural networks using biological features, and as a way for biological neural networks to implement efficient learning and overcome the credit assignment problem (33)(34)(35)(36)(37)(38). Both structured architectural features of neural circuits and random connectivity patterns have been suggested to enable or shape the computation carried out by neural circuits (30,(39)(40)(41)(42)(43). In particular, these computations rely on the nature of synaptic connectivity and the coupling between synapses in terms of how they change during learning.…”
Section: Introduction
confidence: 99%
“…In addition, they may be useful for improving learning in artificial neural networks using biological features (33)(34)(35)(36)(37)(38). Both structured architectural features of neural circuits and random connectivity patterns have been suggested to shape the computation carried out by neural circuits (30,(39)(40)(41)(42)(43). These computations rely on the nature of synaptic connectivity and the coupling between synapses in terms of how they change during learning.…”
Section: Introduction
confidence: 99%