2021
DOI: 10.1371/journal.pcbi.1009202

Dendritic normalisation improves learning in sparsely connected artificial neural networks

Abstract: Artificial neural networks, taking inspiration from biological neurons, have become an invaluable tool for machine learning applications. Recent studies have developed techniques to effectively tune the connectivity of sparsely-connected artificial neural networks, which have the potential to be more computationally efficient than their fully-connected counterparts and more closely resemble the architectures of biological systems. We here present a normalisation, based on the biophysical behaviour of neuronal …

Cited by 9 publications (4 citation statements)
References 59 publications (110 reference statements)
“…Jones and Kording 91 used a binary tree to approximate dendrite branching and provided valuable insights into the influence of tree structure on single neurons’ computational capacity. Bird et al. 92 proposed a dendritic normalization rule based on biophysical behavior, offering an interesting perspective on the contribution of dendritic arbor structure to computation.…”
Section: Discussion
confidence: 99%
“…The output spiking rate R can then be approximated by R ≈ IF_syn + c, where IF_syn represents the input fraction of activated synapses (as a percentage of total postsynapses) and c is a constant (27). Such connectivity- and morphology-independent point-like neurons endow networks with the capacity to maintain functional robustness due to normalisation effects (28–30), paralleling methods employed in machine learning (31, 32).…”
Section: Synaptic Density Convergence Across Species
confidence: 99%
“…Because of the additional nonlinearity compared to a model point neuron, better expressibility can be expected (Wu et al., 2018), and electrical compartmentalization and active dendritic properties can be applied to ANNs (Chavlis and Poirazi, 2021; Iyer et al., 2022; Sezener et al., 2022). The segregated electrical properties also indicate that homeostatic control can occur separately in distinct dendritic branches (Tripodi et al., 2008; Bird et al., 2021; Shen et al., 2021). Such an adjustment of weights in each dendritic branch toward a certain homeostatic level is similar to the normalization step in ANN (Shen et al., 2021), which also improves learning in sparsely connected neural networks, such as BNN (Bird et al., 2021). The typical structure of a cortical pyramidal neuron consists of two distinctive directions of dendritic outgrowth from the soma: basal and apical dendrites (DeFelipe and Farias, 1992).…”
Section: Outcome Of Optimization: Single Computational Unit Properties
confidence: 99%