2019
DOI: 10.3389/fninf.2019.00019
Communication Sparsity in Distributed Spiking Neural Network Simulations to Improve Scalability

Abstract: In the last decade there has been a surge in the number of big science projects interested in achieving a comprehensive understanding of the functions of the brain, using Spiking Neuronal Network (SNN) simulations to aid discovery and experimentation. Such an approach increases the computational demands on SNN simulators: if natural scale brain-size simulations are to be realized, it is necessary to use parallel and distributed models of computing. Communication is recognized as the dominant part of distribute…

Cited by 6 publications (1 citation statement) · References 50 publications
“…Highly power efficient operation is a significant advantage, which translates to low computing requirements in constrained devices. Neural activations are known to be sparse in SNNs [17]. Sparseness in spike communication leads to smaller bandwidth requirements, and further energy efficiency by decreasing the time of radio transmissions.…”
Section: Introduction
confidence: 99%
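The citing statement's point about sparseness can be made concrete with a back-of-the-envelope comparison: when only a small fraction of neurons fire in a timestep, sending the indices of spiking neurons (address-event style) costs far fewer bytes than sending a dense per-neuron bitmap. The sketch below is illustrative only; the neuron count, firing rate, and 4-byte index width are assumptions, not figures from the paper.

```python
# Hypothetical sketch: payload size per timestep for dense vs. sparse
# spike encodings. All numbers here are illustrative assumptions.

def dense_payload_bytes(n_neurons: int) -> int:
    """Dense bitmap: one bit per neuron per timestep, rounded up to bytes."""
    return (n_neurons + 7) // 8

def sparse_payload_bytes(n_spikes: int, id_bytes: int = 4) -> int:
    """Sparse (address-event) encoding: one integer id per spiking neuron."""
    return n_spikes * id_bytes

n = 100_000           # neurons whose spikes must be communicated (assumed)
rate = 0.01           # 1% of neurons fire this timestep (assumed)
spikes = int(n * rate)

print(dense_payload_bytes(n))        # 12500 bytes for the dense bitmap
print(sparse_payload_bytes(spikes))  # 4000 bytes for the sparse index list
```

At low firing rates the sparse encoding wins, which is the bandwidth argument the citing work makes; the break-even point depends on the index width and the fraction of neurons active per timestep.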