2022
DOI: 10.48550/arxiv.2203.12437
Preprint

Efficient Hardware Acceleration of Sparsely Active Convolutional Spiking Neural Networks

Abstract: Spiking Neural Networks (SNNs) compute in an event-based manner to achieve a more efficient computation than standard Neural Networks. In SNNs, neuronal outputs (i.e. activations) are not encoded as real-valued activations but as sequences of binary spikes. The motivation for using SNNs over conventional neural networks is rooted in the special computational aspects of spike-based processing, especially the very high degree of sparsity of neural output activations. Well-established architectures for convent…
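The abstract's central claim, that computation can be driven by sparse binary spike events rather than dense activations, is easy to illustrate in code. Below is a minimal Python sketch of an event-driven convolution (illustrative only, with made-up shapes and names; it is not the accelerator described in the paper) in which the work scales with the number of spikes instead of the input size:

```python
import numpy as np

# Minimal sketch of event-driven convolution: instead of convolving a dense
# activation map, each incoming binary spike (an event at position (y, x) of
# input channel c) scatters its kernel contribution into the membrane
# potentials of the affected output neurons. Names and shapes are
# illustrative; this is not the paper's hardware architecture.

def event_driven_conv(spikes, weights, out_shape):
    """spikes: list of (c, y, x) input events; weights: (C_out, C_in, K, K);
    out_shape: (C_out, H_out, W_out). Returns accumulated potentials."""
    _, _, k, _ = weights.shape
    v = np.zeros(out_shape)                       # membrane potentials
    for c, y, x in spikes:                        # visit only active inputs
        for dy in range(k):
            for dx in range(k):
                oy, ox = y - dy, x - dx           # 'valid' correlation
                if 0 <= oy < out_shape[1] and 0 <= ox < out_shape[2]:
                    v[:, oy, ox] += weights[:, c, dy, dx]
    return v

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 2, 3, 3))             # 2 -> 4 channels, 3x3 kernel
events = [(0, 5, 5), (1, 2, 7)]                   # two spikes on an 8x8 input
print(event_driven_conv(events, w, (4, 6, 6)).shape)   # -> (4, 6, 6)
```

Because only the listed events are visited, an input carrying two spikes costs two kernel scatters regardless of the frame size; this is the sparsity saving the abstract refers to.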

Cited by 2 publications (5 citation statements)
References: 33 publications
“…This combination is very powerful, as it harnesses the power of a convolutional filter alongside the spiking mechanism of IF or LIF neurons. When using low-power FPGA boards, as in several studies [49, 51–56], it is challenging to balance both the deeper convolutions and spiking mechanisms. The current study was able to overcome several of the challenges faced by other works because of the following:…”
Section: Literature Review
confidence: 99%
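The IF/LIF neurons this quote refers to follow a simple membrane dynamic. A textbook leaky integrate-and-fire update, sketched in NumPy (parameter values are illustrative and not drawn from the cited works):

```python
import numpy as np

def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """One timestep of a leaky integrate-and-fire layer with reset-to-zero."""
    v = leak * v + input_current        # leaky integration of synaptic input
    spikes = v >= threshold             # binary spike where threshold crossed
    v = np.where(spikes, 0.0, v)        # reset spiking neurons
    return v, spikes.astype(np.uint8)

v = np.zeros(4)                         # four neurons under constant drive
for t in range(5):
    v, s = lif_step(v, np.full(4, 0.4))
    print(t, s)                         # each neuron spikes only at t = 2
```

Setting leak=1.0 turns the same update into the non-leaky integrate-and-fire (IF) variant the quote also mentions.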
“…In the tables below, the symbols + and − correspond to suitable and unsuitable BP candidates for the respective dataset. Two works [54,55] were chosen to evaluate the performance with respect to MNIST, and a single work [55] with respect to CIFAR10. The relevant hardware specifications of these works are reported in Table 1.…”
Section: Performance Analysis With Respect To Datasets On the FPGA Pl...
confidence: 99%
“…In addition, neither chip design has a mechanism for taking into account the observed sparsity of both weights and activations in this class of networks. Progress in the field at present is rapid, and digital implementations are evolving ever more clever methods for exploiting sparsity (Delbruck & Liu, 2019; Sommer, Özkan, Keszocze, & Teich, 2022). The brute-force analog implementations described above are useful in comparing energy efficiency of digital versus analog technologies but should not be taken as a preferred way forward.…”
Section: Backward Pass
confidence: 99%
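The sparsity this quote highlights applies to both weights and activations. A minimal sketch of dual-sparsity accumulation (CSR-style weights combined with an active-input set; purely illustrative, as the cited digital designs use far more elaborate schemes):

```python
import numpy as np

# Illustrative dual-sparsity matrix-vector product: weights are stored in a
# compressed sparse row (CSR)-like layout, and only the indices of active
# (nonzero) input activations are visited. A multiply-accumulate happens
# only where a nonzero weight meets a nonzero activation, which is the
# basic saving the quoted passage attributes to sparsity-aware designs.

def sparse_matvec(indptr, col_idx, values, active_inputs, n_out):
    """CSR weights (indptr, col_idx, values) times a spike/activation vector
    given as {input_index: value}; dense zeros are never touched."""
    out = np.zeros(n_out)
    for row in range(n_out):
        for k in range(indptr[row], indptr[row + 1]):
            j = col_idx[k]
            if j in active_inputs:            # skip inactive inputs
                out[row] += values[k] * active_inputs[j]
    return out

# 3x4 weight matrix with 4 nonzeros; inputs 1 and 3 carry binary spikes
indptr  = [0, 2, 3, 4]
col_idx = [0, 3, 1, 3]
values  = [0.5, -1.0, 2.0, 0.25]
print(sparse_matvec(indptr, col_idx, values, {1: 1.0, 3: 1.0}, 3))
# -> [-1.    2.    0.25]
```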