2017 IEEE International Symposium on Circuits and Systems (ISCAS) 2017
DOI: 10.1109/iscas.2017.8050527
Fast classification using sparsely active spiking networks

Cited by 39 publications (36 citation statements)
References 14 publications
“…These approaches, however, are based on spiking networks with rate coding, which typically require more memory accesses and longer processing time for each training pattern compared to BSNs. An alternative training approach based on exact backpropagation and temporal coding in spiking networks (Mostafa, 2016; Mostafa et al., 2017) has been shown to lead to highly sparse spiking activity during training and inference, and could potentially be more energetically efficient than training BSNs using the approach presented in this paper.…”
Section: Conclusion and Discussion
confidence: 99%
“…Rate and time coding have significantly different impacts on such overheads. Using rate coding, a larger number of spikes than with time coding is sent through the network, and the overall system power consumption drastically increases, as shown in [130]. On the other hand, rate coding usually employs Poisson spike trains, so an individual spike timing error is of small consequence for the precision of the network.…”
Section: Reduced Communication Cost
confidence: 99%
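The rate-versus-time coding trade-off quoted above can be illustrated with a minimal sketch. The snippet below generates a Poisson rate-coded spike train and a single-spike latency code for the same input intensity; the rate map (200 Hz over 100 ms) and the linear latency mapping are illustrative assumptions, not parameters from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_spike_train(rate_hz, duration_s, dt=1e-3):
    """Sample a Poisson spike train: each time bin fires with
    probability rate_hz * dt (valid while rate_hz * dt << 1)."""
    n_bins = int(duration_s / dt)
    return rng.random(n_bins) < rate_hz * dt

# Rate coding: an intensity of 0.8 encoded as a 200 Hz Poisson
# train over 100 ms (expected ~20 spikes).
rate_coded = poisson_spike_train(rate_hz=200.0, duration_s=0.1)

# Time coding: the same intensity encoded in ONE spike whose latency
# decreases with intensity (hypothetical linear latency mapping).
intensity = 0.8
latency_bin = int((1.0 - intensity) * 100)  # larger value -> earlier spike
time_coded = np.zeros(100, dtype=bool)
time_coded[latency_bin] = True

print(rate_coded.sum(), time_coded.sum())
```

The spike count per value is the point: the temporal code spends exactly one spike where the rate code spends tens, which is why the quoted statement links rate coding to higher communication cost, while its Poisson randomness makes any single spike's timing uninformative and therefore error-tolerant.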
“…Many argue that the main advantage of SNNs is that, for a given input, the event-driven nature and the thresholding of every unit mean that evaluation will usually not require every neuron of every layer to fire [130]. As such, computation is sparse, and evaluating the network on the input requires fewer operations than the ANNs' frame-driven approach, where every neuron of the network is evaluated for every input.…”
Section: Leverage High Sparsity of Dataflow
confidence: 99%
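The operation-count argument in the statement above can be made concrete with a small sketch. Under an assumed 5% input-spike sparsity (the layer sizes and sparsity level are hypothetical, not figures from the cited papers), an event-driven layer only accumulates the weight columns of neurons that actually spiked, while a frame-driven ANN layer performs a full matrix-vector product:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 784, 100
W = rng.standard_normal((n_out, n_in))

# Frame-driven ANN: every input drives every output -> n_in * n_out MACs.
dense_ops = n_in * n_out

# Event-driven SNN: only the inputs that spiked propagate.
# Assume 5% of input neurons fire (hypothetical sparsity level).
spiked = rng.random(n_in) < 0.05
active = np.flatnonzero(spiked)
event_ops = active.size * n_out  # one weight-column accumulation per spike

# Membrane update: accumulate only the columns of active inputs.
membrane = W[:, active].sum(axis=1)

print(dense_ops, event_ops)
```

With 5% of inputs active, the event-driven path performs roughly 5% of the dense operation count, which is the sparsity advantage the quoted statement describes; the actual saving in hardware also depends on memory-access patterns, as the preceding citation statement on communication cost notes.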
“…We also compare our work with a wider range of implementations, including custom ASIC chips [8,41,50,59], neural processing units [18], spiking neural networks [14,28,42], crossbar implementations [57], and CPU/GPU-based solutions of the DropConnect approach [58] (the most accurate approach for MNIST to date; data is measured via i7-5820K, 32GB DDR3 with Nvidia Titan). Fig.…”
Section: Comparison to Other MNIST Implementations
confidence: 99%