2022
DOI: 10.1109/tcad.2022.3197512
Efficient Hardware Acceleration of Sparsely Active Convolutional Spiking Neural Networks

Cited by 14 publications (6 citation statements)
References 36 publications
“…This combination is very powerful, as it harnesses the power of a convolutional filter alongside the spiking mechanism of IF or LIF neurons. When using low-power FPGA boards, as in several studies [ 49 , 51 , 52 , 53 , 54 , 55 , 56 ], it is challenging to balance both the deeper convolutions and spiking mechanisms. The current study was able to overcome several of the challenges faced by other works because of the following: We hosted deeper convolutions alongside SNNs with very few parameters compared to [ 49 ] and were still able to achieve similar accuracy over the MNIST and CIFAR10 datasets.…”
Section: Literature Review
Confidence: 99%
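The IF/LIF spiking mechanism this statement pairs with convolutional filters is not detailed on this page. For orientation, here is a minimal sketch of a discrete-time leaky integrate-and-fire (LIF) layer of the kind commonly placed after a convolution in such designs; the function name, leak factor `beta`, unit threshold, and hard-reset behavior are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def lif_forward(currents, beta=0.9, v_th=1.0):
    """Discrete-time LIF neurons (illustrative sketch).

    currents: (T, N) input current per timestep per neuron,
    e.g. flattened convolution outputs. Returns a (T, N)
    binary spike train.
    """
    T, N = currents.shape
    v = np.zeros(N)                    # membrane potentials
    spikes = np.zeros((T, N), dtype=np.uint8)
    for t in range(T):
        v = beta * v + currents[t]     # leak, then integrate input
        fired = v >= v_th              # spike where threshold is crossed
        spikes[t] = fired
        v[fired] = 0.0                 # hard reset after a spike
    return spikes
```

Setting `beta = 1.0` turns this into the simpler integrate-and-fire (IF) variant, which is the other neuron model the statement mentions.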
“…The current study was able to overcome several of the challenges faced by other works because of the following: We hosted deeper convolutions alongside SNNs with very few parameters compared to [ 49 ] and were still able to achieve similar accuracy over the MNIST and CIFAR10 datasets. We employed both real-valued and Poisson distribution spikes as input encoding schemes to capture most of the information before processing them through DCSNNs, which were not used in [ 49 , 51 , 52 , 53 , 54 , 55 , 56 ]. We tested the DCSNNs on automotive relevant datasets such as KITTI, INHA_ADAS, and INHA_KLP as opposed to just MNIST and CIFAR10, as was the case in [ 49 , 52 , 54 , 55 , 56 ].…”
Section: Literature Review
Confidence: 99%
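The "Poisson distribution spikes" input encoding mentioned in this statement is typically implemented as rate coding: each normalized pixel intensity sets the per-timestep firing probability. A minimal sketch, assuming intensities in [0, 1] and the common Bernoulli approximation of a Poisson process; the helper name and timestep count are illustrative, not from the cited work.

```python
import numpy as np

def poisson_encode(image, timesteps=100, rng=None):
    """Rate-code an intensity image into a binary spike train.

    image: array with values normalized to [0, 1]; each intensity
    is treated as the per-timestep firing probability (Bernoulli
    approximation of Poisson rate coding).
    Returns an array of shape (timesteps, *image.shape).
    """
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random((timesteps,) + image.shape) < image).astype(np.uint8)
```

Feeding the raw normalized image directly into the first layer instead of this spike train corresponds to the real-valued encoding scheme the statement contrasts with Poisson encoding.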