2020
DOI: 10.3390/electronics9071059

Towards Efficient Neuromorphic Hardware: Unsupervised Adaptive Neuron Pruning

Abstract: To solve real-time challenges, neuromorphic systems generally require deep and complex network structures. Thus, it is crucial to search for effective solutions that can reduce network complexity, improve energy efficiency, and maintain high accuracy. To this end, we propose unsupervised pruning strategies that are focused on pruning neurons while training in spiking neural networks (SNNs) by utilizing network dynamics. The importance of neurons is determined by the fact that neurons that fire more spikes cont…
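The abstract ties a neuron's importance to how many spikes it fires during unsupervised training. The following Python sketch is a minimal illustration of that idea only, assuming spike counts are accumulated per excitatory neuron and neurons whose counts fall below an adaptive threshold are disabled; the network model, importance measure, and threshold schedule in the paper itself may differ, and all names below are hypothetical.

```python
import numpy as np

# Minimal sketch, not the paper's exact algorithm: accumulate per-neuron spike
# counts during unsupervised training and disable neurons that stay far below
# the mean activity of the still-active population.
rng = np.random.default_rng(0)

n_input, n_exc = 784, 100                 # e.g. MNIST pixels -> excitatory layer (assumed sizes)
weights = rng.random((n_input, n_exc))
active = np.ones(n_exc, dtype=bool)       # pruning mask: True = neuron kept
spike_counts = np.zeros(n_exc)            # firing activity accumulated over training

def forward(input_spikes, threshold=20.0):
    """Toy integrate-and-fire step: weighted input sum, spike if above threshold."""
    membrane = input_spikes @ weights
    spikes = (membrane > threshold) & active      # pruned neurons can no longer spike
    return spikes.astype(float)

def prune_inactive(fraction=0.1):
    """Disable neurons whose spike count is below `fraction` of the mean count (assumed rule)."""
    global active
    if active.any():
        active &= spike_counts >= fraction * spike_counts[active].mean()

# Toy training loop: accumulate activity and prune periodically while "training".
for step in range(1, 1001):
    x = (rng.random(n_input) < 0.05).astype(float)    # random spike pattern as stand-in input
    spike_counts += forward(x)
    if step % 250 == 0:
        prune_inactive()

print(f"{int(active.sum())} of {n_exc} neurons remain after pruning")
```

In this toy run, neurons that rarely cross the firing threshold are progressively masked out, which mirrors the paper's stated goal of shrinking the active network while training continues.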

Cited by 17 publications (15 citation statements)
References 22 publications
“…This network has been widely adopted to study the performance of a biological SNN for different purposes, such as studying different STDP models (Diehl and Cook, 2015), improving STDP learning (Panda et al., 2018), and pruning (Shi et al., 2019; Guo et al., 2020b).…”
Section: Comparison and Discussion
confidence: 99%
“…• In the literature, among unsupervised two-layer SNNs, this network structure shows the best performance with an STDP learning rule only (Shi et al., 2019; Tavanaei et al., 2019). • This network has been widely adopted to study the performance of a biological SNN for different purposes, such as studying different STDP models (Diehl and Cook, 2015), improving STDP learning (Panda et al., 2018), and pruning (Shi et al., 2019; Guo et al., 2020b). Most importantly, this study provides important understanding of the impact of different coding schemes on various aspects of the performance of a neuromorphic system. With this understanding, it is beneficial for neuromorphic system researchers to consider and select corresponding coding schemes to achieve specific design goals.…”
Section: Comparison and Discussion
confidence: 99%
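The statement above concerns how the input coding scheme affects a two-layer STDP-trained SNN of the Diehl and Cook (2015) type. As a point of reference only, the sketch below shows Poisson rate coding of pixel intensities into spike trains, the input scheme commonly paired with that network; the parameters are assumptions, not values taken from the cited study.

```python
import numpy as np

def poisson_rate_code(image, duration_ms=350, dt_ms=1.0, max_rate_hz=63.75, rng=None):
    """Poisson rate coding (assumed parameters): each pixel intensity in [0, 1]
    sets the firing rate of an independent input spike train."""
    rng = rng or np.random.default_rng()
    n_steps = int(duration_ms / dt_ms)
    rates = image.ravel() * max_rate_hz                   # Hz per input neuron
    p_spike = rates * dt_ms / 1000.0                      # spike probability per time step
    return rng.random((n_steps, rates.size)) < p_spike    # (time, neurons) boolean raster

# Example: encode a random 28x28 "image" and inspect spike counts per input neuron.
img = np.random.default_rng(1).random((28, 28))
spikes = poisson_rate_code(img)
print(spikes.shape, spikes.sum(axis=0)[:5])
```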
“…However, directly removing neurons from the network could cause severe deterioration of network performance. We compared the proposed online weight pruning methods with an online adaptive neuron pruning method presented in our previous work (Guo et al., 2020). The comparison is shown in Figure 13.”
Section: Comparisons and Discussion
confidence: 99%
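This citation contrasts online weight pruning with the adaptive neuron pruning of Guo et al. (2020). The sketch below only illustrates the structural difference between the two approaches, masking individual small-magnitude weights versus disabling whole output neurons (entire weight columns); the actual pruning criteria and schedules in either work may differ.

```python
import numpy as np

rng = np.random.default_rng(2)
weights = rng.normal(0.0, 1.0, size=(784, 100))    # input -> excitatory weights (assumed shape)

def prune_weights(w, threshold=0.1):
    """Online weight pruning, shown here as a simple magnitude rule: zero individual small weights."""
    mask = np.abs(w) >= threshold
    return w * mask, mask

def prune_neurons(w, keep):
    """Neuron pruning: disabling an output neuron removes its entire weight column."""
    w = w.copy()
    w[:, ~keep] = 0.0
    return w

w_by_weight, mask = prune_weights(weights)
keep = rng.random(100) > 0.2                       # pretend 20% of neurons were flagged inactive
w_by_neuron = prune_neurons(weights, keep)

print("weights zeroed by weight pruning:", int((~mask).sum()))
print("weights zeroed by neuron pruning:", int((w_by_neuron == 0.0).sum()))
```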
“…This bit can be simply attached to the weight bits in the memory with very little overhead. The number of NAND gates for an SNN with 100 neurons was estimated according to the proposed digital implementation from Guo et al. (2020). Clearly, the number of equivalent NAND gates for the pruning algorithm is much smaller than that for the SNN.”
Section: Comparisons and Discussion
confidence: 99%
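The statement describes storing a single pruning bit next to each synapse's weight bits in memory. The sketch below is a hypothetical software analogue of that packing, assuming 8-bit quantized weights with the prune flag carried in a ninth bit; the real memory organization and gate-level implementation in Guo et al. (2020) may differ.

```python
# Hypothetical packing of a prune flag next to quantized weight bits (not the exact
# memory layout from Guo et al., 2020): bit 8 holds the flag, bits 0-7 hold the weight.

def pack(weight_8bit: int, pruned: bool) -> int:
    assert 0 <= weight_8bit < 256
    return (int(pruned) << 8) | weight_8bit

def unpack(word: int) -> tuple[int, bool]:
    return word & 0xFF, bool(word >> 8)

def effective_weight(word: int) -> int:
    """A pruned synapse contributes nothing; otherwise the stored weight is used."""
    weight, pruned = unpack(word)
    return 0 if pruned else weight

word = pack(173, pruned=True)
print(bin(word), effective_weight(word))   # 0b110101101 -> contributes 0
```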