2022
DOI: 10.3389/fnins.2022.760298

Backpropagation With Sparsity Regularization for Spiking Neural Network Learning

Abstract: The spiking neural network (SNN) is a possible pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven and sparsity features of biological systems. This article proposes a sparsity-driven SNN learning algorithm, namely backpropagation with sparsity regularization (BPSR), aiming to achieve improved spiking and synaptic sparsity. Backpropagation incorporating spiking regularization is utilized to minimize the spiking firing rate with guaranteed accuracy. Backpropagation real…
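As a concrete illustration of the objective the abstract describes, here is a minimal sketch of a task loss combined with a firing-rate penalty, written against PyTorch for concreteness. The function name, tensor shapes, and the coefficient lam are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    # Hypothetical sketch of BPSR-style spiking regularization: the task loss
    # is augmented with a penalty on the average firing rate. `spikes` is
    # assumed to be a binary tensor of shape (T, batch, neurons) recorded over
    # T simulation steps; `lam` is an illustrative weight, not a paper value.
    def bpsr_style_loss(logits, targets, spikes, lam=1e-3):
        task_loss = F.cross_entropy(logits, targets)
        firing_rate = spikes.mean()           # mean spikes per neuron per step
        return task_loss + lam * firing_rate  # trade accuracy against sparsity

Driving the firing-rate term toward zero is what yields the spiking sparsity the abstract targets; lam controls how much accuracy may be traded for it.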

Cited by 19 publications (6 citation statements) | References 52 publications
“…Previous studies have developed methods that suppress the firing of SNNs within the framework of the surrogate gradient method [26,27]. They applied regularization directly to the spike variable s(t) ∈ {0, 1} of the time-discretized SNN, represented at each time step of the model.…”
Section: Discussion
confidence: 99%
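The surrogate gradient method this quote refers to makes the binary spike differentiable in the backward pass, which is what lets a per-time-step penalty on s(t) be trained by backpropagation. Below is a minimal PyTorch sketch; the fast-sigmoid surrogate is a common choice and an assumption here, not necessarily the one used in [26,27].

    import torch

    # Sketch of the surrogate gradient trick: the forward pass emits the
    # binary spike s(t) = H(v - v_th); the backward pass swaps the
    # non-differentiable Heaviside for a smooth surrogate, so a per-time-step
    # penalty on s(t) still produces gradients.
    class SurrogateSpike(torch.autograd.Function):
        @staticmethod
        def forward(ctx, v, v_th=1.0):
            ctx.save_for_backward(v)
            ctx.v_th = v_th
            return (v >= v_th).float()  # binary spike variable s(t)

        @staticmethod
        def backward(ctx, grad_out):
            (v,) = ctx.saved_tensors
            # fast-sigmoid derivative: 1 / (1 + |v - v_th|)^2
            surrogate = 1.0 / (1.0 + (v - ctx.v_th).abs()) ** 2
            return grad_out * surrogate, None

    # Usage: s = SurrogateSpike.apply(v)  # s ∈ {0, 1}; gradients flow smoothly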
“…(2), which integrates the membrane potential. By contrast, M-SSR, unlike the previous methods [26,27], can be transformed from the time-integration form to the timing form by setting v → V_th. This may correspond to the fact that the learning method with the surrogate function can transition to a timing-like learning method by taking a limit [19].…”
Section: Discussion
confidence: 99%
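For context, a generic discrete-time leaky integrate-and-fire update of the kind the quote's Eq. (2) integrates is sketched below; the exact equation in the cited work may differ, and the leak factor beta, the threshold v_th, and the reset rule are standard conventions assumed here.

    import torch

    # Generic discrete-time leaky integrate-and-fire update, standing in for
    # the membrane-integration equation the quote calls Eq. (2). `beta` is the
    # leak factor and the reset subtracts the threshold; both are common
    # conventions, not taken from the cited paper.
    def lif_step(v, input_current, beta=0.9, v_th=1.0):
        v = beta * v + input_current   # integrate the membrane potential
        spike = (v >= v_th).float()    # fire when the threshold is crossed
        v = v - spike * v_th           # soft reset by threshold subtraction
        return v, spike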
“…Here the magnitude of synaptic plasticity is determined by the rate of pre- and post-synaptic firing over a specific time period. This type of model is typically used for converting conventional artificial neural networks to spiking neural networks by using backpropagation [55], [56]. The ANN is trained with the backpropagation technique and then converted into an equivalent SNN by relating the activations of ANN units to the firing rates of spiking neurons.…”
Section: Rate-based
confidence: 99%
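A toy sketch of the rate-based conversion idea described above: an integrate-and-fire neuron driven by a constant weighted input fires at a rate that approximates the corresponding ReLU activation. Everything here (names, the horizon T, the reset-by-subtraction rule) is an illustrative assumption rather than the procedure of [55] or [56].

    # Integrate-and-fire neuron under a constant weighted input: its firing
    # rate over T steps approximates the ReLU activation of the source ANN
    # unit, which is the correspondence rate-based conversion exploits.
    def relu_activation_as_rate(weighted_input, T=100, v_th=1.0):
        v, spike_count = 0.0, 0
        for _ in range(T):
            v += weighted_input        # constant input current each step
            if v >= v_th:
                spike_count += 1
                v -= v_th              # reset by subtraction
        return spike_count / T         # ≈ max(0, weighted_input) / v_th

For inputs in [0, v_th], the returned rate approaches weighted_input / v_th as T grows; negative inputs never cross the threshold and yield a rate of zero, matching the ReLU.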
“…(Alawad et al., 2017) uses stochastic neurons to increase energy efficiency during inference. More recently, (Yan et al., 2022) uses regularization during training to increase the sparsity of spikes, which reduces the computational burden and energy consumption. (Cramer et al., 2022) performs the forward pass on a neuromorphic chip, while the backward pass still takes place on a standard GPU.…”
Section: Introduction
confidence: 99%