2021
DOI: 10.48550/arxiv.2106.07030
Preprint
The Backpropagation Algorithm Implemented on Spiking Neuromorphic Hardware

Abstract: The capabilities of natural neural systems have inspired new generations of machine learning algorithms as well as neuromorphic very large-scale integrated (VLSI) circuits capable of fast, low-power information processing. However, most modern machine learning algorithms are not neurophysiologically plausible and thus are not directly implementable in neuromorphic hardware. In particular, the workhorse of modern deep learning, the backpropagation algorithm, has proven difficult to translate to neuromorphic hardware…

Cited by 6 publications (7 citation statements) | References 77 publications
“…The neural dynamics for all neurons are modelled using the LIF model, which is a common representation of biological spiking neuron dynamics [8], [26], [37]. The differential equation describing the dynamics (the membrane voltage) of a LIF neuron is:…”
Section: Methodology: Preliminaries
confidence: 99%
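The differential equation referenced in the quote above is elided, but the leaky integrate-and-fire (LIF) model it names has a standard form, commonly written as tau_m * dV/dt = -(V - V_rest) + R * I(t), with a spike emitted and the voltage reset when V crosses a threshold. A minimal forward-Euler sketch of that dynamic, with illustrative parameter values (not taken from the paper):

```python
# Forward-Euler simulation of a leaky integrate-and-fire (LIF) neuron.
# Uses the common form  tau_m * dV/dt = -(V - V_rest) + R * I(t)
# with a hard threshold/reset rule. All parameter values are illustrative.

def simulate_lif(current, dt=1e-4, tau_m=20e-3, v_rest=-65e-3,
                 v_thresh=-50e-3, v_reset=-70e-3, r_m=10e6):
    """Integrate the membrane voltage for an input current trace (amps per
    time step); return the list of spike times in seconds."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(current):
        dv = (-(v - v_rest) + r_m * i_in) / tau_m
        v += dv * dt
        if v >= v_thresh:              # threshold crossing -> emit spike
            spike_times.append(step * dt)
            v = v_reset                # hard reset after the spike
    return spike_times

# Constant 2 nA drive for 100 ms produces a regular spike train.
spike_times = simulate_lif([2e-9] * 1000)
```

With these values the steady-state voltage sits above threshold, so the neuron fires periodically; the non-differentiable threshold/reset step is exactly what makes backpropagation through such dynamics difficult, as the next citation statement discusses.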
“…Due to the non-differentiable neuronal dynamics of spiking neurons, supervised techniques such as back-propagation cannot be applied in a straightforward manner. Recently, Renner et al [26] and Lee et al. [27] proposed algorithms that approximate backpropagation, and demonstrated use on Intel's Loihi neuromorphic hardware.…”
Section: A Spiking Neural Network
confidence: 98%
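The non-differentiability mentioned above comes from the spike nonlinearity, which is a Heaviside step with zero (or undefined) derivative everywhere. A common workaround in approximate-backpropagation approaches is a surrogate gradient: keep the step function in the forward pass but substitute a smooth derivative in the backward pass. A minimal sketch (the fast-sigmoid surrogate and the `beta` scale are illustrative choices, not necessarily those of the cited papers):

```python
# Surrogate-gradient sketch: forward pass uses the true (non-differentiable)
# spike function; backward pass swaps in the derivative of a smooth
# surrogate, here the fast sigmoid x / (1 + beta*|x|).

def spike(v_minus_thresh):
    """Forward pass: 1 if membrane voltage is at/above threshold, else 0."""
    return 1.0 if v_minus_thresh >= 0 else 0.0

def surrogate_grad(v_minus_thresh, beta=10.0):
    """Backward pass: smooth stand-in for d(spike)/dv, peaked at threshold
    and decaying with distance from it (beta controls the sharpness)."""
    return 1.0 / (1.0 + beta * abs(v_minus_thresh)) ** 2
```

The surrogate is largest exactly at the threshold, so gradient signal flows mainly through neurons that were close to spiking.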
“…In general, this motivates offline training of SNNs typically using GPUs, where deployment can take place on low-power SNN accelerators. Several recent studies have leveraged programmable microcode of neuromorphic research processors to adopt BPTT variants on a single chip [48,49]. Training SNNs has traditionally been slow compared to ANNs. This is due to the additional sequential operations needed in recurrent networks, along with stateful computations implemented at the neuron node.…”
Section: Low-cost Inference
confidence: 99%
“…By potentiating GATING neurons, spikes emanating from the clock selectively enable information propagation only when and where needed. In this way, information is dynamically guided through the circuit with the appropriate processing occurring only at the appropriate times and locations for the computation to proceed (see [34] for another example of this sort of gating). We note that our implementation here assumes the regularity of our gating clock to organize timing (as may be done with the Loihi chip), however, even in a noisy circuit with spike jitter, this regular relative timing can be arranged with synfire-gated synfire chains made up of neuronal populations.…”
Section: Two's Complement Multiplication
confidence: 99%
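The gating scheme described above can be reduced to coincidence detection: a relay neuron is driven below threshold by its input alone, and only the sum of an input spike and a gating spike in the same time step pushes it over. A minimal sketch with illustrative weights and threshold (hypothetical, not the paper's circuit):

```python
# Spike gating via coincidence detection: the relay fires only when an
# input spike and a gating (clock) spike arrive in the same time step,
# so the gate's spikes decide when information propagates.
# Weights and threshold are illustrative.

def gated_relay(input_spikes, gate_spikes, w_in=0.6, w_gate=0.6, thresh=1.0):
    """Return the relay's spike train: each weight alone is subthreshold,
    so an output spike requires input and gate to coincide."""
    return [int(w_in * x + w_gate * g >= thresh)
            for x, g in zip(input_spikes, gate_spikes)]
```

Running `gated_relay([1, 1, 0, 1], [1, 0, 1, 1])` passes the input through only at the time steps where the gate is also active.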