2020
DOI: 10.3389/fnins.2020.00423

On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices

Abstract: Hardware-based spiking neural networks (SNNs) inspired by a biological nervous system are regarded as an innovative computing system with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme that approximates the backpropagation algorithm and is suitable for hardware implementation. We show that the accuracy of the proposed scheme for SNNs is close to that of conventional artificial neural networks (ANNs) by using the stochastic character…

Cited by 41 publications (18 citation statements) · References: 51 publications (69 reference statements)
“…Analog computation paves the way to achieve Tera operations per second per watt efficiency which is 100x compared to digital implementations ( Gil and Green, 2020 ). Various types of analog devices have been used to implement neural networks, such as CMOS transistors ( Indiveri et al, 2011 ), floating gate transistors ( Kim et al, 2018a ), gated Schottky diode ( Kwon et al, 2020 ), and emerging memory devices (like PCM, RRAM, STTRAM) ( Kim et al, 2018b ). Despite the potential of these devices in analog computation, they suffer from many non-idealities which could limit the performance, such as limited precision, programming variability, stuck-at-fault (SAF) defects, retention and others ( Fouda et al, 2019 ).…”
Section: Results (mentioning)
confidence: 99%
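The device non-idealities listed in this excerpt, limited precision and programming variability in particular, can be made concrete with a minimal simulation sketch. The parameter values below (32 conductance levels, 2% write noise) are illustrative assumptions, not figures taken from any of the cited works.

```python
# Minimal sketch: mapping ideal weights onto analog conductances with
# limited precision (quantization) and programming variability (write noise).
# All magnitudes are assumed for illustration only.
import numpy as np

def program_weights(w_ideal, n_levels=32, noise_frac=0.02, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    w_min, w_max = w_ideal.min(), w_ideal.max()
    step = (w_max - w_min) / (n_levels - 1)
    w_quant = w_min + np.round((w_ideal - w_min) / step) * step    # limited precision
    w_noisy = w_quant + rng.normal(0.0, noise_frac * (w_max - w_min), w_ideal.shape)
    return np.clip(w_noisy, w_min, w_max)                          # device range saturates

w_device = program_weights(np.random.randn(128, 64) * 0.1)
```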
“… Vatajelu E. et al (2019) reported and analyzed different generic fault models that could exist in SNN hardware implementation. We chose the synaptic SAF model in this study since it appears very often in hardware, especially in the promising and newly emerging analog devices, and it has a profound impact on hardware performance ( El-Sayed et al, 2020 ; Kwon et al, 2020 ; Zhang B. et al, 2020 ). A SAF device has its conductance state fixed at either a high or low conductance state.…”
Section: Results (mentioning)
confidence: 99%
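The synaptic SAF model described in this excerpt, a device whose conductance is fixed at either a high or a low state, can be simulated by clamping a random subset of conductances. The fault rates and conductance bounds below are hypothetical values chosen only for illustration.

```python
# Minimal sketch: injecting synaptic stuck-at-fault (SAF) defects into a
# normalized conductance matrix. Fault probabilities are assumed values.
import numpy as np

def inject_saf(g, p_stuck_high=0.01, p_stuck_low=0.09, g_high=1.0, g_low=0.0, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    g_faulty = g.copy()
    r = rng.random(g.shape)
    g_faulty[r < p_stuck_high] = g_high                                       # stuck at high conductance
    g_faulty[(r >= p_stuck_high) & (r < p_stuck_high + p_stuck_low)] = g_low  # stuck at low conductance
    return g_faulty

g_defective = inject_saf(np.random.rand(128, 64))
```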
“…Many efforts have been made to apply the gradient descent-based backpropagation algorithm to the SNN’s learning to compensate for this issue ( Lee et al, 2016 ). Also, using analog resistive devices, on-chip training SNNs with backpropagation algorithms has been recently reported ( Kwon et al, 2020 ).…”
Section: Results (mentioning)
confidence: 99%
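A standard way to apply gradient descent-based backpropagation to spiking neurons, as referenced in this excerpt, is to replace the gradient of the non-differentiable spike function with a surrogate. The sketch below shows a generic rectangular surrogate in PyTorch; it illustrates the general idea only and is not the specific approximation proposed in the cited on-chip training scheme (Kwon et al., 2020).

```python
# Minimal sketch: a surrogate gradient for the spiking non-linearity, so that
# standard backpropagation can flow through the threshold operation.
# Generic illustration, not the approximation used in the cited paper.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v_mem, threshold=1.0):
        ctx.save_for_backward(v_mem)
        ctx.threshold = threshold
        return (v_mem >= threshold).float()          # binary spikes in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        (v_mem,) = ctx.saved_tensors
        # Rectangular surrogate: gradient passes only near the firing threshold.
        grad_surrogate = ((v_mem - ctx.threshold).abs() < 0.5).float()
        return grad_output * grad_surrogate, None    # no gradient w.r.t. threshold

spike_fn = SurrogateSpike.apply
```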
“…Recent research on SNN and its implementations using resistive memory devices have shown that the devices are designed explicitly to be embedded with the learning algorithm for SNN such as Spike-timing-dependent plasticity (STDP) and equilibrium propagation (Scellier and Bengio, 2017 ). Accordingly, the update asymmetry of devices is more likely to affect the process of training neural networks (Kwon et al, 2020 ). Inspired by this motivation, they demonstrated that asymmetric non-linear devices are more powerful than symmetric linear devices (Brivio et al, 2018 , 2021 ; Kim et al, 2021 ).…”
Section: Impact of Update Asymmetries on Neural Network Performance (mentioning)
confidence: 99%
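The update asymmetry discussed in this excerpt is commonly captured with a behavioral model in which potentiation and depression pulses follow different saturating curves. The sketch below uses assumed coefficients, not measured device parameters.

```python
# Minimal sketch: an asymmetric, non-linear (saturating) conductance update
# model for an analog synaptic device. alpha_p != alpha_d gives the asymmetry;
# the (g_max - g) and (g - g_min) factors give the non-linearity.
# Coefficients are assumed for illustration only.
def update_conductance(g, potentiate, g_min=0.0, g_max=1.0, alpha_p=0.05, alpha_d=0.10):
    if potentiate:
        return g + alpha_p * (g_max - g)   # potentiation slows as g approaches g_max
    return g - alpha_d * (g - g_min)       # depression slows as g approaches g_min

g = 0.5
for _ in range(10):                        # ten potentiation pulses
    g = update_conductance(g, potentiate=True)
```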