2021
DOI: 10.3389/fnins.2021.712667

Spiking Autoencoders With Temporal Coding

Abstract: Spiking neural networks with temporal coding schemes process information based on the relative timing of neuronal spikes. In supervised learning tasks, temporal coding allows learning through backpropagation with exact derivatives, and achieves accuracies on par with conventional artificial neural networks. Here we introduce spiking autoencoders with temporal coding and pulses, trained using backpropagation to store and reconstruct images with high fidelity from compact representations. We show that spiking au…
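The abstract's key point is that, under temporal coding, each neuron's spike time is a differentiable function of its inputs, so reconstruction error can be backpropagated with exact derivatives. The sketch below illustrates that idea on a toy spiking autoencoder; it is an illustration under simplifying assumptions, not the paper's method: it uses a non-leaky integrate-and-fire neuron with a closed-form first-spike time instead of the paper's alpha-synaptic neuron model, and every name in it (latency_encode, layer_spike_times, the layer sizes) is assumed for the example.

```python
# A minimal sketch of temporal coding with differentiable spike times.
# NOTE: simplified non-leaky integrate-and-fire neurons with a closed-form
# first-spike time (computed in the z = exp(t) domain), not the
# alpha-synaptic-function neurons of the paper; names and shapes are
# illustrative assumptions.
import jax
import jax.numpy as jnp

def latency_encode(pixels, t_max=1.0):
    # Brighter pixels spike earlier: intensity 1.0 -> time 0, intensity 0.0 -> t_max.
    return t_max * (1.0 - pixels)

def layer_spike_times(in_times, weights, threshold=1.0):
    # Closed-form output spike time: z_out = (sum_i w_i z_i) / (sum_i w_i - threshold),
    # assuming every input spike arrives before the output spike
    # (causal-set bookkeeping is omitted for brevity).
    z_in = jnp.exp(in_times)                 # (n_in,)
    num = weights @ z_in                     # (n_out,)
    den = weights.sum(axis=1) - threshold    # (n_out,)
    z_out = num / jnp.clip(den, 1e-6)        # guard against non-firing neurons
    return jnp.log(jnp.clip(z_out, 1.0))     # back to spike times (>= 0)

def reconstruction_loss(params, pixels):
    # Encode pixels as spike times, pass them through a 2-layer spiking
    # autoencoder, decode output spike times back to intensities, take MSE.
    w_enc, w_dec = params
    t_in = latency_encode(pixels)
    t_hidden = layer_spike_times(t_in, w_enc)   # compact code: 4 spike times
    t_out = layer_spike_times(t_hidden, w_dec)
    recon = 1.0 - t_out
    return jnp.mean((recon - pixels) ** 2)

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
pixels = jax.random.uniform(k1, (16,))                  # toy 4x4 "image"
params = (0.5 * jax.random.uniform(k2, (4, 16)),        # encoder weights: 16 -> 4
          0.3 + 0.5 * jax.random.uniform(k3, (16, 4)))  # decoder weights: 4 -> 16

# Exact gradients of the reconstruction loss, taken through the spike times.
loss, grads = jax.value_and_grad(reconstruction_loss)(params, pixels)
print(float(loss))
```

Because the spike time has a closed form, jax.value_and_grad differentiates straight through it; that is the "exact derivatives" property the abstract refers to, here shown with a substitute neuron model.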

Cited by 14 publications (16 citation statements)
References 41 publications
“…Experimental results on a variety of datasets are often superior to the current SNN-based approach (Comşa et al., 2021), and in some cases superior to the same-architecture ANN. In addition, VTSNN uses roughly 1/274 of the energy of ANN-based methods.…”
Section: Introduction
confidence: 86%
“…Therefore, we propose a novel symmetric and undistorted encoding-decoding method to fill the above gaps. Currently, researchers generally use STBP for low-level SNN tasks (Comşa et al., 2021), which allows information to propagate in both the temporal and spatial domains. Therefore, we present a new backpropagation that only permits information to propagate via the spatial domain.…”
Section: Methods
confidence: 99%
“…Neurons in the brain communicate and compute using discrete sparse signals called spikes. This mechanism is radically different from current mainstream artificial neural networks, also known as deep neural networks [1][2][3]. Although deep neural networks are highly effective in image processing tasks, it is argued that their energy efficiency needs to be improved for edge computing [4].…”
Section: Introduction
confidence: 95%