2021
DOI: 10.1109/access.2021.3085216
Accelerating Spike-by-Spike Neural Networks on FPGA With Hybrid Custom Floating-Point and Logarithmic Dot-Product Approximation

Abstract: Spiking neural networks (SNNs) represent a promising alternative to conventional neural networks. In particular, the so-called Spike-by-Spike (SbS) neural networks provide exceptional noise robustness and reduced complexity. However, deep SbS networks require a memory footprint and a computational cost unsuitable for embedded applications. To address this problem, this work exploits the intrinsic error resilience of neural networks to improve performance and to reduce hardware complexity. More precisely, we de…
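The "logarithmic dot-product approximation" named in the title generally means replacing multiplications with additions in the log domain, in the spirit of Mitchell's approximation. The paper's exact scheme, bit widths, and error bounds are not given in this excerpt, so the following is only a minimal NumPy sketch of the general idea; the function names (log2_mitchell, pow2_mitchell, log_dot) are hypothetical, and inputs are assumed strictly positive, consistent with SbS networks operating on non-negative probability values:

```python
import numpy as np

def log2_mitchell(x):
    """Mitchell's approximation of log2(x) for x > 0.

    With x = m * 2**e and m in [0.5, 1), rewrite x = (2*m) * 2**(e-1),
    so log2(x) = (e - 1) + log2(2*m) ~ (e - 1) + (2*m - 1):
    the mantissa fraction stands in for its own logarithm.
    """
    m, e = np.frexp(x)
    return (e - 1) + (2.0 * m - 1.0)

def pow2_mitchell(y):
    """Inverse approximation: 2**y ~ (1 + frac(y)) * 2**floor(y)."""
    e = np.floor(y)
    return np.ldexp(1.0 + (y - e), e.astype(np.int64))

def log_dot(a, b):
    """Multiply-free dot product: each product a_i * b_i becomes one
    addition in the log domain plus a cheap conversion back."""
    return float(np.sum(pow2_mitchell(log2_mitchell(a) + log2_mitchell(b))))

# quick sanity check against the exact dot product
rng = np.random.default_rng(0)
a, b = rng.random(8) + 0.1, rng.random(8) + 0.1
print(log_dot(a, b), float(np.dot(a, b)))  # approximate vs. exact
```

In hardware, the exponent and mantissa come directly from the operand's bit fields, so the log-domain "multiplier" reduces to a fixed-point adder; the sketch above only models that numerically.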

Cited by: 6 publications (2 citation statements)
References: 40 publications
“…IaF networks are biologically more realistic than SbS networks, but require orders of magnitude more computational power than SbS [10]. Furthermore, it is possible to construct very efficient hardware accelerators for SbS [11]. A Github with our PyTorch SbS and NNMF framework can be found at https://github.com/davrot/pytorch-sbs…”
Section: Discussion (mentioning)
confidence: 99%
“…We implement the floating-point computation adopting the dot-product with hybrid custom floating-point [29]. The hardware dot-product is illustrated in Fig.…”
Section: Dot-Product With Hybrid Floating-Point (mentioning)
confidence: 99%
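The "hybrid custom floating-point" this citing work adopts (its reference [29] is the paper above) typically denotes a narrow custom operand format combined with higher-precision accumulation. The format details are not reproduced in this excerpt, so the sketch below is only a loose software model in the same NumPy style as above; quantize_custom_float, hybrid_dot, and MANT_BITS are hypothetical names and an assumed parameter, not the paper's specification:

```python
import numpy as np

MANT_BITS = 4  # assumed mantissa width of the custom format (illustrative only)

def quantize_custom_float(x, mant_bits=MANT_BITS):
    """Round each value to `mant_bits` fractional mantissa bits while
    keeping the full exponent range (a crude custom-float model)."""
    m, e = np.frexp(x)                 # x = m * 2**e, with m in [0.5, 1)
    scale = 2.0 ** mant_bits
    return np.ldexp(np.round(m * scale) / scale, e)

def hybrid_dot(a, b, mant_bits=MANT_BITS):
    """Dot product with low-precision operands and a wide accumulator:
    the 'hybrid' idea is pairing cheap narrow multipliers with a
    higher-precision sum, trading a little accuracy for hardware cost."""
    qa = quantize_custom_float(a, mant_bits)
    qb = quantize_custom_float(b, mant_bits)
    return float(np.dot(qa.astype(np.float64), qb.astype(np.float64)))
```

This mirrors the error-resilience argument in the abstract: the network tolerates the operand rounding, while the wide accumulator keeps the summation error from growing with vector length.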