2022
DOI: 10.1145/3514253

SyncNN: Evaluating and Accelerating Spiking Neural Networks on FPGAs

Abstract: Compared to conventional artificial neural networks, Spiking Neural Networks (SNNs) are more biologically plausible and require less computation due to the event-driven nature of spiking neurons. However, the default asynchronous execution of SNNs also poses great challenges to accelerating their performance on FPGAs. In this work, we present a novel synchronous approach for rate-encoding-based SNNs, which is more hardware-friendly than conventional asynchronous app…
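The abstract contrasts asynchronous, event-driven execution with a synchronous treatment of rate-encoded SNNs. As a rough, hypothetical Python sketch of the general idea (not the SyncNN implementation described in the paper), the code below Poisson-encodes an input vector into spike trains and advances a single integrate-and-fire layer one time step at a time, updating all neurons synchronously; every name, size, and parameter is an illustrative assumption.

# Illustrative sketch only, not the paper's accelerator design.
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(x, n_steps):
    # Poisson rate encoding: intensities in [0, 1] become a spike train
    # of shape (n_steps, len(x)); higher intensity -> more spikes.
    return (rng.random((n_steps, x.shape[0])) < x).astype(np.float32)

def if_layer(spikes, weights, threshold=1.0):
    # Synchronous integrate-and-fire: at each time step, every output neuron
    # accumulates its weighted presynaptic spikes, then fires and resets
    # if its membrane potential crosses the threshold.
    n_steps, _ = spikes.shape
    n_out = weights.shape[1]
    v = np.zeros(n_out, dtype=np.float32)            # membrane potentials
    out = np.zeros((n_steps, n_out), dtype=np.float32)
    for t in range(n_steps):                          # one synchronous pass per step
        v += spikes[t] @ weights                      # integrate inputs
        fired = v >= threshold
        out[t, fired] = 1.0
        v[fired] = 0.0                                # reset fired neurons
    return out

# Toy usage: 784 "pixels" -> 10 output neurons over 32 time steps.
x = rng.random(784).astype(np.float32)
w = rng.normal(0.0, 0.05, size=(784, 10)).astype(np.float32)
spike_train = rate_encode(x, n_steps=32)
output_spikes = if_layer(spike_train, w)
rates = output_spikes.mean(axis=0)                    # firing rates serve as activations
print(rates)

Under rate encoding, the time-averaged spike counts at the output play the role of activations, which is why a fixed number of synchronous time steps per inference maps naturally onto a pipelined FPGA datapath.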

Cited by 19 publications (2 citation statements)
References 34 publications
“…On the other hand, SNNs' temporal information processing nature also presents challenges, leading to extended inference times and, in turn, limiting SNNs from achieving the expected high energy efficiency. This is exceptionally notable in conventional non-event-based tasks, where even though software SNNs have achieved state-of-the-art accuracies [4,5,6,7,8,9], hardware SNN accelerators [10,11,12,13,14,15] still fall short in energy efficiency compared to their CNN counterparts [16,17,18], preventing SNNs from being adopted as a general efficient solution.…”
Section: Introduction
confidence: 99%
“…On the other hand, SNNs' temporal information processing nature also presents challenges, leading to extended inference times and, in turn, limiting SNNs from achieving the expected high energy efficiency. This is exceptionally notable in conventional non-event-based tasks, where even though software SNNs have achieved state-of-the-art accuracies [4,5,6,7,8,9], hardware SNN accelerators [10,11,12,13,14,15] still fall short in energy efficiency compared to their CNN counterparts [16,17,18], preventing SNNs from being adopted as a general efficient solution.…”
Section: Introductionmentioning
confidence: 99%
“…Recent iterations of these systems target a broader range of tasks, for example in the area of Machine Learning, and feature higher degrees of flexibility and efficiency (Mayr et al., 2019; Billaudelle et al., 2020). In addition, recent dedicated systems exist that target the simulation at higher degrees of abstraction (Wang et al., 2018) or aim at solving Machine Learning tasks (Panchapakesan et al., 2022). On one side, the mentioned variety together with advances in computational capabilities and the development of simulator-independent model description languages (e.g., PyNN by Davison et al., 2009) pushed the domain of computational neuroscience to study neural network models of increasing complexity.…”
Section: Introduction
confidence: 99%