2023
DOI: 10.1109/tpami.2023.3241201

Attention Spiking Neural Networks

Cited by 70 publications (33 citation statements)
References 46 publications
“…Further, this is achieved by utilizing merely around 20% of the computation (in FLOPs) required by the SOTA methods. Furthermore, our proposed method also outperforms SOTA SNN baselines [24], [46], [47] in this regression task of human pose tracking.…”
mentioning
confidence: 75%
“…To avoid confusion, it is important to note that the spiking transformers presented in [21], [58] are not SNN-based transformers, but rather ANN-based or mixed models. The two recent works [46], [47] are most related to our proposed spiking spatiotemporal transformer. In MA-SNN [46], multi-dimensional attention is proposed in an SNNs framework, yet this attention is instead based on real values of membrane potentials, thus in a sense violating the efficiency design of SNNs.…”
Section: Related Work
mentioning
confidence: 99%
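The citation statement above notes that the multi-dimensional attention in MA-SNN [46] is computed from real-valued membrane potentials rather than from binary spikes. Below is a minimal sketch of that idea, assuming a squeeze-and-excitation-style channel attention applied to the membrane potential of a leaky integrate-and-fire (LIF) neuron before spiking; the module and function names (MembranePotentialAttention, lif_step) and all hyperparameters are illustrative assumptions, not MA-SNN's actual implementation.

```python
# Hedged sketch: channel attention over real-valued membrane potentials
# inside a LIF update step, in the spirit of the mechanism the citing
# paper attributes to MA-SNN [46]. Names and parameters are illustrative.
import torch
import torch.nn as nn


class MembranePotentialAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention on membrane potentials."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: membrane potential, shape (batch, channels, height, width).
        # The attention weights are real-valued, which is the efficiency
        # concern raised in the citation statement.
        s = u.mean(dim=(2, 3))                      # squeeze spatial dims
        w = self.fc(s).unsqueeze(-1).unsqueeze(-1)  # (batch, channels, 1, 1)
        return u * w                                # re-weighted potential


def lif_step(x, u, attn, tau=2.0, v_th=1.0):
    """One LIF time step with attention applied to the membrane potential."""
    u = u + (x - u) / tau            # leaky integration of the input current
    u = attn(u)                      # attention acts on the real-valued u
    spike = (u >= v_th).float()      # binary spike emission
    u = u * (1.0 - spike)            # hard reset after spiking
    return spike, u


if __name__ == "__main__":
    attn = MembranePotentialAttention(channels=8)
    x = torch.randn(2, 8, 16, 16)    # input current at one time step
    u = torch.zeros_like(x)          # initial membrane potential
    spike, u = lif_step(x, u, attn)
    print(spike.shape, u.shape)
```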