2021
DOI: 10.1007/978-3-030-86383-8_15
SiamSNN: Siamese Spiking Neural Networks for Energy-Efficient Object Tracking

Abstract: Although deep neural networks (DNNs) have achieved remarkable success in various scenarios, it is difficult to deploy DNNs on many resource-limited systems due to their high energy consumption. Spiking neural networks (SNNs) are attracting increasing attention for their capability for energy-efficient computing. Recently, many works have focused on converting DNNs into SNNs with little accuracy degradation in image classification on MNIST and CIFAR-10/100. However, few studies on shortening latency, an…

Cited by 18 publications (8 citation statements)
References 55 publications
“…In this way, events are treated as spikes that can be handled directly by an SNN (Jiang et al., 2021). SiamSNN (Luo et al., 2021), a deep SNN for object tracking, uses a model converted from SiamFC and achieves low precision loss on the benchmarks. However, SiamSNN is not trained directly as an SNN; it is obtained by applying a conversion algorithm to a pretrained ANN.…”
Section: Introduction
confidence: 99%
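The ANN-to-SNN conversion mentioned above rests on a rate-coding identity: an integrate-and-fire (IF) neuron driven by a constant input fires at a rate proportional to the positive part of that input, approximating a ReLU activation. The sketch below illustrates this general principle only; the parameters and function names are illustrative and not taken from the SiamSNN paper.

```python
# Minimal sketch of the rate-coding idea behind ANN-to-SNN conversion:
# an IF neuron's firing rate approximates ReLU for inputs in [0, 1].

def if_neuron_rate(inp, threshold=1.0, timesteps=1000):
    """Simulate an integrate-and-fire neuron; return its firing rate."""
    v, spikes = 0.0, 0
    for _ in range(timesteps):
        v += inp                # integrate the (constant) input current
        if v >= threshold:      # fire, then reset by subtraction
            spikes += 1
            v -= threshold
    return spikes / timesteps

def relu(x):
    return max(0.0, x)

# Firing rate tracks ReLU(x) for 0 <= x <= 1 (rates saturate at 1 spike/step).
for x in (-0.5, 0.0, 0.3, 0.7):
    print(f"x={x:+.1f}  relu={relu(x):.2f}  snn_rate={if_neuron_rate(x):.3f}")
```

Because the match between firing rate and activation only becomes exact over many timesteps, converted SNNs trade latency for accuracy, which is the tension the excerpt above alludes to.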
“…Therefore, $\mathbf{X}$ is obtained from $M$ measurements, with $M > K\log N$, as shown in the following formula [7]: $\mathbf{X} = \boldsymbol{\Psi}\mathbf{S}$, where $\mathbf{S}$ is the matrix of size $(N \times N)$ containing the sparse coefficients of $\mathbf{X}$. On the other hand, the image can be represented through the measurements $\mathbf{Y}$ with fewer samples as follows [29]: $\mathbf{Y} = \boldsymbol{\phi}\mathbf{X} = \boldsymbol{\phi}\boldsymbol{\Psi}\mathbf{S} + \mathbf{W}$, where $\mathbf{W}$ is the additive white Gaussian noise matrix while $\boldsymbol{\phi}$ is a random measurement matrix of size $(M \times N)$. Note that if $\mathbf{S}$ is $K$-sparse and $M > K\log N$, then $\mathbf{X}$ can be reconstructed by solving the $\ell_1$-norm minimization problem as follows [32]: $\arg\min_{\mathbf{X}} \|\mathbf{X}\|_{\ell_1}, \; \text{s.t.} \; \|\boldsymbol{\phi}\mathbf{X} - \mathbf{Y}\|_{\ell_2} < \epsilon$, where $\ell_1$ and $\ell_2$ are the corresponding norms in Equation ().…”
Section: Energy Efficient Compressive Sensing Methods
confidence: 99%
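The $\ell_1$ recovery problem in the excerpt above can be solved with iterative soft-thresholding (ISTA), a standard first-order method for the Lagrangian form of that problem. The sketch below is a minimal illustration under assumed problem sizes and an identity sparsifying basis ($\boldsymbol{\Psi} = I$); none of the specific values come from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: signal length N, measurements M, sparsity K.
N, M, K = 128, 64, 5

# K-sparse signal X (taking the sparsifying basis Psi = I for simplicity).
X = np.zeros(N)
X[rng.choice(N, K, replace=False)] = rng.normal(size=K)

# Random Gaussian measurement matrix phi (M x N) with ~unit-norm columns.
phi = rng.normal(size=(M, N)) / np.sqrt(M)

# Noisy measurements: Y = phi X + W.
Y = phi @ X + 0.01 * rng.normal(size=M)

def ista(phi, Y, lam=0.05, iters=500):
    """ISTA for min 0.5*||phi x - Y||_2^2 + lam*||x||_1."""
    L = np.linalg.norm(phi, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(phi.shape[1])
    for _ in range(iters):
        z = x - phi.T @ (phi @ x - Y) / L    # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

X_hat = ista(phi, Y)
err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

With $M > K\log N$ random Gaussian measurements, the $K$-sparse signal is recovered up to noise and the soft-thresholding shrinkage bias, consistent with the condition stated in the excerpt.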
“…Limited computational resources constrain many application scenarios of downstream vision tasks, and the low-power property of SNNs is well-suited to such settings. Currently, SNNs have been applied to several tasks, such as object detection [13, 29, 50, 51, 52], optical flow estimation [53, 54, 55], and object tracking [56, 57]. Reference [58] is the first and currently the only SNN work on semantic segmentation.…”
Section: Related Work
confidence: 99%