2019 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv.2019.00134

AddressNet: Shift-Based Primitives for Efficient Convolutional Neural Networks

Cited by 29 publications (28 citation statements)
References 17 publications
“…Inference Latency: We also evaluate the inference time of our improved VGG-16 Faster R-CNN on a single GTX 1080 Ti GPU with CUDA 8 and cuDNN 6, as it is crucial for resource-limited applications [51,20,23,19,32]. As shown in Table 2, our approach increases latency by only 2 ms on GPU.…”
Section: Soft-NMS (mentioning)
confidence: 99%
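
The 2 ms figure quoted above comes from direct GPU timing. For orientation, a minimal sketch of the usual measurement pattern follows, assuming PyTorch and torchvision (the cited work used CUDA 8 / cuDNN 6-era tooling, so this illustrates the general technique rather than their exact setup):

import time
import torch
import torchvision

# Hedged sketch: a VGG-16 backbone stands in for the detector; the model,
# input size, and iteration counts are illustrative choices, not the
# cited authors' protocol.
model = torchvision.models.vgg16(weights=None).cuda().eval()
x = torch.randn(1, 3, 224, 224, device="cuda")

with torch.no_grad():
    for _ in range(10):              # warm-up: cuDNN autotuning, caches
        model(x)
    torch.cuda.synchronize()         # kernels launch asynchronously,
    start = time.perf_counter()      # so synchronize around the timer
    for _ in range(100):
        model(x)
    torch.cuda.synchronize()
    print(f"mean latency: {(time.perf_counter() - start) / 100 * 1e3:.2f} ms")

Without the synchronize() calls the timer would measure kernel launch overhead rather than execution time, which is why naive timings of asynchronous CUDA code often look implausibly fast.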
“…Other works have also investigated varying shift operations for image classification. Closely related to this work is that of [11], who vary the (discrete) neighbourhood of shifts for miniature images. They then build a compact model for large images (67.0% accuracy), though the FLOPs and parameter count of this model were not reported.…”
Section: Related Work (mentioning)
confidence: 99%
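
For context on the primitive these statements discuss: a discrete shift layer displaces each channel group of a feature map by a small integer offset instead of applying a spatial convolution. A minimal sketch, assuming PyTorch; discrete_shift and the per-group offsets list are hypothetical names, and the single-pixel padding assumes unit shifts as in a 3x3 neighbourhood:

import torch
import torch.nn.functional as F

def discrete_shift(x, offsets):
    # Split the channels of x (N, C, H, W) into len(offsets) groups and
    # displace each group by its integer (dy, dx); zeros fill the borders.
    # Assumes |dy| <= 1 and |dx| <= 1, matching the one-pixel pad below.
    n, c, h, w = x.shape
    groups = torch.chunk(x, len(offsets), dim=1)
    out = []
    for g, (dy, dx) in zip(groups, offsets):
        g = F.pad(g, (1, 1, 1, 1))   # pad (left, right, top, bottom)
        out.append(g[:, :, 1 - dy:1 - dy + h, 1 - dx:1 - dx + w])
    return torch.cat(out, dim=1)

# The 3x3 neighbourhood of unit shifts: nine offsets, one per channel group.
neighbourhood = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
y = discrete_shift(torch.randn(2, 18, 8, 8), neighbourhood)

Because every displacement is an integer, each output group is just a re-indexed crop of the padded input, which is what makes shift layers attractive as near-zero-FLOP replacements for spatial convolution.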
“…However, active shifts require additional FLOPs to calculate these interpolations. Further, because activation-map motion is non-integer, active shifts always require additional activation-map copies in any implementation [11]. [16] also focus on optimising network architectures for miniature image datasets and for computationally constrained models on ImageNet, while we go beyond small datasets and compact networks for shifting.…”
Section: Related Work (mentioning)
confidence: 99%
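
The copy-versus-addressing distinction drawn in this statement can be made concrete: an integer shift is pure index arithmetic on an existing buffer, whereas a fractional (active) shift must interpolate between grid points and therefore materialize a new tensor. A sketch assuming PyTorch, with the shift amounts chosen arbitrarily for illustration:

import torch
import torch.nn.functional as F

x = torch.randn(1, 4, 8, 8)

# Integer shift: basic indexing returns a view of the same storage, so a
# real implementation can realize the shift as an address offset (plus
# zeroed borders) with no interpolation FLOPs and no copy.
shifted_view = x[:, :, 1:, :]        # "shift up by one row" as a view

# Active (fractional) shift: sampling at non-grid locations forces
# bilinear interpolation, which writes a freshly allocated output.
dy = 0.5                             # hypothetical learned sub-pixel shift
theta = torch.tensor([[[1.0, 0.0, 0.0],
                       [0.0, 1.0, 2 * dy / x.size(2)]]])
grid = F.affine_grid(theta, list(x.shape), align_corners=False)
y = F.grid_sample(x, grid, align_corners=False)  # new activation copy

This is the trade-off the statement refers to: learned fractional shifts are more expressive, but they pay for it in interpolation FLOPs and an unavoidable extra activation buffer.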
“…FE-Net [4] develops a novel component called the Sparse Shift Layer (SSL), which suppresses redundant shift operations by adding a displacement penalty during optimization. The three shift-based primitives presented in AddressNet [13] significantly reduce inference time by minimizing the memory copies incurred by shift operations. Meanwhile, the work of [21] proposes the Temporal Shift Module (TSM) to efficiently model both temporal and spatial information.…”
Section: Shift-Based Network (mentioning)
confidence: 99%
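
To ground the last reference, here is a minimal sketch in the spirit of the Temporal Shift Module [21], assuming PyTorch; temporal_shift is a hypothetical name, and fold_div=8 mirrors the commonly cited default of shifting one eighth of the channels in each temporal direction:

import torch

def temporal_shift(x, fold_div=8):
    # x has shape (N, T, C, H, W). Shift 1/fold_div of the channels one
    # step backward in time, another 1/fold_div forward, and leave the
    # rest in place; zeros fill the temporal borders.
    n, t, c, h, w = x.shape
    fold = c // fold_div
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                   # from the future
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]   # from the past
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]              # identity
    return out

y = temporal_shift(torch.randn(2, 8, 64, 14, 14))

Exchanging a slice of channels between neighbouring frames lets a purely 2D backbone mix temporal information at essentially zero FLOP cost, which is the efficiency argument running through all three works cited in this statement.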