2020
DOI: 10.1007/978-3-030-58607-2_14

Thanks for Nothing: Predicting Zero-Valued Activations with Lightweight Convolutional Neural Networks

Cited by 20 publications (11 citation statements); references 25 publications.
“…The min-max statistics are gathered during a quick preprocessing stage on 2K randomly picked images from the training set. In addition, during preprocessing, we recalibrate the BatchNorm layers' running mean and running variance statistics [25,29,31,32]. In all models, the first convolution layer is left intact, since its input activations, which correspond to the image pixels, do not include many zero values, if any.…”
Section: Methods
confidence: 99%
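The excerpt above describes recalibrating the running mean and running variance of BatchNorm layers during a short preprocessing pass over calibration images. A minimal sketch of that statistics update, written in plain NumPy (the function name, the `(N, C)` layout, and the initialization from the first batch are our own illustrative choices, not details from the paper):

```python
import numpy as np

def recalibrate_bn_stats(batches, momentum=0.1):
    """Recompute BatchNorm-style running mean/variance over calibration batches.

    `batches` is an iterable of arrays shaped (N, C). The running statistics
    are updated with the same exponential moving average a BatchNorm layer
    applies in training mode. Illustrative sketch only; assumptions noted
    in the lead-in.
    """
    running_mean = None
    running_var = None
    for x in batches:
        batch_mean = x.mean(axis=0)
        batch_var = x.var(axis=0)
        if running_mean is None:
            # Initialize from the first calibration batch (a sketch choice).
            running_mean, running_var = batch_mean, batch_var
        else:
            running_mean = (1 - momentum) * running_mean + momentum * batch_mean
            running_var = (1 - momentum) * running_var + momentum * batch_var
    return running_mean, running_var
```

After enough calibration batches, the running statistics settle near the true per-channel mean and variance of the calibration data, which is what the quick preprocessing stage relies on.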
“…Another representative work by the same authors [14] follows a different approach to exploit spatial correlation. They design a small CNN, called ZAP, to predict whether individual ReLU inputs in convolutional layers will be positive or negative.…”
Section: ReLU Output Prediction Based On Spatial Correlation
confidence: 99%
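The ZAP idea quoted above — a cheap predictor guessing the sign of each pre-ReLU activation so that outputs predicted negative can be zeroed without running the full convolution — can be sketched in one dimension. This is our illustrative reconstruction, not the authors' code; both paths are computed here so the sketch stays runnable, whereas in hardware the heavy computation would be skipped at predicted-negative positions:

```python
import numpy as np

def zap_style_forward(x, heavy_filter, light_filter, threshold=0.0):
    """Sketch of ZAP-style zero-activation prediction (illustrative).

    A cheap 'light' convolution predicts the sign of each pre-ReLU output
    of the expensive 'heavy' convolution; positions predicted negative are
    forced to zero directly.
    """
    heavy = np.convolve(x, heavy_filter, mode="same")   # expensive path
    light = np.convolve(x, light_filter, mode="same")   # cheap predictor
    predicted_positive = light > threshold
    # ReLU with predicted-negative positions zeroed; a mispredicting
    # light filter may occasionally zero a truly positive activation.
    return np.where(predicted_positive, np.maximum(heavy, 0.0), 0.0)
```

With a perfect predictor (the light path agreeing in sign with the heavy path), the output matches a plain ReLU exactly; prediction errors only ever replace positive activations with zeros, which is the accuracy/compute trade-off the cited work studies.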
“…[33] employs reinforcement learning to prune channels, and Refs. [34,35] leverage spatial correlations of CNN OFMs to predict and prune zero-value activations. Further pruning techniques based on weight magnitudes were recently introduced in Refs.…”
Section: Prior Work
confidence: 99%
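Among the pruning families listed in the excerpt, magnitude-based weight pruning is the simplest to state concretely: zero out the weights with the smallest absolute values. A minimal sketch (our own illustration of the general technique, not code from any of the cited references):

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights (illustrative).

    `sparsity` is the target fraction of weights to remove; weights whose
    magnitude falls at or below the resulting threshold are set to zero.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)
```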