2018 ACM/IEEE 45th Annual International Symposium on Computer Architecture (ISCA)
DOI: 10.1109/isca.2018.00061
SnaPEA: Predictive Early Activation for Reducing Computation in Deep Convolutional Neural Networks

Cited by 135 publications (78 citation statements)
References 25 publications
“…Ganax [58] uses a SIMD-MIMD architecture to support DNNs and generative models. Snapea [59] employs early termination to skip computations. Instruction Sets for DNNs.…”
Section: Related Work (mentioning)
confidence: 99%
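The early termination mentioned in this citation statement can be illustrated with a minimal Python sketch. This is not the SnaPEA hardware pipeline itself, only the underlying idea under one assumption: the activations entering the dot product are non-negative (they come out of a preceding ReLU), so once the positive weights have been accumulated the partial sum can only decrease, and a negative running sum guarantees the final ReLU output is zero. Function and variable names are illustrative.

```python
import numpy as np

def relu_dot_early_exit(weights, activations):
    """Dot product followed by ReLU, with early termination.

    Sketch of early termination for a ReLU-gated neuron: weights are reordered
    so positive weights are accumulated first. In the negative-weight phase the
    partial sum can only decrease (activations are assumed non-negative), so as
    soon as it drops to or below zero the ReLU output is guaranteed to be zero
    and the remaining multiply-accumulates can be skipped.
    """
    order = np.argsort(-weights)           # positive weights first
    w, x = weights[order], activations[order]
    acc = 0.0
    for i in range(len(w)):
        acc += w[i] * x[i]
        if w[i] < 0 and acc <= 0.0:        # sum can only keep falling
            return 0.0                     # ReLU would zero it anyway
    return max(acc, 0.0)

# Example: the loop stops one MAC early once the running sum turns negative.
w = np.array([0.5, -1.0, 0.2, -0.8])
x = np.array([0.1, 0.9, 0.3, 0.7])
print(relu_dot_early_exit(w, x))           # 0.0
```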
“…In contrast, this paper proposes an efficient model for classification of Sigfox, LoRA, IEEE 802.15 [10]. This, however, also requires expensive and energy consuming high-end hardware to process such a vast amount of data at a high speed.…”
Section: Related Work (mentioning)
confidence: 99%
“…Compared to the related work, this model offers 75 to 580 times less parameters. Another way to specify the model's computational footprint is to look at the amount of multiply-accumulate (MAC) operations in the convolutional layers [15]. Moreover, a linear relationship between the MAC count and energy consumption of CNN models exists which can help to compare models suitable for embedded devices [16].…”
Section: B. Model Architecture (mentioning)
confidence: 99%
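As a rough illustration of the MAC-count metric mentioned in that statement, the following sketch computes the multiply-accumulate operations of a single standard convolutional layer from its hyperparameters. The layer dimensions in the usage example are arbitrary, and the function name is illustrative.

```python
def conv_mac_count(out_h, out_w, out_channels, in_channels, k_h, k_w):
    """MAC operations in one standard convolutional layer: every output
    element needs k_h * k_w * in_channels multiply-accumulates."""
    return out_h * out_w * out_channels * in_channels * k_h * k_w

# Example: a 3x3 convolution with 64 input and 128 output channels
# producing a 56x56 output feature map.
macs = conv_mac_count(56, 56, 128, 64, 3, 3)
print(f"{macs / 1e6:.1f} M MACs")          # ~231.2 M MACs
```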
“…Output sparsity. In addition to exploiting input neuron sparsity, x_t and h_t, prior work exploits sparsity in output neurons [14], [64]. This requires predicting output neurons that will be masked by ReLU.…”
Section: A. Design Space Exploration (mentioning)
confidence: 99%
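The prediction step referenced here can be sketched generically in Python. This is only an illustration of the idea (a cheap partial sum over a few large-magnitude weights guesses the sign of each output neuron, and neurons predicted negative are skipped because ReLU would zero them), not the specific predictors of [14] or [64]; the n_probe parameter and function names are assumptions for the example.

```python
import numpy as np

def predict_then_compute(W, x, n_probe=4):
    """Sketch of exploiting output sparsity: for each output neuron, a cheap
    partial sum over the n_probe largest-magnitude weights predicts the sign.
    Neurons predicted negative are skipped; only the rest get the full dot
    product plus ReLU. Real predictors handle mispredictions more carefully."""
    out = np.zeros(W.shape[0])
    for j in range(W.shape[0]):
        probe = np.argsort(-np.abs(W[j]))[:n_probe]   # most significant taps
        if np.dot(W[j, probe], x[probe]) <= 0.0:
            continue                                  # predicted ReLU-masked
        out[j] = max(np.dot(W[j], x), 0.0)            # full MACs + ReLU
    return out

# Usage example with random weights and non-negative (post-ReLU) inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 64))
x = np.abs(rng.standard_normal(64))
print(predict_then_compute(W, x))
```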