2019
DOI: 10.1109/tc.2019.2924215

SqueezeFlow: A Sparse CNN Accelerator Exploiting Concise Convolution Rules

Cited by 39 publications (33 citation statements). References 43 publications.
“…The first comes at no clock-cycle cost, while the second allows operating directly on compressed data thanks to its hardware-friendly representation. SqueezeFlow [226] takes a different approach, introducing concise convolution rules. These rules reduce computation by skipping useless operations on null values.…”
Section: Run Length Coding (RLC) (mentioning)
confidence: 99%
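To illustrate the idea behind skipping operations on null values, here is a minimal software sketch of a 1-D convolution that performs a multiply-accumulate only when both the activation and the weight are nonzero. This is an assumption-laden illustration of zero-skipping in general, not SqueezeFlow's actual concise convolution rules or hardware dataflow.

```python
# Illustrative zero-skipping 1-D convolution (software sketch only; the
# actual accelerator implements this in hardware with its own dataflow).
def sparse_conv1d(inputs, weights):
    out_len = len(inputs) - len(weights) + 1
    out = [0.0] * out_len
    for i, x in enumerate(inputs):
        if x == 0:          # skip null activations entirely
            continue
        for j, w in enumerate(weights):
            if w == 0:      # skip null weights
                continue
            pos = i - j     # scatter the product to the output it contributes to
            if 0 <= pos < out_len:
                out[pos] += x * w
    return out
```

With sparse inputs and weights, most iterations of the inner loop are skipped, which is the source of the computation savings the statement refers to.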
“…In order for JAPARI to support such accelerators that do not follow the channel-major computation order, the footprint appending (Section 4.3) and footprint representation (Section 4.4) schemes need to be appropriately adapted to the used order. Recent custom accelerators have focused on new strategies such as exploiting model sparsity [26,48] and variable bitwidths [61,62] to further improve inference performance and energy efficiency. Accelerators that exploit model sparsity rely on a special irregular data format to represent sparse matrices.…”
Section: Support For Other Accelerators (mentioning)
confidence: 99%
“…ZeNA [72] is the first zero-aware accelerator able to skip operations whose results are nullified by either null weights or null activations. SqueezeFlow [25] takes a mathematical approach to the sparsity problem, introducing concise convolution rules to avoid operations with null results; its RLC scheme is applied to the weights.…”
Section: Accelerators With Sparsity Exploitation (mentioning)
confidence: 99%
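The run-length coding of weights mentioned here can be sketched in a few lines: zeros are not stored explicitly, only a count of zeros preceding each nonzero value. This is a generic software illustration of RLC under assumed conventions, not the bit-level format any particular accelerator uses.

```python
# Generic run-length coding of a sparse weight stream: each nonzero weight
# is stored as (zero_run, value), i.e. the number of zeros preceding it.
def rlc_encode(stream):
    pairs, run = [], 0
    for v in stream:
        if v == 0:
            run += 1
        else:
            pairs.append((run, v))
            run = 0
    return pairs, run  # trailing zero count returned separately

def rlc_decode(pairs, trailing_zeros):
    out = []
    for zeros, v in pairs:
        out.extend([0] * zeros)
        out.append(v)
    out.extend([0] * trailing_zeros)
    return out
```

The compressed stream shrinks in proportion to the weight sparsity, which is why the representation is considered hardware-friendly for sparse models.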
“…Finally, sparsity is a technique actively used to eliminate unnecessary operations and to further lower the power envelope of accelerators, as well as to make them faster and more effective. This also motivates dedicated sparse DNN accelerators such as [24][25][26].…”
Section: Introduction (mentioning)
confidence: 97%