Proceedings of the 29th ACM International Conference on Multimedia 2021
DOI: 10.1145/3474085.3475474
Exploring Gradient Flow Based Saliency for DNN Model Compression

Abstract: Model pruning aims to reduce the size or computational overhead of a deep neural network (DNN) model. Traditional pruning methods such as ℓ1 pruning, which evaluate channel significance for a DNN, pay too much attention to the local analysis of each channel and use the magnitude of the entire feature while ignoring its relevance to the batch normalization (BN) and ReLU layers that follow each convolutional operation. To overcome these problems, we propose a new model pruning method from a new perspective of…
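For context, the baseline the abstract critiques is classic ℓ1 channel pruning: rank each convolutional output channel by the ℓ1 norm of its filter weights and remove the lowest-ranked channels. The sketch below illustrates that baseline only (not the gradient-flow saliency the paper proposes); the function names and the toy weight tensor are illustrative assumptions.

```python
import numpy as np

def l1_channel_saliency(conv_weight):
    """Per-output-channel L1 saliency for a conv weight of shape
    (out_channels, in_channels, kH, kW), as in classic L1-norm pruning."""
    return np.abs(conv_weight).sum(axis=(1, 2, 3))

def channels_to_prune(conv_weight, prune_ratio):
    """Indices of the lowest-saliency output channels to remove."""
    saliency = l1_channel_saliency(conv_weight)
    n_prune = int(len(saliency) * prune_ratio)
    return np.argsort(saliency)[:n_prune]

# Toy example: 8 output channels, prune the 25% with the smallest L1 norm.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4, 3, 3))
idx = channels_to_prune(w, 0.25)
print(sorted(idx.tolist()))
```

Note that this criterion looks only at each filter's weight magnitude in isolation, which is exactly the "local analysis" limitation the abstract points out: it ignores how the subsequent BN scaling and ReLU gating reshape the channel's actual contribution.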

Cited by 5 publications (4 citation statements); References 51 publications.
“…Pruning MobileNet-v2 presents a significant challenge due to its exceedingly low computational cost. However, CORING exhibits superior performance compared to other candidate methods [45, 66], achieving a top-1 accuracy of 94.81% while pruning 42% of the network's MACs, as illustrated in Table 5. Even when compressing more than 60% of the network, the accuracy is not reduced, which suggests that CORING can be applied to optimize the hand-crafted design networks.…”
Section: Results and Analysis (confidence: 99%)
“…4, illustrating the effectiveness of CORING.…”
[Figure 4: Pruning methods for VGG-16 baseline on CIFAR-10, comparing LAASP (2023) [16], HRel (2022) [56], DECORE (2022) [1], EZCrop (2022) [40], FPFS (2022) [79], WSP (2021) [18], CHIP (2021) [63], GFBS (2021) [45], and HRank (2020) [38].]
Section: Results and Analysis (confidence: 99%)