2022
DOI: 10.1016/j.cviu.2022.103511
Adaptive CNN filter pruning using global importance metric

Cited by 19 publications (8 citation statements). References 24 publications.
“…The choice of those criteria and partition rates is motivated by the results in Table 1 and Table 3 for pruning VGGNet and ResNet models on the CIFAR-10 dataset. Table 5 demonstrates the superiority of our method over several other state-of-the-art pruning methods, such as SFP [13], FPGM [15], MetaPruning [51], MFP [16], LFC [52], FuPruner [53], GFI-AP [25], etc., when pruning ResNet on the ImageNet dataset.…”
Section: ResNet on ImageNet
confidence: 85%
“…Weight or connection pruning eliminates unimportant connections from the network, resulting in an unstructured sparse network [10–12,28]. Filter or channel pruning eliminates entire filters or channels, retaining the original structure of the network [13,14,16,24–27,29]. In contrast to filter pruning, the sparse networks produced by weight pruning require specialized hardware or software to achieve practical acceleration.…”
Section: Related Work
confidence: 99%
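The distinction drawn above can be made concrete with a minimal sketch of filter pruning. This is a generic illustration using the common L1-norm importance criterion from the filter-pruning literature, not the adaptive global importance metric of the cited paper; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def prune_filters_l1(weights, keep_ratio=0.5):
    """Rank conv filters by L1 norm and keep the top fraction.

    weights: array of shape (num_filters, in_channels, kH, kW).
    Returns the indices of retained filters and the pruned tensor.
    """
    # Importance score per filter: sum of absolute weights (L1 norm).
    scores = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    # Indices of the most important filters, restored to ascending order.
    keep = np.sort(np.argsort(scores)[-n_keep:])
    return keep, weights[keep]

# Toy example: 8 random 3x3x3 filters, keep half of them.
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))
kept_idx, pruned = prune_filters_l1(w, keep_ratio=0.5)
print(kept_idx.shape[0], pruned.shape)
```

Because whole filters are removed, the pruned layer remains a dense, smaller convolution that any framework can execute directly, whereas weight pruning would leave a sparse tensor of the original shape.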
“…Deep learning-based methods in particular require high-capacity computing resources due to their large number of parameters. To overcome this, studies [22–25] have been carried out to reduce the number of parameters of deep learning architectures. In this study, first, the state-of-the-art deep learning models GoogLeNet [26], AlexNet [27], VGG19 [28], ResNet-50, and ResNet-101 [29] were used to achieve high accuracy in classifying scenes in aerial images.…”
Section: Introduction
confidence: 99%