2019 10th International Conference on Information, Intelligence, Systems and Applications (IISA)
DOI: 10.1109/iisa.2019.8900711
Dynamic Pruning of CNN networks

Cited by 5 publications (2 citation statements) · References 2 publications
“…This method outperforms the state-of-the-art pruning methods in terms of parameters and FLOPs when tested on the CIFAR dataset with the VGG-16, ResNet-20 and DenseNet-40 models. A dynamic channel pruning method is presented in [20], which adds channel-wise sparsity at the output of each convolutional layer by introducing a Learning Kernel-Activation switch Module (LKAM). This module activates and de-activates each kernel dynamically depending on the input content.…”
Section: A Network Compression Using Pruning Methods
confidence: 99%
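The excerpt above describes an input-dependent switch that turns each convolutional kernel on or off per input. A minimal sketch of that idea, assuming a simple mean-activation threshold as a stand-in for the learned gating that the actual LKAM module uses (names, the threshold, and `channel_gates`/`apply_gates` are illustrative, not the paper's API):

```python
# Sketch of LKAM-style dynamic channel gating: decide per input which
# output channels (kernels) stay active, then zero the de-activated ones.
# The real module LEARNS the gate; this toy version thresholds the
# channel's mean activation as a placeholder decision rule.

def channel_gates(feature_maps, threshold=0.1):
    """feature_maps: one list of activation values per channel.
    Returns a 0/1 gate per channel: 1 keeps the kernel active."""
    gates = []
    for fm in feature_maps:
        mean_act = sum(fm) / len(fm)
        gates.append(1 if mean_act > threshold else 0)
    return gates

def apply_gates(feature_maps, gates):
    # De-activated channels are zeroed, giving input-dependent
    # channel-wise sparsity at the layer's output.
    return [[v * g for v in fm] for fm, g in zip(feature_maps, gates)]

maps = [[0.5, 0.7], [0.01, 0.02], [0.3, 0.0]]
gates = channel_gates(maps)        # second channel is switched off
sparse_out = apply_gates(maps, gates)
```

Because the gates depend on the current input's activations, a different input can activate a different subset of kernels, which is what distinguishes dynamic pruning from static, one-off filter removal.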
“…This whole process is carried out before applying any compression method. With the proposed network compression scheme, we first compress the network using Soft Filter Pruning (SFP) and evaluate the system at pruning ratios (P%) of 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 85, 90 and 95. The model's accuracy and error have been determined at each pruning ratio, and the learning curves are shown in Figure 8.…”
Section: Evaluation Of Soft Filter Pruning (SFP) Algorithm
confidence: 99%
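The sweep described above can be sketched as follows. This is a toy illustration, not the cited paper's code: SFP zeroes the lowest-norm filters rather than removing them, so they remain trainable and can recover in later epochs; the `filter_norms` data and the loop body are hypothetical.

```python
# Sketch of a Soft Filter Pruning (SFP) evaluation sweep over the
# pruning ratios 10%, 15%, ..., 95% mentioned in the excerpt.

def soft_filter_prune(filter_norms, pruning_ratio):
    """Soft-prune a layer: zero the lowest-norm filters.
    filter_norms: one L2 norm per filter.
    pruning_ratio: percentage of filters to prune (e.g. 30 for 30%).
    Zeroed filters are kept in the model and may be updated again."""
    n_prune = int(len(filter_norms) * pruning_ratio / 100)
    ranked = sorted(range(len(filter_norms)), key=lambda i: filter_norms[i])
    pruned = set(ranked[:n_prune])
    return [0.0 if i in pruned else norm
            for i, norm in enumerate(filter_norms)]

norms = [0.9, 0.1, 0.5, 0.3, 0.7, 0.2, 0.8, 0.4]
for p in range(10, 100, 5):        # 10, 15, ..., 95 (18 ratios)
    pruned_norms = soft_filter_prune(norms, p)
    # accuracy and error would be measured here at each ratio,
    # producing one learning curve per pruning ratio
```

Sweeping all 18 ratios is what produces the per-ratio accuracy/error curves the excerpt refers to in its Figure 8.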