2022
DOI: 10.1109/JIOT.2022.3190873
QSFM: Model Pruning Based on Quantified Similarity Between Feature Maps for AI on Edge

Cited by 15 publications (6 citation statements)
References 28 publications
“…In the scenario involving significant compression, with approximately 70% compression, CORING outperforms a recent SOTA method [34] in all aspects.

Method           Top-1 Acc. (%)  Params (↓%)     FLOPs (↓%)
FPAC [76]        93.66           0.39M (61.9)    113.08M (59.9)
HRank-2 [38]     93.68           0.48M (53.8)    110.15M (61.0)
EZCrop [40]      93.76           0.39M (61.9)    113.08M (59.9)
DECORE-70 [1]    94.04           0.37M (65.0)    128.13M (54.7)
CORING (Ours)    94.30           0.45M (57.3)    134.86M (53.5)
QSFM-PSNR [73]   92.06           1.67M (25.4)    57.27M (39.4)
DMC [15]         94.49           N/A             56.72M (40.0)
SCOP [66]        94.24           1.43M (36.1)    56.44M (40.3)
GFBS [45]        94.25           N/A             54.83M (42.0)
CORING (Ours)    94.44           0.77M (65.6)    38.00M (60.0)

DenseNet-40. Managing the DenseNet architecture can be challenging because removing a single channel requires removing that channel from all subsequent layers [1].…”
Section: Results and Analysis (mentioning)
confidence: 99%
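The DenseNet-40 remark above is easy to see in code. Below is a minimal sketch (the input width and growth rate are illustrative placeholders, not taken from any cited implementation) of why pruning one channel in a dense block cascades through every later layer:

```python
import torch
import torch.nn as nn

# Two steps of a dense block: each conv consumes the concatenation of
# everything produced before it (growth rate and shapes are illustrative).
growth = 12
conv1 = nn.Conv2d(16, growth, 3, padding=1)
conv2 = nn.Conv2d(16 + growth, growth, 3, padding=1)

x = torch.randn(1, 16, 8, 8)
y1 = conv1(x)
y2 = conv2(torch.cat([x, y1], dim=1))  # conv2 sees x AND conv1's output

# Pruning output channel k of conv1 means slicing conv1.weight along dim 0,
# AND slicing conv2.weight along dim 1 at index 16 + k. In DenseNet-40 the
# same index must be dropped from every later layer that sees the concat.
```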
“…Comparison. CORING is compared with 44 SOTAs in the fields of structured pruning [1,3,5,6,7,11,12,15,16,18,19,23,26,29,30,31,32,34,35,36,37,38,39,40,41,43,45,46,47,48,49,56,58,63,64,66,68,73,76,78,79,80,83,85]. For a fair comparison, all available baseline models are identical.…”
Section: Experimental Settings (mentioning)
confidence: 99%
“…This is much faster than the average speed of ROI generation by an IPP deployed nearshore, according to our experience (Li et al, 2022). In the future, the model can be made lighter-weight through network pruning (Molchanov et al, 2019), quantization (Wang et al, 2022), and knowledge distillation (Yim et al, 2017), to further reduce its computational demands for training and inference. These will enable the deployment of IsPlanktonFE on a cloud computing platform to facilitate next-generation in situ real-time marine plankton observations.…”
Section: Impact On Marine Plankton Observation (mentioning)
confidence: 93%
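As context for the compression routes this excerpt lists, here is a minimal sketch of the pruning route using PyTorch's built-in utilities; the model and the 30% pruning ratio are illustrative placeholders, not the IsPlanktonFE setup:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy stand-in model; any Conv2d-based network would work the same way.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1),
)

for module in model.modules():
    if isinstance(module, nn.Conv2d):
        # Structured L2-norm pruning over output channels (dim=0):
        # zeroes out the 30% of filters with the smallest norm.
        prune.ln_structured(module, name="weight", amount=0.3, n=2, dim=0)
        prune.remove(module, "weight")  # bake the mask into the weights
```

Note that this masks filters rather than physically shrinking the layers; realizing the memory and FLOPs savings reported in tables like the one above additionally requires rebuilding the layers with fewer channels.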
“…In addition, computing the statistical information of the next layer and minimizing the feature reconstruction error is also a popular idea [21,26]. Different from methods that are based on only a single filter or feature, [27,23,28] remove redundant parts of a model by measuring the difference between the filters or features.…”
Section: Related Work (mentioning)
confidence: 99%
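The difference-between-features idea this excerpt describes is the one QSFM instantiates: it quantifies the similarity between feature maps (via PSNR or SSIM) and treats highly similar channels as redundant. The following is an illustrative reconstruction of the PSNR variant, not the authors' code; the helper names and the single-input scoring are assumptions:

```python
import math
import torch

def psnr(a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> float:
    """PSNR between two same-shape feature maps, using their joint peak as MAX."""
    mse = torch.mean((a - b) ** 2).item() + eps
    peak = max(a.max().item(), b.max().item(), eps)
    return 10.0 * math.log10(peak ** 2 / mse)

def most_redundant_channel(fmaps: torch.Tensor) -> int:
    """fmaps: (C, H, W) activations of one layer for one input.
    A channel whose closest neighbour (by PSNR) is very similar carries
    little unique information, so it is a natural pruning candidate."""
    C = fmaps.shape[0]
    scores = [max(psnr(fmaps[i], fmaps[j]) for j in range(C) if j != i)
              for i in range(C)]
    return max(range(C), key=lambda i: scores[i])

# Usage with random stand-in activations:
fmaps = torch.relu(torch.randn(8, 16, 16))  # one layer's feature maps
idx = most_redundant_channel(fmaps)         # channel to consider pruning
```

In practice such scores would be averaged over a batch of inputs before any channel is removed; a single input is used here only to keep the sketch short.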