2022
DOI: 10.1007/978-3-031-08421-8_20

A Relevance-Based CNN Trimming Method for Low-Resources Embedded Vision

Abstract: A significant amount of Deep Learning research deals with the reduction of network complexity. In most scenarios the preservation of very high performance has priority over size reduction. However, when dealing with embedded systems, the limited amount of resources forces a switch in perspective. In fact, being able to dramatically reduce complexity could be a stronger requisite for overall feasibility than excellent performance. In this paper we propose a simple-to-implement yet effective method to largely re…

Cited by 2 publications (1 citation statement)
References: 27 publications
“…The fully-connected layer produced an output tensor of size 3 (one-hot encoding of the 3 classes) through softmax activation. With a simple pruning procedure, similar to [53], we reduced the size of the network, removing unnecessary kernels. The algorithm ranks the filters based on their output (after the activation) when predicting a random sample of the training set.…”
Section: Results
Citation type: mentioning (confidence: 99%)
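The pruning step described in the citation statement can be sketched in a few lines. The following Python/PyTorch snippet is a hypothetical illustration, not the authors' implementation: it ranks the convolutional filters of a toy 3-class network by their mean post-activation response on a random sample of training inputs and suppresses the weakest ones. The network layout, the `keep_ratio` parameter, and all helper names are assumptions made for illustration.

```python
# Hypothetical sketch of activation-based filter ranking, loosely following the
# cited description: rank conv filters by their mean post-activation response
# on a random sample of the training set, then drop the lowest-ranked ones.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy network standing in for the pruned model (illustrative only)."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.act = nn.ReLU()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(16, num_classes)  # softmax applied in the loss / at inference

    def forward(self, x):
        x = self.pool(self.act(self.conv(x))).flatten(1)
        return self.fc(x)

@torch.no_grad()
def rank_filters(model: SmallCNN, sample: torch.Tensor) -> torch.Tensor:
    """Return filter indices of model.conv, sorted from least to most active."""
    feats = model.act(model.conv(sample))      # (N, C, H, W), post-activation
    score = feats.abs().mean(dim=(0, 2, 3))    # one relevance score per filter
    return torch.argsort(score)                # ascending: weakest filters first

@torch.no_grad()
def prune_weakest(model: SmallCNN, sample: torch.Tensor, keep_ratio: float = 0.5):
    """Zero out the weakest filters; a real pipeline would rebuild smaller layers."""
    order = rank_filters(model, sample)
    n_drop = int(len(order) * (1.0 - keep_ratio))
    for idx in order[:n_drop]:
        model.conv.weight[idx].zero_()
        model.conv.bias[idx].zero_()
        model.fc.weight[:, idx].zero_()        # downstream weights reading that channel

if __name__ == "__main__":
    model = SmallCNN()
    random_sample = torch.randn(32, 3, 64, 64)  # stand-in for a random training subset
    prune_weakest(model, random_sample, keep_ratio=0.5)
```

For simplicity the sketch only zeroes the selected kernels; an embedded deployment would instead rebuild the convolution and the following fully-connected layer with fewer channels so that memory and compute actually shrink.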