CorrNet: pearson correlation based pruning for efficient convolutional neural networks
2022 | DOI: 10.1007/s13042-022-01624-5

Cited by 7 publications (3 citation statements) | References 28 publications
“…Accomplishing these objectives requires reducing computational cost and memory requirements, which broadens the applicability of deep learning models to a wide variety of settings, namely embedded systems, real-time applications, and mobile devices. Although several approaches to compressing CNNs have been introduced recently [32,33], pruning has emerged as a popular solution, eliminating redundant weights from the initial network. Techniques related to pruning were conceived as early as the 1980s-1990s and can still be applied to deep learning networks [32].…”
Section: Literature Work
Mentioning confidence: 99%
“…Meanwhile, these methodologies fail to account for filter redundancy during pruning. To remove redundant feature maps, [33] proposed an approach based on the correlation between the feature maps generated by the corresponding filters. This technique eliminates redundant feature maps, which reduces model size and computational cost and saves many FLOPs.…”
Section: Literature Work
Mentioning confidence: 99%
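
The statement above summarizes the idea behind CorrNet: feature maps produced by different filters can be highly correlated, and the more correlated a channel is with the rest, the more redundant it is. Below is a minimal sketch of such a correlation-based scoring step in PyTorch; the function name, activation shapes, and the 30% pruning ratio are illustrative assumptions, not the authors' exact procedure.

```python
import torch

def correlation_redundancy(feature_maps: torch.Tensor) -> torch.Tensor:
    """Score each channel by its mean absolute Pearson correlation
    with the other channels' feature maps.

    feature_maps: activations of one conv layer, shape (N, C, H, W).
    Returns a (C,) tensor; higher means more redundant.
    """
    n, c, h, w = feature_maps.shape
    # Flatten each channel's responses over the batch and spatial dims.
    x = feature_maps.permute(1, 0, 2, 3).reshape(c, -1)
    x = x - x.mean(dim=1, keepdim=True)              # center each channel
    x = x / (x.norm(dim=1, keepdim=True) + 1e-8)     # unit-normalize
    corr = x @ x.T                                   # (C, C) Pearson matrix
    corr.fill_diagonal_(0.0)                         # ignore self-correlation
    return corr.abs().mean(dim=1)                    # mean |r| vs. other channels

# Usage: prune the channels most correlated with the rest, then fine-tune.
scores = correlation_redundancy(torch.randn(8, 64, 32, 32))  # dummy activations
prune_ratio = 0.3                                    # hypothetical ratio
k = int(prune_ratio * scores.numel())
redundant = torch.topk(scores, k).indices            # channel indices to remove
```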
“…Thus, it is wise to calculate the statistics of each signal over time, namely the SLTVSFs, in order to better evaluate the non-stationary random signal. With the separated signal's time-frequency spectrum, we can calculate the SLTVSFs via the weighted sliding-statistics method [23]. Four SLTVSFs are used in this paper: the average, standard deviation, maximum, and bandwidth, which are defined as follows:…”
Section: Feature Similarity Decision Criterion
Mentioning confidence: 99%
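
The exact weighted definitions are elided in the quote above, so the following is only a rough sketch of plain sliding statistics over the time axis of a spectrogram. The window length and the centroid-spread definition of bandwidth are assumptions made for illustration; the weighting scheme of [23] is omitted.

```python
import numpy as np

def sliding_time_varying_stats(spec: np.ndarray, freqs: np.ndarray,
                               win: int = 8) -> dict:
    """Sliding statistics of a time-frequency spectrum over time.

    spec:  magnitude spectrogram, shape (F, T) (freq bins x time frames).
    freqs: center frequency of each bin, shape (F,).
    win:   sliding-window length in frames (an assumed choice).
    Returns one time series per statistic.
    """
    F, T = spec.shape
    avg, std, mx, bw = [], [], [], []
    for t in range(T - win + 1):
        s = spec[:, t:t + win]          # windowed time-frequency slice
        avg.append(s.mean())
        std.append(s.std())
        mx.append(s.max())
        # Bandwidth as power-weighted spread around the spectral centroid
        # (an assumed definition; the quote elides the exact formula).
        p = s.mean(axis=1)
        p = p / (p.sum() + 1e-12)
        centroid = (freqs * p).sum()
        bw.append(np.sqrt(((freqs - centroid) ** 2 * p).sum()))
    return {k: np.array(v) for k, v in
            zip(("average", "std", "max", "bandwidth"), (avg, std, mx, bw))}
```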