2020
DOI: 10.48550/arxiv.2003.03033
Preprint

What is the State of Neural Network Pruning?

Cited by 135 publications (216 citation statements).
References 0 publications.

“…This is due to the long time it takes to train a neural network, the large data sets, and the large number of parameters involved. Some recent studies [1,2,3] showed that many weights are excessive and can be removed without loss (or with only a small loss) of neural network performance.…”
Section: Introduction
confidence: 99%
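
The redundancy this excerpt describes is most commonly exploited via magnitude pruning: rank weights by absolute value and zero out the smallest fraction. A minimal PyTorch sketch of that idea follows; the function name, tensor shape, and sparsity level are illustrative assumptions, not taken from the cited works.

```python
import torch

def magnitude_prune_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Return a boolean mask that keeps only the largest-magnitude weights."""
    k = int(sparsity * weight.numel())  # number of weights to remove
    if k == 0:
        return torch.ones_like(weight, dtype=torch.bool)
    # Threshold at the k-th smallest absolute value; everything at or below it is pruned.
    threshold = weight.abs().flatten().kthvalue(k).values
    return weight.abs() > threshold

# Example: drop 90% of a random weight matrix by magnitude.
w = torch.randn(256, 256)
mask = magnitude_prune_mask(w, sparsity=0.9)
w_pruned = w * mask
print(f"kept {mask.float().mean().item():.1%} of weights")
```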
“…After the attack, some neurons of the poisoned model contained the adverse effect. The intuition of pruning [120] is to remove those neurons to sanitize the model. The authors in [121] applied pruning in an FL environment, successfully preventing poisoning attacks.…”
Section: B. Defending Integrity and Availability
confidence: 99%
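
As a rough illustration of pruning as a sanitization defense, the sketch below zeroes the output neurons of a layer that are least active on clean data, on the assumption that near-silent neurons may carry the backdoor. `feature_fn` and `clean_loader` are hypothetical placeholders, and this is not claimed to be the exact procedure of [120] or [121].

```python
import torch
import torch.nn as nn

@torch.no_grad()
def prune_dormant_neurons(layer: nn.Linear, feature_fn, clean_loader, frac: float = 0.2):
    """Zero the output neurons of `layer` that are least active on clean inputs.

    feature_fn(x) is assumed to return this layer's post-activation outputs
    with shape (batch, n_neurons); both it and clean_loader are hypothetical.
    """
    total, count = None, 0
    for x, _ in clean_loader:
        act = feature_fn(x)
        total = act.sum(dim=0) if total is None else total + act.sum(dim=0)
        count += act.shape[0]
    mean_act = total / count                              # per-neuron mean activation
    dormant = mean_act.argsort()[: int(frac * mean_act.numel())]
    layer.weight[dormant] = 0.0                           # silence the least-active neurons
    if layer.bias is not None:
        layer.bias[dormant] = 0.0
```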
“…Specifically, if the sparsity pattern obtained by iterative magnitude-based pruning is applied to the NN with all non-pruned weights reset to their initial values, the resulting sparse NN can be trained to achieve the same or even better performance than the original dense one. A multitude of follow-up works explore this research direction (see the recent review article [15] for a list of references). However, how to obtain a working sparsity pattern for a NN without first training the dense NN remains an open research question.…”
Section: A. Pruning and Sparsity
confidence: 99%
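
A compact sketch of the iterative magnitude pruning loop this excerpt summarizes: train, prune a fraction of the surviving weights by magnitude, rewind the survivors to their initial values, and repeat. `train_fn` is a hypothetical caller-supplied training loop, and a full implementation would also re-apply the masks during training so pruned weights stay at zero.

```python
import copy
import torch

def imp_with_rewind(model, train_fn, rounds: int = 5, prune_frac: float = 0.2):
    """Iterative magnitude pruning with weight rewinding; returns model and masks.

    train_fn(model) is a hypothetical training loop supplied by the caller;
    it must keep already-pruned weights at zero while training.
    """
    init_state = copy.deepcopy(model.state_dict())        # theta_0, for rewinding
    masks = {n: torch.ones_like(p, dtype=torch.bool)
             for n, p in model.named_parameters() if p.dim() > 1}
    for _ in range(rounds):
        train_fn(model)                                   # train to convergence
        with torch.no_grad():
            for name, p in model.named_parameters():
                if name not in masks:
                    continue
                alive = p.abs()[masks[name]]              # magnitudes of surviving weights
                k = int(prune_frac * alive.numel())
                if k > 0:
                    threshold = alive.kthvalue(k).values
                    masks[name] &= p.abs() > threshold
            # Rewind: surviving weights back to initialization, pruned ones to zero.
            model.load_state_dict(init_state)
            for name, p in model.named_parameters():
                if name in masks:
                    p.mul_(masks[name])
    return model, masks
```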