The 2011 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2011.6033493
A new sensitivity-based pruning technique for feed-forward neural networks that improves generalization

Abstract: Multi-layer neural networks of the backpropagation type (MLP networks) have become a well-established tool in various application areas. Reliable solutions, however, also require sufficient generalization capability of the trained networks and an easy interpretation of their function. These characteristics are strongly related to less sensitive networks with an optimized network structure. In this paper, we introduce a new pruning technique called SCGSIR that is inspired by the fast method of scaled conju…

Cited by 5 publications (1 citation statement) · References 14 publications
“…In this work, we propose to selectively prune each network parameter using the knowledge of sensitivity. Engelbrecht et al. [19] and Mrazova et al. [20,21] previously proposed sensitivity-based strategies for learning sparse architectures. In their work, the sensitivity is however defined as the variation of the network output with respect to a variation of the network inputs.…”
Section: Related Work
confidence: 99%
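The input-sensitivity notion described in this citation statement can be sketched in a few lines. The following is an illustrative example only, not the paper's SCGSIR algorithm: it estimates the variation of a tiny MLP's output with respect to each input via central finite differences, then ranks inputs so that the least sensitive ones become pruning candidates. The network weights and sample data here are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # input -> hidden weights (3 inputs, 4 hidden units)
W2 = rng.normal(size=(1, 4))   # hidden -> output weights

def forward(x):
    h = np.tanh(W1 @ x)        # hidden-layer activations
    return (W2 @ h)[0]         # scalar network output

def input_sensitivity(x, eps=1e-5):
    """|d(output)/d(input_i)| at point x, estimated by central differences."""
    s = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        s[i] = abs(forward(x + e) - forward(x - e)) / (2 * eps)
    return s

# Average the sensitivity over a small sample of input points.
X = rng.normal(size=(100, 3))
sens = np.mean([input_sensitivity(x) for x in X], axis=0)
prune_order = np.argsort(sens)  # least sensitive inputs come first
print(sens, prune_order)
```

Averaging over many input points (rather than a single one) matters in practice, because a single-point derivative can vanish at a saturation point of the tanh units even for an important input.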