Multi-layer neural networks of the backpropagation type (MLP networks) have become a well-established tool in various application areas. Reliable solutions, however, also require sufficient generalization capabilities of the trained networks and an easy interpretation of their function. These characteristics are strongly related to less sensitive networks with an optimized network structure. In this paper, we introduce a new pruning technique called SCGSIR that is inspired by the fast method of scaled conjugate gradients (SCG) and by sensitivity analysis. Inhibiting network sensitivity during training enables efficient optimization of the network structure. Experiments performed so far yield promising results: the new technique outperforms the reference methods both in its ability to find networks with optimum architecture and in improved generalization.
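The abstract does not specify the SCGSIR algorithm itself, so the following is only a rough, hypothetical illustration of the general idea behind sensitivity-based pruning: rank each weight of a small MLP by a saliency score |w · ∂E/∂w| and zero out the least salient ones. The network shape, data, and 25% pruning fraction are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer MLP on a tiny regression task (illustrative data).
X = rng.normal(size=(64, 4))
y = (X[:, :1] - X[:, 1:2]) * 0.5

W1 = rng.normal(scale=0.5, size=(4, 8))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden -> output weights

def forward(W1, W2):
    h = np.tanh(X @ W1)
    return h, h @ W2

# Saliency of each first-layer weight: |w * dE/dw| for squared error E.
h, out = forward(W1, W2)
d_out = 2 * (out - y) / len(X)            # dE/d(out)
d_h = (d_out @ W2.T) * (1 - h ** 2)       # backpropagate through tanh
grad_W1 = X.T @ d_h                       # dE/dW1
saliency = np.abs(W1 * grad_W1)

# Prune the 25% least salient first-layer weights (assumed fraction).
k = int(0.25 * W1.size)
idx = np.unravel_index(np.argsort(saliency, axis=None)[:k], W1.shape)
W1_pruned = W1.copy()
W1_pruned[idx] = 0.0

print(int((W1_pruned == 0).sum()))        # number of weights removed
```

In a full pruning scheme such sensitivity scores would be recomputed and the network retrained after each pruning step; this sketch shows only a single ranking-and-removal pass.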