2003
DOI: 10.1007/s00521-003-0377-9

Numeric sensitivity analysis applied to feedforward neural networks

Abstract: During the last 10 years, different interpretative methods for analysing the effect or importance of input variables on the output of a feedforward neural network have been proposed. These methods can be grouped into two sets: analysis based on the magnitude of weights, and sensitivity analysis. However, as described throughout this study, these methods present a series of limitations. We have defined and validated a new method, called Numeric Sensitivity Analysis (NSA), that overcomes these limitations, provin…

Cited by 111 publications (66 citation statements)
References 18 publications
“…Her calculation takes into account that weights can be positive or negative, and gives a better measure of the contribution of inputs to outputs. Numeric sensitivity analysis is proposed in Montaño and Palmer (2003) to interpret the effect of input variables on the output without making assumptions about the nature of the data. More recently, Paliwal and Kumar (2011) proposed a method based on the interquartile range of the distribution of the network weights obtained from training the network.…”
Section: Related Work
confidence: 99%
“…The effect indicates the significance of an input on the class output. The sensitivity is calculated by taking the partial derivative of output O_l with respect to input I_k (Montaño and Palmer, 2003; Engelbrecht and Cloete, 1998) as follows:

S_kl = ∂O_l/∂I_k = Σ_n w_nl · y_n(1 − y_n) · w_kn    (4)

where y_n is a hidden unit activation, w_nl is the weight between hidden unit n and output unit l, and w_kn is the weight between input unit k and hidden unit n.…”
Section: Sensitivity Analysis
confidence: 99%
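The cited derivative-based sensitivity can be sketched for a one-hidden-layer network with sigmoid hidden units and linear outputs, where the sigmoid derivative supplies the y_n(1 − y_n) factor. The weights and input below are illustrative values, not data from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W_in = rng.normal(size=(3, 4))   # w_kn: input unit k -> hidden unit n
W_out = rng.normal(size=(4, 2))  # w_nl: hidden unit n -> output unit l
x = rng.normal(size=3)           # one illustrative input pattern I

def forward(x):
    y = sigmoid(x @ W_in)        # hidden activations y_n
    return y, y @ W_out          # linear output units O_l

y, O = forward(x)

# Analytic sensitivity matrix:
# S[k, l] = sum_n w_kn * y_n * (1 - y_n) * w_nl
S = W_in @ np.diag(y * (1.0 - y)) @ W_out

# Cross-check against central finite differences of the network output
eps = 1e-6
S_num = np.empty_like(S)
for k in range(3):
    dx = np.zeros(3)
    dx[k] = eps
    S_num[k] = (forward(x + dx)[1] - forward(x - dx)[1]) / (2 * eps)

print(np.allclose(S, S_num, atol=1e-6))
```

The finite-difference check confirms that the weight-product formula is exactly the partial derivative ∂O_l/∂I_k for this architecture.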
“…Perceiving an ANN not only as a surrogate model to approximate a functional relationship f, but rather as a versatile tool to reason about the behavior of f, several characteristics of ANNs can be capitalized on for sensitivity analysis [4]. Principally, these are data storage (to memorize the characteristics of f), derivability (to determine partial derivatives analytically), learning aptitude (to train on a set of initial information), and efficient numerical evaluation. This enables the deduction of weighting-based (web), derivative-based (deb), training-based (trb), and variance-based (vab) sensitivity measures [3,4].…”
Section: Sensitivity Analysis With Artificial Neural Networks
confidence: 99%