[Proceedings 1992] IJCNN International Joint Conference on Neural Networks
DOI: 10.1109/ijcnn.1992.287175
Sensitivity analysis for feedforward artificial neural networks with differentiable activation functions

Abstract: A method for computing the network output sensitivities with respect to variations in the inputs for multilayer feedforward artificial neural networks with differentiable activation functions is presented. It is applied to obtain expressions for the first and second order sensitivities. An example is introduced along with a discussion to illustrate how the sensitivities are calculated and to show how they compare to the actual derivatives of the function being modeled by the neural network.
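As a rough illustration of the kind of computation the abstract describes, the sketch below evaluates first- and second-order output sensitivities with respect to the input for a one-hidden-layer tanh network and checks them against central finite differences. The architecture, weights, and evaluation point are illustrative assumptions, not the paper's example.

```python
# Minimal sketch: first- and second-order input sensitivities of a small
# feedforward net with a differentiable (tanh) activation. Weights are
# random placeholders; in the paper they would come from a trained network.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 1)), rng.normal(size=5)   # hidden layer, 5 tanh units
W2, b2 = rng.normal(size=(1, 5)), rng.normal(size=1)   # linear output layer

def forward(x):
    z = W1[:, 0] * x + b1              # hidden pre-activations
    h = np.tanh(z)                     # differentiable activation
    return float(W2[0] @ h + b2[0]), h

def sensitivities(x):
    """First- and second-order derivatives of the output w.r.t. the input."""
    _, h = forward(x)
    g1 = 1.0 - h ** 2                  # tanh'(z)
    g2 = -2.0 * h * g1                 # tanh''(z)
    dy = float(np.sum(W2[0] * g1 * W1[:, 0]))
    d2y = float(np.sum(W2[0] * g2 * W1[:, 0] ** 2))
    return dy, d2y

f = lambda x: forward(x)[0]
x0, eps = 0.3, 1e-4
dy, d2y = sensitivities(x0)
print(dy, (f(x0 + eps) - f(x0 - eps)) / (2 * eps))              # first order
print(d2y, (f(x0 + eps) - 2 * f(x0) + f(x0 - eps)) / eps ** 2)  # second order
```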

Cited by 76 publications (47 citation statements)
References 10 publications
“…(Hashem, 1992). Since these derivatives may take both positive and negative values, they may compensate and produce an average near zero.…”
Section: First-order Methods (mentioning)
confidence: 99%
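The cancellation effect this excerpt warns about is easy to reproduce numerically. In the toy illustration below (not from the citing paper), the pointwise sensitivity of sin(x) averages to nearly zero over a full period even though the input matters everywhere, which is why absolute or squared sensitivities are often averaged instead.

```python
# Toy illustration: signed input derivatives can cancel when averaged.
# The "model" here is simply sin(x), with pointwise sensitivity cos(x).
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 1000)
d = np.cos(x)                  # pointwise sensitivity dy/dx
print(np.mean(d))              # near zero: positive and negative values compensate
print(np.mean(np.abs(d)))      # about 0.64: the input clearly matters
```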
“…Importantly, computational techniques for determining first-order partial derivatives of certain ANNs have been available for some time. One such technique, outlined by Hashem (1992), involves the application of a simple backward chaining partial differentiation rule. His general rule is adapted in Eq.…”
Section: Enabling the DDMMF for ANN Models: Revealing Mechanistic Behavior (mentioning)
confidence: 99%
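A minimal sketch of how such a backward chaining partial differentiation rule can be applied in general, assuming tanh hidden layers and a linear output layer (layer sizes and weights are placeholders): the Jacobian of the outputs with respect to the inputs is accumulated from the output layer backward, multiplying at each layer by diag(f'(z)) and then by that layer's weight matrix.

```python
# Hedged sketch of backward chaining through a layered net: accumulate the
# output-to-input Jacobian layer by layer, from the output back to the input.
import numpy as np

rng = np.random.default_rng(1)
sizes = [3, 8, 8, 2]                        # input -> two hidden -> output
Ws = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
bs = [rng.normal(size=m) for m in sizes[1:]]

def forward_with_jacobian(x):
    """Output and Jacobian d(output)/d(input); tanh hidden layers, linear output."""
    a, zs = x, []
    for i, (W, b) in enumerate(zip(Ws, bs)):
        zs.append(W @ a + b)
        a = np.tanh(zs[-1]) if i < len(Ws) - 1 else zs[-1]
    J = np.eye(sizes[-1])                   # start: d(output)/d(output)
    for l in reversed(range(len(Ws))):      # chain backward, layer by layer
        if l < len(Ws) - 1:
            J = J * (1.0 - np.tanh(zs[l]) ** 2)   # right-multiply by diag(tanh'(z_l))
        J = J @ Ws[l]                       # then by the layer's weight matrix
    return a, J

x0 = rng.normal(size=3)
y0, J = forward_with_jacobian(x0)
print(J.shape)                              # (2, 3): each output w.r.t. each input
```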
“…The network input and output are x and the learned version of ϕ(x) defined in (27), respectively. There is also one extra output corresponding to the sensitivity of the cost function, i.e., ∂ϕ/∂x, [20].…”
Section: A. Learning and Optimization (mentioning)
confidence: 99%
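A hedged sketch of the architecture this excerpt describes: a single network mapping x to two outputs, the learned ϕ(x) and an extra output for its sensitivity ∂ϕ/∂x, each fit against its own target. The toy target ϕ(x) = x², the layer sizes, and the omitted training loop are assumptions for illustration, not the setup of the citing paper.

```python
# Sketch: one network, two output heads (phi and dphi/dx), each with its own
# supervised target. phi(x) = x**2 is a toy stand-in for the paper's phi.
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(16, 1)), np.zeros(16)
W2, b2 = rng.normal(size=(2, 16)) * 0.1, np.zeros(2)   # 2 heads: phi, dphi/dx

def net(x):                        # x: (n,) -> outputs: (n, 2)
    h = np.tanh(x[:, None] * W1[:, 0] + b1)
    return h @ W2.T + b2

def loss(x):
    out = net(x)
    phi, dphi = x ** 2, 2.0 * x    # targets: toy phi and its true derivative
    return np.mean((out[:, 0] - phi) ** 2) + np.mean((out[:, 1] - dphi) ** 2)

x = np.linspace(-1, 1, 64)
print(loss(x))                     # training (gradient descent on both terms) omitted
```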
“…The first order output derivative of the MMF networks, i.e., ∂a_1^(2)/∂a_1^(0), can be calculated by applying a backward chaining partial differentiation rule that is described in detail in [20]:…”
Section: A. Learning and Optimization (mentioning)
confidence: 99%
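Specializing the same backward chaining idea to the two-layer case in this excerpt's notation, with a^(0) as input and a^(2) as output, and a finite-difference check. The weights are random placeholders, not the MMF networks of [20].

```python
# Two-layer case: da^(2)_1/da^(0)_1 = W2 diag(tanh'(z1)) W1, checked numerically.
import numpy as np

rng = np.random.default_rng(3)
W1, b1 = rng.normal(size=(6, 1)), rng.normal(size=6)
W2, b2 = rng.normal(size=(1, 6)), rng.normal(size=1)

def a2(a0):                                   # forward: a^(0) -> a^(1) -> a^(2)
    return float(W2 @ np.tanh(W1[:, 0] * a0 + b1) + b2)

def da2_da0(a0):                              # backward chaining, first order
    g = 1.0 - np.tanh(W1[:, 0] * a0 + b1) ** 2
    return float(W2[0] @ (g * W1[:, 0]))

a0, eps = 0.5, 1e-5
print(da2_da0(a0))
print((a2(a0 + eps) - a2(a0 - eps)) / (2 * eps))   # central difference agrees
```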