2018
DOI: 10.1007/978-3-319-98131-4_5
Structuring Neural Networks for More Explainable Predictions

Cited by 11 publications (8 citation statements)
References 22 publications
“…The open problem of explainability is reflected in much recent work (Kindermans et al, 2017; Selvaraju et al, 2017; Bach et al, 2015; Zhang et al, 2018a; Zhou et al, 2016; Ancona et al, 2018; Ribeiro et al, 2016; Rieger et al, 2018; Kim et al, 2018; Lundberg & Lee, 2017; Zintgraf et al, 2017; Simonyan et al, 2013; Zeiler & Fergus, 2014; Smilkov et al, 2017; Sundararajan et al, 2017; Shrikumar et al, 2017; Montavon et al, 2017; Chang et al, 2018). We focus on generating visual explanations for single samples.…”
Section: Explanation Methods (mentioning)
confidence: 99%
“…Fine differences can, however, be observed between the methods: for example, LRP performs better on VGG-16 than on ResNet-50. This can be explained by VGG-16 having a more explicit structure (standard pooling operations for VGG-16 versus strided convolution for ResNet-50), which better supports the process of relevance propagation (see also [149] for a discussion of the effect of structure on the performance of explanation methods).…”
Section: End For Return PFcurve (mentioning)
confidence: 99%
“…Clearly, the first linear term R_g is zero for the nearest root point; thus, no relevance is redistributed to the gate. However, the saturation effect of the hyperbolic tangent can create a mismatch between the first-order term and the function value to redistribute. If the hyperbolic tangent in the LSTM is replaced by the identity or the ReLU nonlinearity (as was done, for example, in [59]), we obtain an exact decomposition of the relevance model with (R_g, R_s) = (0, R_p), since the Taylor remainder is exactly zero in this case. This corresponds to the LRP-all redistribution rule.…”
Section: Gated Interactions (mentioning)
confidence: 99%
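The (R_g, R_s) = (0, R_p) redistribution described in the excerpt above (the LRP-all rule) can be sketched in a few lines. This is an illustrative example, not code from the cited works, and the function name is hypothetical:

```python
import numpy as np

def lrp_all(gate, source, relevance_product):
    """LRP-all rule for a gated interaction p = gate * source.

    When the source nonlinearity is the identity or ReLU, the Taylor
    remainder at the nearest root point vanishes, so the full relevance
    of the product is assigned to the source and none to the gate:
    (R_g, R_s) = (0, R_p).
    """
    R_g = np.zeros_like(gate)          # gate receives no relevance
    R_s = relevance_product.copy()     # source receives all of it
    return R_g, R_s
```

In an LSTM this would be applied at each multiplicative gate, with `gate` being the sigmoid-activated gating signal and `source` the signal branch; note that total relevance is conserved since R_g + R_s = R_p.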
“…A more complete set of propagation rules that have been used in practice [2, 4, 14, 57, 59, 77], and that we consider in our experiments, is given in Table 1. In addition to the definitions in Table 1, to avoid division by near-zero values, one may add a stabilizing term to the denominator of the LRP-prop and LRP-abs variants, similarly to the epsilon-rule stabilization for linear mappings.…”
Section: Gated Interactions (mentioning)
confidence: 99%
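The epsilon-style stabilization of the denominator mentioned above can be illustrated for a linear mapping. This is a sketch under the usual LRP conventions; the function and variable names are our own:

```python
import numpy as np

def lrp_epsilon_linear(w, x, relevance_out, eps=1e-9):
    """Epsilon-stabilized LRP for a linear mapping z_j = sum_i x_i w_ij.

    Adds eps * sign(z_j) to each denominator so that relevance
    redistribution R_i = sum_j (z_ij / (z_j + eps*sign(z_j))) * R_j
    never divides by a value close to zero.
    """
    z = x[:, None] * w                              # contributions z_ij
    zj = z.sum(axis=0)                              # pre-activations z_j
    denom = zj + eps * np.where(zj >= 0, 1.0, -1.0) # stabilized denominators
    return (z / denom) @ relevance_out              # input relevances R_i
```

For small eps the rule is (approximately) conservative: the input relevances sum to the output relevance, with the stabilizer absorbing a negligible fraction.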