2021
DOI: 10.1007/978-981-16-0167-5_17

XAI—An Approach for Understanding Decisions Made by Neural Network

Cited by 3 publications (4 citation statements)
References 11 publications
“…For a broader review of XAI methods, we refer the reader to [7], [23], [24], [25], [26]. Apart from the intrinsic explainable models, such as linear regression and the family of decision trees [27], the main variable importance XAI methods are: 1) Input permutation [12], [17]. The technique involves shuffling the values of one input feature and observing the effect on the model's prediction.…”
Section: State of the Art (mentioning)
Confidence: 99%
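The input permutation technique quoted above can be sketched in a few lines. This is a minimal, illustrative implementation (not code from the cited paper): the model, data, and `r2` metric below are toy assumptions chosen so the effect of shuffling is easy to see.

```python
import numpy as np

def permutation_importance(model, X, y, metric, n_repeats=5, seed=0):
    """Shuffle one feature column at a time and measure the drop in the
    model's score. Model-agnostic: only predictions are needed."""
    rng = np.random.default_rng(seed)
    baseline = metric(y, model(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        scores = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])  # destroy feature j's association with y
            scores.append(metric(y, model(Xp)))
        # importance = mean score degradation relative to the unshuffled baseline
        importances[j] = baseline - np.mean(scores)
    return importances

# Toy check: the target depends only on feature 0, so only its shuffling
# should degrade the score.
X = np.random.default_rng(1).normal(size=(500, 3))
y = 2.0 * X[:, 0]
model = lambda X: 2.0 * X[:, 0]  # a "perfect" model that uses feature 0 only
r2 = lambda y, p: 1 - np.sum((y - p) ** 2) / np.sum((y - np.mean(y)) ** 2)
imp = permutation_importance(model, X, y, r2)
```

Features the model ignores get an importance of exactly zero, since shuffling them leaves the predictions unchanged.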
“…Some techniques are already available to estimate the variable importance of NN models. The most commonly used are input permutation [12], which consists of perturbing the input data and observing the effect on the model's output, and SHAP (SHapley Additive exPlanations) [13], which assigns an importance value to each feature by averaging over all possible coalitions of features. These techniques have notable advantages: they do not depend on the topology of the ML model being analyzed (model-agnostic methods) and provide quantifiable information on variable importance [14], [15].…”
Section: Introduction (mentioning)
Confidence: 99%
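The coalition averaging underlying SHAP can be computed exactly when the number of features is small. A minimal sketch, assuming a simple baseline-replacement value function (features outside the coalition are set to reference values) — the toy linear model below is an illustration, not the cited paper's setup:

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact SHAP-style attributions: average each feature's marginal
    contribution over all coalitions (tractable only for few features)."""
    n = len(x)

    def v(S):
        # Value of coalition S: features in S take the instance's values,
        # all others are replaced by the baseline.
        z = [x[i] if i in S else baseline[i] for i in range(n)]
        return predict(z)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Standard Shapley weight for a coalition of size k
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += w * (v(set(S) | {i}) - v(set(S)))
    return phi

# Toy linear model: attributions recover each term's contribution exactly.
predict = lambda z: 3 * z[0] + 2 * z[1]
phi = shapley_values(predict, x=[1.0, 1.0], baseline=[0.0, 0.0])
```

By construction the attributions sum to the difference between the prediction at the instance and at the baseline, which is the "additive" property the method's name refers to. Exhaustive enumeration costs O(2^n); practical SHAP implementations approximate this average.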