2019
DOI: 10.1109/tvcg.2018.2864475

iForest: Interpreting Random Forests via Visual Analytics

Abstract: As an ensemble model that consists of many independent decision trees, random forests generate predictions by feeding the input to the internal trees and summarizing their outputs. The ensemble nature of the model helps random forests outperform any individual decision tree. However, it also leads to poor model interpretability, which significantly hinders the model from being used in fields that require transparent and explainable predictions, such as medical diagnosis and financial fraud detection. The interpr…
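
To make the aggregation described in the abstract concrete, below is a minimal sketch (my own illustration, not code from the paper) of how a random forest combines the outputs of its internal trees. It assumes scikit-learn and its bundled breast-cancer dataset purely for demonstration.

```python
# Minimal sketch, assuming scikit-learn (not part of the paper itself):
# a random forest classifier's prediction is the average of the class
# probabilities produced by its individual decision trees.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

sample = X[:1]                      # one input, fed to every internal tree

# Ensemble output computed by the library ...
ensemble_proba = forest.predict_proba(sample)

# ... equals the mean of the per-tree outputs ("summarizing their outputs").
per_tree = np.stack([tree.predict_proba(sample) for tree in forest.estimators_])
manual_proba = per_tree.mean(axis=0)

print(np.allclose(ensemble_proba, manual_proba))   # True
```

Because the final prediction is spread across many such trees, no single tree explains the outcome, which is the interpretability gap the paper targets.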

Cited by 152 publications (99 citation statements)
References 30 publications
“…Others focus on specific elements, such as the neuron activation [34], hidden states of a cell [48] or action patterns of reinforcement learning algorithms [79] to allow model-specific diagnosis. Finally, some systems visualize the dataflow [43] and decision paths [86] taken by the model to enable a model diagnosis during the training process. While all these approaches allow for an integrated diagnosis, they fall short of addressing the identified issues in a subsequent refinement step.…”
Section: Interactive Machine Learning and Visual Analytics
Citation type: mentioning (confidence: 99%)
“…This is an open access post-print version; the final authenticated version is available online at https://link.springer.com/chapter/10.1007/978-3-030-57321-8_18 by © IFIP International Federation for Information Processing 2020. [22,36,43,44,55,68,71,72,80,88,89,92,93,95,96,97,98,101,102,103,104,105,106,107] 24 Yes…”
Section: Usage Of Scenarios For Requirements Elicitation For Explanat…
Citation type: mentioning (confidence: 99%)
“…In terms of decision tree-based models, BaobabView [61] proposes a natural visual representation of decision tree structures where decision criteria are visualized in the tree nodes. BOOSTVis [36] and iForest [70] also focus on explaining tree ensemble models through the use of multiple coordinated views to help explain and explore decision paths. Similarly, recent visual analytics work on deep learning [24,25,30,34,44,49,55,[63][64][65]68] tackles the issue of the low interpretability of neural network structures and supports revealing the internal logic of the training and prediction processes.…”
Section: Explainable Artificial Intelligence - XAI
Citation type: mentioning (confidence: 99%)
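
The citation statements above repeatedly single out decision paths as the unit that tools such as BOOSTVis and iForest visualize. As a rough sketch of what such a path looks like in practice (my own illustration using scikit-learn's decision_path API, not the iForest system itself), the following extracts and prints the path of one sample through one tree of a random forest.

```python
# Hedged sketch, assuming scikit-learn: print the decision path of a single
# sample through the first tree of a random forest ensemble.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

sample = X[:1]
tree = forest.estimators_[0]              # inspect the first tree of the ensemble
node_indicator = tree.decision_path(sample)
leaf_id = tree.apply(sample)[0]

feature = tree.tree_.feature
threshold = tree.tree_.threshold

# Walk the path and report the split decision taken at each internal node.
for node_id in node_indicator.indices:
    if node_id == leaf_id:
        print(f"leaf {node_id}: predicted class {tree.predict(sample)[0]}")
        continue
    went_left = sample[0, feature[node_id]] <= threshold[node_id]
    direction = "<=" if went_left else ">"
    print(f"node {node_id}: feature[{feature[node_id]}] = "
          f"{sample[0, feature[node_id]]:.3f} {direction} {threshold[node_id]:.3f}")
```

Visual analytics systems in this line of work aggregate and display many such paths at once; this snippet only illustrates the raw per-tree information they build on.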