2022
DOI: 10.7717/peerj-cs.880
GPUTreeShap: massively parallel exact calculation of SHAP scores for tree ensembles

Abstract: SHapley Additive exPlanation (SHAP) values (Lundberg & Lee, 2017) provide a game theoretic interpretation of the predictions of machine learning models based on Shapley values (Shapley, 1953). While exact calculation of SHAP values is computationally intractable in general, a recursive polynomial-time algorithm called TreeShap (Lundberg et al., 2020) is available for decision tree models. However, despite its polynomial time complexity, TreeShap can become a significant bottleneck in practical machine lear…
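As background for the abstract, the Shapley values that TreeShap computes in polynomial time can also be obtained by brute-force subset enumeration, which is exponential in the number of features. The sketch below is an illustration of that naive definition, not the GPUTreeShap or TreeShap algorithm; the toy model, instance, and background point are invented for the example.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, background):
    """Exact Shapley values by enumerating all feature subsets (O(2^n)).

    f          : model taking a full feature vector
    x          : instance to explain
    background : reference values substituted for 'absent' features
    """
    n = len(x)

    def v(subset):
        # Features in `subset` take their value from x; the rest from background.
        z = [x[i] if i in subset else background[i] for i in range(n)]
        return f(z)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            # Classic Shapley weight |S|! (n - |S| - 1)! / n!
            weight = factorial(size) * factorial(n - size - 1) / factorial(n)
            for S in combinations(others, size):
                phi[i] += weight * (v(set(S) | {i}) - v(set(S)))
    return phi

# Toy model with an interaction term, standing in for a tree ensemble's output.
model = lambda z: z[0] + 2.0 * z[1] * z[2]
x, b = [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]
phi = shapley_values(model, x, b)
# Local accuracy: the values sum to f(x) minus the base value f(b).
assert abs(sum(phi) - (model(x) - model(b))) < 1e-9
```

The exponential subset loop is exactly what makes exact SHAP intractable in general, and why TreeShap's polynomial-time recursion over tree paths (and its GPU parallelization in this paper) matters in practice.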

Cited by 50 publications (27 citation statements)
References 17 publications
“…Out of the plethora of extrinsic methods, two methods and their open-source packages have found regular application in engineering cases. The first is called Local Interpretable Model-Agnostic Explanations (LIME) [41] and the other SHapley Additive exPlanations (SHAP) [42][43][44][45]. These approaches were developed to provide local interpretability.…”
Section: Application Of Explainable AI In Materials Science
confidence: 99%
“…One advantage of using SHAP values is their additive nature: summing the SHAP values for the individual inputs yields the difference between the model prediction and the model base value (explained further in Section 2). For more details on the SHAP derivation and implementation, the authors direct the readers to previous literature [42][43][44][45]. There have been many applications of SHAP analysis in the materials science community.…”
Section: Application Of Explainable AI In Materials Science
confidence: 99%
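The additivity (local accuracy) property quoted above can be checked directly on a toy linear model, where the SHAP value of each feature relative to a single background point has the closed form w_i * (x_i - b_i). The weights and data below are illustrative only:

```python
# For a linear model f(z) = w·z + c explained against one background point b,
# the SHAP value of feature i is w_i * (x_i - b_i), so the values sum
# exactly to f(x) - f(b): the "additive nature" cited above.
w = [0.5, -1.0, 2.0]
c = 3.0
f = lambda z: sum(wi * zi for wi, zi in zip(w, z)) + c

x = [2.0, 1.0, 0.5]      # instance to explain
b = [0.0, 0.0, 0.0]      # background point; base value = f(b) = c
phi = [wi * (xi - bi) for wi, xi, bi in zip(w, x, b)]

# Additivity check: prediction minus base value is fully attributed.
assert abs(sum(phi) - (f(x) - f(b))) < 1e-12
```

For tree ensembles the per-feature values are no longer this simple closed form, but the same additivity identity holds for the values TreeShap computes.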
“…Although data-driven modeling (based on compositional features) has been applied to predict a variety of mechanical properties, [3][4][5][6] corrosion and oxidation behavior are often considered a challenging problem due to the inherent complexity of these physical processes. [7,8] While consistent emphasis has been placed on building high-accuracy ML models such as neural networks (NN) [9][10][11] for different applications, the interpretability of these models remains a persistent limitation. One of the challenges of using data-driven modeling in materials science is the scarcity of data, which will persist until high-throughput experimental methods become ubiquitous.…”
Section: Introduction
confidence: 99%
“…The domain of Explainable AI (XAI) and Interpretable ML (IML) [16] was conceived to resolve this challenge. Methods and tools, such as SHapley Additive exPlanations (SHAP) [17] and Local Interpretable Model-Agnostic Explanations (LIME) [18], were developed to extend predictive modeling with local-level interpretability, so that models can be verified, studied, and analyzed with the support of domain knowledge. For this work, the SHAP approach is applied as the XAI tool to provide explainability to a black-box NN model.…”
Section: Introduction
confidence: 99%
“…This makes the application and interpretation of SHAP values more intuitive. For more details on the SHAP derivation and implementation, the authors direct the reader to [17,19,20]. There have also been some applications of SHAP analysis in the materials science community [5,[21][22][23].…”
Section: Introduction
confidence: 99%