2022
DOI: 10.1016/j.chemolab.2022.104572

Explainable predictive modeling for limited spectral data

Cited by 19 publications (4 citation statements) · References 62 publications
“…Shapley Additive Explanations (SHAP) is a popular state-of-the-art approach to explaining machine learning models. SHAP quantifies the contribution of each feature to the predicted value and can be used to analyse the predictions of both classification and regression models [21]. Within the SVM model, the SHAP value of each feature indicates which features are important and which bands the best-performing algorithm uses to discriminate between non-diabetic individuals and uncontrolled type 2 diabetic patients.…”
Section: Results
confidence: 99%
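For readers who want to reproduce this kind of analysis, a minimal sketch of computing SHAP values for an SVM classifier follows, using the shap and scikit-learn packages on synthetic stand-in data; the spectra, labels, and band count are hypothetical and not taken from the cited study.

import numpy as np
import shap
from sklearn.svm import SVC

# Synthetic stand-in for spectral data: 100 spectra, 20 bands per spectrum.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
y = (X[:, 3] + X[:, 7] > 0).astype(int)  # labels driven by bands 3 and 7

model = SVC().fit(X, y)

# KernelExplainer is model-agnostic, so the same recipe applies to
# classification and regression; a small background sample keeps it fast.
background = shap.sample(X, 20, random_state=0)
explainer = shap.KernelExplainer(model.decision_function, background)
shap_values = explainer.shap_values(X[:5])  # shape (5, 20): per-band contributions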
“…To complement the performance of the first model, Shapley Additive Explanations (SHAP) were obtained. SHAP is an approach that calculates the contribution of each feature to the target value [Akulich et al. 2022]. The SHAP value obtained for each feature helps to identify which ATR-FTIR bands are most important for discriminating between non-diabetic individuals and type 2 diabetic patients.…”
Section: Interpretability Analysis
confidence: 99%
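Continuing the sketch above, one common way to turn per-feature SHAP values into such a band ranking is the mean absolute SHAP value per feature; the wavenumber axis below is invented purely for illustration and is not the cited study's actual procedure.

importance = np.abs(np.asarray(shap_values)).mean(axis=0)  # mean |SHAP| per band
wavenumbers = np.linspace(4000, 400, importance.size)      # hypothetical ATR-FTIR axis
top = np.argsort(importance)[::-1][:5]                     # five most influential bands
for i in top:
    print(f"band near {wavenumbers[i]:.0f} cm^-1: mean |SHAP| = {importance[i]:.3f}")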
“…This study distinguishes between global and local, as well as model-based and model-agnostic, explanation methods [22]. This distinction is attributed to the fact that the features contributing to a classification may not be those that represent a tissue from a biological point of view. As such, a feature may be considered important despite contributing to an incorrect classification.…”
Section: Explainability
confidence: 99%