2022
DOI: 10.3390/buildings12060734
Interpretation of Machine-Learning-Based (Black-box) Wind Pressure Predictions for Low-Rise Gable-Roofed Buildings Using Shapley Additive Explanations (SHAP)

Abstract: Conventional methods of estimating pressure coefficients of buildings are subject to time and cost constraints. Recently, machine learning (ML) has been successfully applied to predict wind pressure coefficients. However, regardless of their accuracy, ML models fail to give end users confidence because of the black-box nature of their predictions. In this study, we employed tree-based regression models (Decision Tree, XGBoost, Extra-tree, LightGBM) to predict surface-averaged mean pressure coefficients…
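As a rough, hypothetical sketch of the workflow the abstract outlines (not the authors' actual pipeline), the snippet below fits the four tree-based regressors named there and interprets one of them with SHAP. The feature matrix, target, and hyperparameters are synthetic placeholders, not the paper's wind-tunnel dataset or tuned models.

```python
# Illustrative sketch only: random data stands in for the paper's
# wind-pressure dataset; hyperparameters are arbitrary.
import numpy as np
import shap
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import ExtraTreesRegressor
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor

rng = np.random.default_rng(0)
# Hypothetical predictors (e.g., roof pitch, wind direction, aspect ratios);
# the target mimics a surface-averaged mean pressure coefficient.
X = rng.uniform(size=(500, 4))
y = -0.5 + 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.1 * rng.normal(size=500)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "DecisionTree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "ExtraTrees": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "XGBoost": XGBRegressor(n_estimators=300, learning_rate=0.05, random_state=0),
    "LightGBM": LGBMRegressor(n_estimators=300, learning_rate=0.05, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "held-out R^2:", round(model.score(X_test, y_test), 3))

# Post-hoc interpretation of one tree model with SHAP's TreeExplainer.
explainer = shap.TreeExplainer(models["XGBoost"])
shap_values = explainer.shap_values(X_test)
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```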

Cited by 29 publications (5 citation statements) | References 75 publications
“…XAI has emerged as a crucial area of research to enhance AI systems’ transparency, accountability, and trustworthiness. XAI aims to bridge the gap between the “black-box” nature of many ML models and the need for understandable explanations of their decision-making processes 39,40. This is especially relevant in vital fields such as healthcare, finance, and legal systems, where the ability to understand and justify AI-driven decisions is essential.…”
Section: Methods (mentioning)
confidence: 99%
“…Complex model structures usually improve the accuracy of ML modeling compared to simpler models. However, a model with a complex structure cannot be easily explained 40. Post-hoc explanation methods are necessary in such cases to provide human-understandable explanations for the predictions made by complex ML models 41.…”
Section: Methods and Data (mentioning)
confidence: 99%
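A minimal post-hoc explanation sketch in the spirit of the statement above, assuming a fitted tree-based regressor `model` and a test matrix `X_test` such as those in the earlier snippet: SHAP decomposes an individual prediction into a base value plus per-feature contributions.

```python
# Assumes `model` (a fitted tree-based regressor) and `X_test` exist,
# e.g. from the earlier sketch; both are illustrative placeholders.
import shap

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# Additivity check: the base (expected) value plus the per-feature SHAP
# contributions reconstructs the model's prediction for a single row.
i = 0
reconstructed = explainer.expected_value + shap_values[i].sum()
print("model prediction:          ", model.predict(X_test[i : i + 1])[0])
print("base value + contributions:", reconstructed)
```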
“…The model tries to minimize the error at each step by following the decision tree structure. XGBoost is capable of controlling its error 37. Categorical Boost Algorithm (CatBoost): CatBoost follows the same approach as XGBoost and is a very powerful regression model.…”
Section: Study Area and Dataset (mentioning)
confidence: 99%
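To illustrate the stage-wise error reduction described above, the sketch below tracks validation RMSE per boosting iteration for XGBoost and CatBoost. It reuses the synthetic `X_train`/`X_test` split from the first snippet and assumes current default metric keys in the xgboost and catboost libraries; it is not the cited study's setup.

```python
# Assumes X_train, X_test, y_train, y_test from the earlier synthetic sketch.
from xgboost import XGBRegressor
from catboost import CatBoostRegressor

xgb = XGBRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
xgb.fit(X_train, y_train, eval_set=[(X_test, y_test)], verbose=False)
xgb_rmse = xgb.evals_result()["validation_0"]["rmse"]  # per-iteration RMSE

cat = CatBoostRegressor(iterations=200, learning_rate=0.05, random_seed=0, verbose=False)
cat.fit(X_train, y_train, eval_set=(X_test, y_test))
cat_rmse = cat.get_evals_result()["validation"]["RMSE"]  # per-iteration RMSE

# Validation error shrinks as trees are added, then levels off.
print("XGBoost  RMSE: first", round(xgb_rmse[0], 4), "last", round(xgb_rmse[-1], 4))
print("CatBoost RMSE: first", round(cat_rmse[0], 4), "last", round(cat_rmse[-1], 4))
```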