2022
DOI: 10.1016/j.cemconcomp.2021.104295

Interpretable Ensemble-Machine-Learning models for predicting creep behavior of concrete

Cited by 164 publications (57 citation statements)
References: 77 publications
“…First presented by Lloyd Shapley, it uses Shapley values to interpret the model's output [16, 23, 40, 41]. The Shapley value of a feature is equal to the difference between the average prediction value of samples with and without this feature [42]. It measures the feature's importance in the model [43, 44].…”
Section: Methods (mentioning, confidence: 99%)
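As an aside to the statement above, the following is a minimal sketch of how Shapley-value explanation is typically computed for a tree-ensemble regressor with the open-source shap package. The feature names, synthetic data, and model choice are illustrative assumptions and are not taken from the cited paper.

```python
# Minimal sketch: explaining an ensemble regressor with Shapley values.
# The feature names and synthetic data below are illustrative only;
# they are not the dataset or model used in the cited paper.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
feature_names = ["w_c_ratio", "loading_age_days", "relative_humidity", "strength_MPa"]
X = rng.uniform(size=(300, len(feature_names)))
# Hypothetical target driven mostly by the first two features.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.3 * X[:, 2] + 0.1 * rng.normal(size=300)

model = GradientBoostingRegressor(random_state=0).fit(X, y)

# TreeExplainer computes Shapley values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global importance: mean absolute Shapley value per feature.
importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(feature_names, importance), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

Each Shapley value here is the feature's contribution to one prediction relative to the model's average output; averaging the absolute values over the samples gives the global importance ranking referred to in the quoted statement.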
“…Therefore, CART will search through all possible values of all variables in the dataset for the best partition pair that minimizes the cost function of Eq. (4) [23]. The steps of BaggedCART can be summarized as follows:…”
Section: Bagged CART (mentioning, confidence: 99%)
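The BaggedCART steps referred to above can be sketched as follows, assuming scikit-learn's DecisionTreeRegressor as the CART learner. The squared-error split criterion stands in for the cost function of Eq. (4), which is not reproduced here, and the data is synthetic and purely illustrative.

```python
# Minimal sketch of bagged CART: draw bootstrap resamples, grow one CART
# per resample (each split greedily minimizes the default squared-error
# cost, standing in for Eq. (4)), and average the tree predictions.
# The synthetic data is illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(400, 5))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=400)

n_trees, n_samples = 100, X.shape[0]
trees = []
for _ in range(n_trees):
    # Step 1: bootstrap resample of the training data.
    idx = rng.integers(0, n_samples, size=n_samples)
    # Step 2: fit an unpruned CART on the resample.
    trees.append(DecisionTreeRegressor().fit(X[idx], y[idx]))

# Step 3: aggregate by averaging the individual tree predictions.
def predict(X_new):
    return np.mean([t.predict(X_new) for t in trees], axis=0)

print("In-sample RMSE:", np.sqrt(np.mean((predict(X) - y) ** 2)))
```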
“…Machine learning predictions would help design such complex materials. Indeed, artificial intelligence techniques have been successfully applied to several Civil Engineering problems, such as concrete strength prediction [40,41], creep prediction [42-44], crack assessment in structures [45], durability and microstructural properties such as surface chloride concentration [46], and mechanical properties of stabilized soil [47,48]. Among the various techniques developed, ensemble machine learning algorithms applied to datasets with hundreds of data points have shown good accuracy and robustness against the overfitting risk often associated with conventional techniques and neural networks.…”
Section: Introduction (mentioning, confidence: 99%)
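To illustrate the robustness claim in the last sentence, the sketch below compares a single fully grown CART with a bagged ensemble of such trees on a few hundred synthetic samples, reporting training versus cross-validated R². The data and settings are illustrative assumptions, not drawn from the cited studies.

```python
# Minimal sketch: a single deep CART tends to fit the training data almost
# perfectly but generalizes worse than a bagged ensemble of the same trees.
# The synthetic data is illustrative and unrelated to the cited studies.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.uniform(size=(300, 6))
y = X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + 0.2 * rng.normal(size=300)

single_tree = DecisionTreeRegressor(random_state=2)                       # fully grown tree
ensemble = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200, random_state=2)

for name, model in [("single CART", single_tree), ("bagged CART", ensemble)]:
    cv_r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    train_r2 = model.fit(X, y).score(X, y)
    print(f"{name}: train R^2 = {train_r2:.3f}, 5-fold CV R^2 = {cv_r2:.3f}")
```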