2021
DOI: 10.1016/j.conbuildmat.2021.123642
An eXtreme Gradient Boosting model for predicting dynamic modulus of asphalt concrete mixtures

Cited by 32 publications (6 citation statements) · References 42 publications
“…The extreme gradient boosting (XGBoost) model is an ML algorithm that belongs to the decision-tree-based model category. It was specifically developed for high computational and prediction accuracy and efficiency by combining a wide range of gradient-boosted decision trees [53]. XGBoost adopts ensemble learning that utilizes the sequence of decision trees on which each decision tree depends and learns from the previous decision tree to develop a strong learning process that improves the performance of the developed models [53][54][55][56].…”
Section: Machine Learning Analysis and Modeling (confidence: 99%)
“…It was specifically developed for high computational and prediction accuracy and efficiency by combining a wide range of gradient-boosted decision trees [53]. XGBoost adopts ensemble learning that utilizes the sequence of decision trees on which each decision tree depends and learns from the previous decision tree to develop a strong learning process that improves the performance of the developed models [53][54][55][56]. The independent variables x_i and a dataset of n observations are typically used to develop XGBoost, where each of the independent variables has m unique features.…”
Section: Machine Learning Analysis and Modeling (confidence: 99%)
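The sequential learning idea quoted above — each new tree fitting the errors left by the trees before it — can be illustrated with a minimal gradient-boosting sketch. This is a toy numpy implementation using single-split "stump" trees on synthetic data; it is not XGBoost itself (it omits XGBoost's regularization and second-order optimization), and all data and parameter values are illustrative assumptions.

```python
import numpy as np

def fit_stump(x, r):
    """Fit the best single-split regression tree (stump) to residuals r."""
    best = (np.inf, None, None, None)  # (error, threshold, left mean, right mean)
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        lm, rm = left.mean(), right.mean()
        err = ((left - lm) ** 2).sum() + ((right - rm) ** 2).sum()
        if err < best[0]:
            best = (err, t, lm, rm)
    return best[1], best[2], best[3]

def predict_stump(stump, x):
    t, lm, rm = stump
    return np.where(x <= t, lm, rm)

# Toy data: noisy quadratic (illustrative stand-in for any regression target)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = x ** 2 + rng.normal(0, 0.05, 200)

# Sequential boosting: each stump learns from the residuals of the
# ensemble built so far, which is the dependence the quote describes.
pred = np.zeros_like(y)
lr = 0.3  # shrinkage ("learning rate")
for _ in range(100):
    stump = fit_stump(x, y - pred)
    pred += lr * predict_stump(stump, x)

mse = ((y - pred) ** 2).mean()
```

After 100 rounds the ensemble's training MSE falls far below the variance of the raw target, even though no single stump can represent the quadratic shape — the strength comes from the sequence, not from any individual tree.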
“…Some ML-based models used numerous input parameters; for example, Ali [20] used extreme gradient-boosting regression (EGBR) with a dataset of 1152 E* measurements to predict E* as a function of 24 variables representing testing conditions, mix volumetric properties, and gradation parameters, with an R²-value of 0.835.…”
Section: Literature Review (confidence: 99%)
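The R² metric quoted above (0.835) measures the fraction of variance in the measured E* values that a model's predictions explain. The sketch below shows how that metric is computed, on a synthetic stand-in dataset sized like the one in the quote (1152 observations, 24 predictors); the data, the linear baseline fit, and all coefficients are assumptions for illustration, not the cited EGBR model.

```python
import numpy as np

# Hypothetical synthetic stand-in for an E* dataset: 1152 measurements,
# 24 predictors (test conditions, volumetrics, gradation) as in the quote.
rng = np.random.default_rng(42)
n, p = 1152, 24
X = rng.normal(size=(n, p))
true_w = rng.normal(size=p)
log_E = X @ true_w + rng.normal(0, 2.0, n)  # noisy log|E*|-like response

# Simple least-squares baseline, just to produce predictions to score
w, *_ = np.linalg.lstsq(X, log_E, rcond=None)
pred = X @ w

# Coefficient of determination: 1 - (residual SS / total SS)
ss_res = ((log_E - pred) ** 2).sum()
ss_tot = ((log_E - log_E.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot
```

An R² near 1 means predictions track measurements closely; the 0.835 reported for EGBR says about 84% of the variance in measured E* was explained by the 24 input variables.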
“…To characterize and simulate the LVE behavior of HMA in the temperature and frequency domains, researchers have developed various rheological models, e.g., Huet-Sayegh model, 2S2P1D model [6,7,20], Generalized Maxwell (GM) model [21], Generalized Kelvin (GK) model, Christensen-Anderson (CA) model, Sigmoidal model [22,23], Modified Christensen-Anderson-Marasteanu (CAM) model, Havriliak-Negami (HN) model [24], and machine learning models [25][26][27]. These models can be classified into three categories: physical models, mathematical shape functions, and machine learning models.…”
Section: Introduction (confidence: 99%)
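Of the mathematical shape functions listed in the quote, the sigmoidal model is a common form for the dynamic-modulus master curve: log₁₀|E*| is a sigmoid of the log reduced frequency, bounded between a lower asymptote δ and an upper asymptote δ + α. A minimal sketch of that form follows; the coefficient values are illustrative placeholders, not fitted to any real mixture.

```python
import numpy as np

def log_dynamic_modulus(log_fr, delta=0.5, alpha=3.5, beta=-1.0, gamma=-0.6):
    """Sigmoidal master-curve form: log10|E*| as a function of log10
    reduced frequency. delta = lower asymptote, delta + alpha = upper
    asymptote, beta/gamma control the location and steepness of the
    transition. All values here are hypothetical placeholders."""
    return delta + alpha / (1.0 + np.exp(beta + gamma * log_fr))

# Evaluate across a wide reduced-frequency range (log10 scale)
log_fr = np.linspace(-6, 6, 121)
log_E = log_dynamic_modulus(log_fr)
```

With γ < 0 the modulus rises monotonically with frequency, matching the physical behavior of asphalt concrete: stiff at high frequency (low temperature), soft at low frequency (high temperature), with the asymptotes δ and δ + α bounding the curve.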