2020
DOI: 10.1007/s10853-020-05091-7
Predictions and mechanism analyses of the fatigue strength of steel based on machine learning

Abstract: Fatigue strength is not completely understood at this time due to its complex formation mechanism. Therefore, to address this issue, machine learning has been used to examine the important factors involved in predicting fatigue strength. In this study, a hybrid model was proposed based on a modified bagging method combining XGBoost and LightGBM, in which the hyperparameters of the models were optimized by a grey wolf algorithm. Moreover, an interpretable method, referred to as Shapley additive…
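The abstract's hybrid strategy (bagging over heterogeneous boosted learners) can be sketched generically. This is a minimal illustration only, not the authors' implementation: simple least-squares learners stand in for XGBoost and LightGBM, and all function names here are hypothetical.

```python
import numpy as np

def lstsq_learner(X, y):
    """Stand-in base learner: ordinary least squares with an intercept."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Xq: np.column_stack([np.ones(len(Xq)), Xq]) @ coef

def bagged_ensemble(fit_fns, X, y, n_bags=10, seed=0):
    """Train each base learner on bootstrap resamples and average all predictions."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_bags):
        idx = rng.integers(0, len(y), len(y))  # bootstrap sample with replacement
        models.extend(fit(X[idx], y[idx]) for fit in fit_fns)
    return lambda Xq: np.mean([m(Xq) for m in models], axis=0)
```

In the paper's setting, `fit_fns` would hold the two gradient-boosting learners, and the hyperparameters of each would be tuned (by the grey wolf algorithm) before bagging.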



Cited by 60 publications (27 citation statements)
References 36 publications
“…Tree‐based machine learning models are extremely popular nonlinear models for the prediction and attribution analysis of ecosystem dynamics (Green et al, 2022; Wang et al, 2015; Yuan et al, 2019); they are usually more accurate than neural networks in many applications and outperform standard deep‐learning models on tabular‐style data sets (Lundberg et al, 2020). Extreme gradient boosting (XGB) is an ensemble learning algorithm based on an iterative decision tree model with many decision trees, typically used in classification and regression (Chen et al, 2016; Yan et al, 2020). This method can perform multithreaded calculations and uses shrinkage to scale newly added weights at each boosting step, avoiding overfitting (Meng et al, 2021).…”
Section: Methods
confidence: 99%
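The shrinkage mechanism the statement describes — scaling each newly added tree's contribution by a learning rate before it enters the ensemble — can be shown in a minimal gradient-boosting sketch. This is an illustrative toy (decision stumps, squared loss, pure NumPy), not XGBoost itself; `eta` plays the role of XGBoost's `learning_rate`.

```python
import numpy as np

def fit_stump(X, r):
    """Best single-split regression stump on residuals r (squared error)."""
    best_sse, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:
            left = X[:, j] <= t
            lv, rv = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
            if sse < best_sse:
                best_sse, best = sse, (j, t, lv, rv)
    return best

def boost(X, y, n_rounds=100, eta=0.1):
    """Gradient boosting for squared loss; eta is the shrinkage factor."""
    base = y.mean()
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        j, t, lv, rv = fit_stump(X, y - pred)  # fit to current residuals
        # Shrinkage: scale the newly added tree's leaf weights by eta
        stumps.append((j, t, eta * lv, eta * rv))
        pred = pred + np.where(X[:, j] <= t, eta * lv, eta * rv)
    return base, stumps

def predict(base, stumps, X):
    pred = np.full(len(X), base)
    for j, t, lv, rv in stumps:
        pred += np.where(X[:, j] <= t, lv, rv)
    return pred
```

With a small `eta`, each round corrects only a fraction of the remaining residual, so more rounds are needed but the fit generalizes better — the anti-overfitting effect the citing paper refers to.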
“…The SHAP method is based on game theory and local explanations, satisfying the following properties: local accuracy, missingness, and consistency [66]–[68]. This method calculates the Shapley value for variable i to estimate its contribution to the model output (v(N)) using the formula [69]:…”
Section: Model Interpretation
confidence: 99%
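The truncated formula the statement refers to is, in the standard cooperative-game form, φ_i(v) = Σ_{S ⊆ N∖{i}} |S|!(n−|S|−1)!/n! · [v(S∪{i}) − v(S)]. A minimal exact computation of that formula (a sketch for small n; SHAP libraries use far faster approximations) looks like this:

```python
from itertools import combinations
from math import factorial

def shapley_values(n, v):
    """Exact Shapley values for an n-player cooperative game.

    v maps a frozenset of player indices to the coalition's payoff v(S).
    """
    phi = [0.0] * n
    for i in range(n):
        others = [p for p in range(n) if p != i]
        for size in range(n):
            for S in combinations(others, size):
                S = frozenset(S)
                # Weight |S|! (n - |S| - 1)! / n! from the Shapley formula
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[i] += w * (v(S | {i}) - v(S))
    return phi
```

By the local-accuracy (efficiency) property mentioned above, the values sum to v(N) minus v(∅), i.e. each prediction decomposes exactly into per-feature contributions.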
“…Therefore, rationalizing part design, selecting materials, and formulating cold- and heat-resistant materials were deemed possible using the technique [38]. Micromechanical models accurately represented material behavior and its relationship to thermal history from the early stages of computational materials science [39]. However, when out-of-range data were used to train ML models, projected grain sizes varied significantly from reference values.…”
Section: Comparative Aspects of the Current Study
confidence: 99%
“…According to the results, the SHAP approach showed much potential for evaluating fatigue strength indicators. Therefore, rationalizing part design, selecting materials, and formulating cold- and heat-resistant materials were deemed possible using the technique [38]. Micromechanical models accurately represented material behavior and its relationship to thermal history from the early stages of computational materials science [39].…”
Section: Introduction
confidence: 99%