2023
DOI: 10.1016/j.engstruct.2022.115310

A comparative study on the most effective machine learning model for blast loading prediction: From GBDT to Transformer

Cited by 25 publications (2 citation statements)
References 22 publications
“…LightGBM [29]: LightGBM is, after XGBoost [31], the second mainstream Boosting algorithm implemented under the GBDT [30] framework. LightGBM adopts a histogram-based decision tree algorithm and uses Gradient-based One-Side Sampling (GOSS) together with Exclusive Feature Bundling (EFB) to improve training speed, while supporting parallel training over categorical features and data, providing better parallel performance.…”
Section: B. Comparison Methods
confidence: 99%
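The GOSS technique named in the statement above can be illustrated with a short sketch. This is a simplified illustration, not LightGBM's actual implementation; the function name and parameters are hypothetical:

```python
import random

def goss_sample(gradients, top_rate=0.2, other_rate=0.1, seed=0):
    """Sketch of Gradient-based One-Side Sampling (GOSS).

    Keep every instance with a large |gradient|, randomly sample a fraction
    of the small-gradient instances, and up-weight the sampled ones so the
    estimated information gain stays approximately unbiased.
    """
    n = len(gradients)
    # Rank instances by absolute gradient, largest first.
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    top_n = int(top_rate * n)
    rest_n = int(other_rate * n)
    top_idx = order[:top_n]
    # Uniformly sample from the remaining small-gradient instances.
    sampled_rest = random.Random(seed).sample(order[top_n:], rest_n)
    # Compensation factor for the discarded small-gradient mass.
    weight = (1.0 - top_rate) / other_rate
    weights = {i: 1.0 for i in top_idx}
    weights.update({i: weight for i in sampled_rest})
    return weights  # instance index -> weight used when building this tree

grads = [0.9, -0.05, 0.02, 1.2, -0.7, 0.01, 0.03, -0.04, 0.6, 0.02]
w = goss_sample(grads, top_rate=0.2, other_rate=0.2)
print(w)  # 2 large-gradient instances kept, 2 small ones sampled at weight 4.0
```

Only the returned subset of instances is used to grow the next tree, which is where the speed-up over plain GBDT comes from.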
“…The transformer model's deep learning architecture uses attention mechanisms to discover connections between input features. It comprises multiple feed-forward and self-attention neural network layers, allowing the model to recognize intricate correlations between variables and produce precise predictions [59]. FTT has several advantages over more conventional methods.…”
Section: Classifiers
confidence: 99%
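The self-attention layer the statement describes can be sketched minimally as single-head scaled dot-product attention. This is a pure-Python illustration with hypothetical names; a real Transformer stacks many such heads with feed-forward layers and normalization:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over the rows of X."""
    Q, K, V = matmul(X, Wq), matmul(X, Wk), matmul(X, Wv)
    d = len(K[0])  # key dimension, used for the 1/sqrt(d) scaling
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled then normalized.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        alphas = softmax(scores)
        # Output is the attention-weighted mixture of the value vectors.
        out.append([sum(a * v[j] for a, v in zip(alphas, V))
                    for j in range(len(V[0]))])
    return out

# Two 2-d input "tokens" with identity projections: each token attends
# mostly to itself while mixing in information from the other token.
I2 = [[1.0, 0.0], [0.0, 1.0]]
X = [[1.0, 0.0], [0.0, 1.0]]
print(self_attention(X, I2, I2, I2))
```

Because the attention weights are a softmax, each output row is a convex combination of the value vectors, which is how the layer lets every feature condition on every other feature.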