2023
DOI: 10.3390/rs15153764

Lithological Classification by Hyperspectral Images Based on a Two-Layer XGBoost Model, Combined with a Greedy Algorithm

Abstract: Lithology classification is important in mineral resource exploration, engineering geological exploration, and disaster monitoring. Traditional laboratory methods for the qualitative analysis of rocks are limited by sampling conditions and analytical techniques, resulting in high costs, low efficiency, and the inability to quickly obtain large-scale geological information. Hyperspectral remote sensing technology can classify and identify lithology using the spectral characteristics of rock, and is characterize…
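
The title pairs greedy selection with XGBoost. As a rough illustration of that combination only, and not the paper's two-layer architecture, the sketch below greedily adds the spectral band that most improves an XGBoost classifier's cross-validated accuracy on synthetic data; the data shapes, class count, band budget, and hyperparameters are all assumptions.

import numpy as np
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))                       # 500 pixels x 50 synthetic spectral bands
signal = X[:, 3] + 0.8 * X[:, 17] - 0.5 * X[:, 29]   # a few informative bands
y = np.digitize(signal, np.quantile(signal, [0.25, 0.5, 0.75]))  # 4 pseudo-lithology classes

def greedy_band_selection(X, y, max_bands=5):
    """Greedily add the band that most improves cross-validated accuracy."""
    selected = []
    remaining = list(range(X.shape[1]))
    best_score = -np.inf
    while remaining and len(selected) < max_bands:
        candidates = []
        for band in remaining:
            model = XGBClassifier(n_estimators=30, max_depth=3, learning_rate=0.1)
            score = cross_val_score(model, X[:, selected + [band]], y, cv=3).mean()
            candidates.append((score, band))
        score, band = max(candidates)
        if score <= best_score:        # stop when no remaining band improves accuracy
            break
        best_score = score
        selected.append(band)
        remaining.remove(band)
    return selected, best_score

bands, acc = greedy_band_selection(X, y)
print(f"selected bands: {bands}, CV accuracy: {acc:.3f}")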

Cited by 15 publications (7 citation statements)
References 64 publications
“…PLSR, a traditional statistical method [37], and XGBoost, a powerful machine learning method [38], were both employed in this study as regression analysis techniques that are widely used in the field. A comparative analysis of the PLSR and XGBoost prediction models revealed that the XGBoost model achieved higher prediction accuracy and better robustness, which is consistent with the research findings of Lin et al. [39].…”
Section: Discussion (supporting)
confidence: 88%
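
As a rough illustration of this kind of comparison (not the cited study's data or settings), the sketch below cross-validates a PLSR model and an XGBoost regressor on synthetic spectra; the component count, tree settings, and data are assumptions. On a nonlinear target the tree ensemble would be expected to score higher, mirroring the pattern described above, though results depend on the data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 40))                                       # 300 samples x 40 spectral features
y = X[:, 5] ** 2 + 0.5 * X[:, 12] + rng.normal(scale=0.1, size=300)  # nonlinear target

models = {
    "PLSR": PLSRegression(n_components=8),
    "XGBoost": XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.05),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {r2.mean():.3f} +/- {r2.std():.3f}")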
“…Newton's method is used to find the extremum of the loss function, which is expanded to second order using the Taylor formula. The loss function is optimized with the first-order and second-order gradient functions to reduce model complexity [56]. Simultaneously, the probability of over-fitting is reduced through regularization, significantly improving the model's generalization ability.…”
Section: Extreme Gradient Boosting (XGBoost) (mentioning)
confidence: 99%
“…Newton's method is used to find the extremum of the loss function, which is expanded to second order using the Taylor formula. The loss function is optimized with the first-order and second-order gradient functions to reduce model complexity [53]. Simultaneously, the probability of over-fitting is reduced through regularization, significantly improving the model's generalization ability.…”
Section: Extreme Gradient Boosting (XGBoost) (mentioning)
confidence: 99%
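
For context, the second-order expansion these statements refer to is the standard XGBoost objective (Chen and Guestrin, 2016). At boosting round t, with g_i and h_i the first and second derivatives of the loss with respect to the previous prediction, the objective and the resulting Newton-style optimal leaf weight are, in LaTeX notation:

\mathcal{L}^{(t)} = \sum_{i=1}^{n} l\left(y_i,\ \hat{y}_i^{(t-1)} + f_t(x_i)\right) + \Omega(f_t)
\;\simeq\; \sum_{i=1}^{n} \left[ g_i f_t(x_i) + \tfrac{1}{2} h_i f_t^{2}(x_i) \right] + \gamma T + \tfrac{1}{2}\lambda \sum_{j=1}^{T} w_j^{2}

g_i = \partial_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}\big), \qquad
h_i = \partial^{2}_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}\big)

w_j^{*} = -\frac{\sum_{i \in I_j} g_i}{\sum_{i \in I_j} h_i + \lambda}

Here T is the number of leaves, w_j the weight of leaf j, I_j the set of samples falling in leaf j, and gamma and lambda the regularization terms that penalize model complexity; this regularization is what the statements above credit with reducing over-fitting.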