2023
DOI: 10.1016/j.jhydrol.2023.129599
Critical role of climate factors for groundwater potential mapping in arid regions: Insights from random forest, XGBoost, and LightGBM algorithms

Cited by 52 publications (13 citation statements) | References 99 publications
“…The spatiotemporal variations in soil salinization were analyzed. These methods all belong to ensemble learning, a technique that improves the generalization ability and robustness of a single learner by combining the prediction results of multiple base learners [38]. These methods have been used in numerous studies to estimate soil salinization, with results indicating that they achieve high accuracy and strong robustness [39][40][41][42].…”
Section: Methods (citation type: mentioning)
confidence: 99%
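Illustrative only, not code from the cited works: a minimal scikit-learn sketch of the ensemble idea described in the statement above, where the predictions of several base learners are combined (here by averaging through VotingRegressor). The synthetic data and hyperparameters are placeholders.

# Illustrative sketch: combining several base learners into one ensemble.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor, VotingRegressor
from sklearn.model_selection import train_test_split

# Synthetic placeholder data standing in for the real predictor set.
X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Two base learners; the ensemble averages their predictions.
base_learners = [
    ("rf", RandomForestRegressor(n_estimators=200, random_state=42)),
    ("gbr", GradientBoostingRegressor(n_estimators=200, random_state=42)),
]
ensemble = VotingRegressor(estimators=base_learners)
ensemble.fit(X_train, y_train)
print("R^2 on held-out data:", ensemble.score(X_test, y_test))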
“…LightGBM is a widely used boosting method that addresses the drawbacks of gradient boosting regression (GBR) when handling large datasets. It incorporates gradient-based one-side sampling (GOSS) and exclusive feature bundling (EFB) to speed up training without compromising model accuracy and performance, as outlined in [39]. The GOSS algorithm excludes a large portion of data instances with small gradients, focusing on those that contribute most to the information gain.…”
Section: Light Gradient Boosting Regression (LightGBM) (citation type: mentioning)
confidence: 99%
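Illustrative only, not taken from the cited study: a minimal training sketch assuming the lightgbm Python package. The data and hyperparameters are placeholders, and the GOSS/EFB behaviour referred to above is handled internally by the library.

# Illustrative sketch: fitting a LightGBM regressor on placeholder data.
import lightgbm as lgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1000, n_features=20, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = lgb.LGBMRegressor(
    n_estimators=300,
    learning_rate=0.05,
    num_leaves=31,
    # Depending on the LightGBM version, GOSS can be requested explicitly,
    # e.g. boosting_type="goss" (older releases) or data_sample_strategy="goss".
)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))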
“…Then, for the remaining set A^c, which contains the (1 − a) × 100% of instances with smaller gradients, a new subset B is formed by random sampling with a size of b × |A^c|. Finally, the variance gain Ṽ_j(d) is used to split the data instances, which can be expressed as follows [39,40]:…”
Section: Light Gradient Boosting Regression (LightGBM) (citation type: mentioning)
confidence: 99%
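The cited equation itself is truncated in the excerpt above and is not reproduced here. Illustrative only: a small NumPy sketch of the GOSS sampling step the excerpt describes (keep the top a-fraction of instances by gradient magnitude, randomly sample a b-fraction of the remainder, and up-weight the sampled small-gradient instances when gains are estimated). The (1 − a)/b amplification constant follows the original LightGBM paper; all variable names are placeholders.

# Illustrative sketch of gradient-based one-side sampling (GOSS).
import numpy as np

rng = np.random.default_rng(0)
g = rng.normal(size=1000)          # per-instance gradients (placeholder values)
a, b = 0.2, 0.1                    # top rate and small-gradient sampling rate

order = np.argsort(-np.abs(g))     # sort instances by |gradient|, descending
n_top = int(a * len(g))
top_idx = order[:n_top]            # set A: large-gradient instances, all kept
rest_idx = order[n_top:]           # set A^c: small-gradient instances

n_sample = int(b * len(rest_idx))  # subset B, sampled from A^c
sampled_idx = rng.choice(rest_idx, size=n_sample, replace=False)

# Weights used when estimating split gains: sampled small-gradient
# instances are amplified by (1 - a) / b to compensate for the sampling.
weights = np.zeros(len(g))
weights[top_idx] = 1.0
weights[sampled_idx] = (1.0 - a) / b

used = np.concatenate([top_idx, sampled_idx])
print(f"kept {used.size} of {g.size} instances for gain estimation")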
“…It can handle both classification and regression problems. It is very popular among researchers since it handles highly non-linear datasets and demonstrates good performance in most cases [85,86]. In fact, it is an ensemble ML algorithm that makes predictions by combining the results of multiple weak learners, such as decision trees [87].…”
Section: XGBoost (citation type: mentioning)
confidence: 99%
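Illustrative only, not from the cited study: a minimal sketch assuming the xgboost Python package, fitting a gradient-boosted ensemble of decision trees on placeholder data to mirror the description above.

# Illustrative sketch: XGBoost as an ensemble of boosted decision trees.
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=15, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Each boosting round adds a decision tree (weak learner); the prediction
# is the combined output of all trees.
clf = xgb.XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
clf.fit(X_train, y_train)
print("accuracy on held-out data:", clf.score(X_test, y_test))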