2018
DOI: 10.1007/978-3-319-96133-0_28
Flow Prediction Versus Flow Simulation Using Machine Learning Algorithms

Cited by 13 publications (8 citation statements) | References 13 publications
“…The extreme gradient boosting (XGBoost) method is used for classification. XGBoost controls overfitting by using a regularized model formalization, which results in better performance compared to previous boosted algorithms [85]. XGBoost has a few hyperparameters, such as nrounds (the number of trees), eta (the learning rate) and depth (the depth of the tree) [86], that can be optimized to improve performance.…”
Section: Methods (mentioning)
confidence: 99%
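To make the hyperparameters named in the statement above concrete, the sketch below tunes an XGBoost regressor over the number of trees, the learning rate, and the tree depth using the Python xgboost package and scikit-learn's grid search. The parameter grid and the synthetic data are illustrative assumptions, not values from the cited work; the R package exposes the same quantities as nrounds, eta and max_depth.

    # Hedged sketch: tuning the XGBoost hyperparameters mentioned above
    # (number of trees, learning rate, tree depth). Grid values are illustrative.
    import numpy as np
    import xgboost as xgb
    from sklearn.model_selection import GridSearchCV

    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 6))                                   # synthetic predictors
    y = X[:, 0] * 2.0 - X[:, 1] + rng.normal(scale=0.3, size=500)   # synthetic target

    param_grid = {
        "n_estimators": [100, 300, 500],    # number of trees (nrounds in the R package)
        "learning_rate": [0.05, 0.1, 0.3],  # shrinkage (eta)
        "max_depth": [3, 5, 7],             # depth of each tree
    }

    search = GridSearchCV(
        estimator=xgb.XGBRegressor(objective="reg:squarederror"),
        param_grid=param_grid,
        scoring="neg_root_mean_squared_error",
        cv=5,
    )
    search.fit(X, y)
    print("best parameters:", search.best_params_)
    print("best CV RMSE:", -search.best_score_)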
“…XGBoost is an ensemble of decision trees; it is built sequentially, with each tree working to enhance the performance of the prior tree. It is considered superior to previous boosting algorithms because it uses a more regularized model formulation, which allows for better control of overfitting and ultimately improved performance (Cisty and Soldanova, 2018). Huber et al. (2022) have recently shown that a feature-based representation derived from remote sensing can outperform even the most recent deep learning-based yield predictors.…”
Section: Modelling Methodology 2.6.1 Yield Estimation (mentioning)
confidence: 99%
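The sequential idea in the statement above can be illustrated with a minimal boosting loop in which each small regression tree is fit to the residuals left by the trees before it, with a shrinkage factor playing the role of the learning rate. This is a didactic toy (scikit-learn trees, squared-error loss, no regularization terms), not the cited implementation.

    # Toy boosting loop: each tree is trained on the residuals of the current ensemble.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(400, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=400)

    n_trees, learning_rate = 50, 0.1
    prediction = np.zeros_like(y)           # start from a zero model
    trees = []
    for _ in range(n_trees):
        residuals = y - prediction          # what the ensemble still gets wrong
        tree = DecisionTreeRegressor(max_depth=3)
        tree.fit(X, residuals)              # the next tree corrects the prior trees
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)

    print("training RMSE:", np.sqrt(np.mean((y - prediction) ** 2)))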
“…(Table of related work: machine learning methods; input data; reference)
- Net solar radiation, heat flux, moisture content, wind speed, mean moisture, mean temperature [6]
- Multiple linear regression, random forest, extreme gradient boosting, and deep learning neural network; meteorological data (precipitation, temperatures and potential evapotranspiration) [7]
- Partial least squares regression and adaptive neuro-fuzzy inference systems; soil moisture, soil temperature, rainfall, wind speed, crop evapotranspiration, radiation, dew point [8]
- Support vector regression and SVR + K-means; moisture, temperature, UV, weather temperature, humidity, and precipitation
…computing, the highly voted predictions are selected as a final output. Random forest empowers an enormous set of weak classifiers to build a robust classifier [15].…”
Section: Methods (mentioning)
confidence: 99%
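The last two sentences of the statement above describe random forest's vote-based aggregation of many weak trees. The sketch below uses scikit-learn on a toy classification dataset (an illustrative stand-in for the sensor and weather variables listed above) and exposes the per-tree votes alongside the forest's final decision.

    # Hedged sketch: random forest as many weak trees whose votes are aggregated.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=300, n_features=6, random_state=1)
    forest = RandomForestClassifier(n_estimators=100, max_depth=3, random_state=1)
    forest.fit(X, y)

    sample = X[:1]                                    # one observation
    votes = np.array([t.predict(sample)[0] for t in forest.estimators_])
    counts = np.bincount(votes.astype(int))
    print("votes per class:", counts)                 # how many trees voted for each class
    print("majority vote:", counts.argmax())          # the highly voted prediction
    print("forest.predict:", forest.predict(sample))  # forest output (averages class probabilities)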
“…Among these methods, M5P regression trees, bagging, random forests, and support vector regression were applied to data from an experimental site in Central Florida according to [5]. Likewise, a comparison of two types of streamflow modelling using machine learning algorithms was performed in [6]. The first one is based only on climatic data (precipitation, temperatures, and potential evapotranspiration), while the second one also integrates the previous flows into the model inputs.…”
Section: Related Work (mentioning)
confidence: 99%
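The distinction drawn in the last statement, climate-only inputs versus inputs that also include previous flows, can be sketched as two feature sets built from the same daily series. The example below is an illustration with synthetic data and a random forest regressor; the column names, lag choice, and split are placeholders, not the setup of the cited study.

    # Hedged sketch: "simulation" uses only climatic inputs, "prediction" adds the lagged flow.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(7)
    n = 1000
    df = pd.DataFrame({
        "precip": rng.gamma(2.0, 2.0, n),   # synthetic precipitation
        "temp": rng.normal(10, 8, n),       # synthetic temperature
        "pet": rng.normal(3, 1, n),         # synthetic potential evapotranspiration
    })
    # A synthetic flow that depends on precipitation and has some persistence.
    precip = df["precip"].to_numpy()
    flow = np.zeros(n)
    for t in range(1, n):
        flow[t] = 0.7 * flow[t - 1] + 0.5 * precip[t] + rng.normal(scale=0.2)
    df["flow"] = flow
    df["flow_lag1"] = df["flow"].shift(1)   # previous day's flow as an extra input
    df = df.dropna()

    train, test = df.iloc[:800], df.iloc[800:]
    for name, cols in {
        "simulation (climate only)": ["precip", "temp", "pet"],
        "prediction (climate + previous flow)": ["precip", "temp", "pet", "flow_lag1"],
    }.items():
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(train[cols], train["flow"])
        rmse = mean_squared_error(test["flow"], model.predict(test[cols])) ** 0.5
        print(f"{name}: test RMSE = {rmse:.3f}")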