DOI: 10.1109/access.2021.3082519

Ensembles of Gradient Boosting Recurrent Neural Network for Time Series Data Prediction

Abstract: This work was supported in part by the project "Situation analysis and demonstration application of saffron cultivation and soil nutrients based on big data mining technology" (201900014) and the provincial project "Data Fusion Analysis and Intelligent Regulation of Saffron Growth Environment Monitoring" (LGN19C130002). ABSTRACT: Ensemble deep learning can combine the strengths of neural networks and ensemble learning, and it is gradually becoming a new research direction. However, the existing methods either lack theoretical support o…
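As a rough orientation only, the general idea named in the title (gradient boosting with recurrent weak learners) can be sketched as a residual-fitting loop over small GRU networks. The sketch below is an assumed illustration in PyTorch, not the algorithm proposed in the paper; the learner sizes, learning rates, and residual update are all placeholder choices.

```python
# Assumed illustration of "gradient boosting with recurrent weak learners":
# each small GRU is fit to the residual left by the current ensemble.
# This is NOT the paper's algorithm, only a generic residual-boosting loop.
import torch
import torch.nn as nn

def make_weak_gru(hidden=8):
    # A tiny recurrent weak learner: GRU encoder + linear head.
    return nn.ModuleDict({
        "gru": nn.GRU(input_size=1, hidden_size=hidden, batch_first=True),
        "head": nn.Linear(hidden, 1),
    })

def weak_forward(m, x):
    out, _ = m["gru"](x)            # x: (batch, window, 1)
    return m["head"](out[:, -1, :])

def fit_boosted_grus(X, y, n_learners=5, shrinkage=0.1, epochs=100):
    """X: (samples, window, 1) float tensor, y: (samples, 1) targets."""
    learners, residual = [], y.clone()
    for _ in range(n_learners):
        m = make_weak_gru()
        opt = torch.optim.Adam(m.parameters(), lr=1e-2)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(weak_forward(m, X), residual)
            loss.backward()
            opt.step()
        with torch.no_grad():
            residual = residual - shrinkage * weak_forward(m, X)  # boosting update
        learners.append(m)
    return learners

def predict_boosted(learners, X, shrinkage=0.1):
    with torch.no_grad():
        return shrinkage * sum(weak_forward(m, X) for m in learners)
```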

Cited by 7 publications (3 citation statements)
References 28 publications
“…Some traditional machine learning techniques, such as random forest, SVM (support vector machine), Bayesian networks, and logistic regression, have been employed to improve predictive performance in identifying early clinical deterioration [27]. However, these traditional models are not optimized for handling the unique characteristics of time series data, such as autocorrelation, seasonality, and trend patterns [28, 29].…”
Section: Related Work
confidence: 99%
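The point made in this citation is easy to see in a small sketch: a plain random forest has no built-in notion of temporal order, so autocorrelation, seasonality, and trend must be supplied as hand-made lag and calendar features. The synthetic series and all parameter choices below are illustrative assumptions, not taken from the cited works.

```python
# Illustrative sketch: a RandomForestRegressor applied to a time series only
# "sees" temporal structure through engineered lag and seasonality features.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
t = np.arange(n)
# Synthetic series with trend + weekly seasonality + noise (assumed example data).
y = 0.01 * t + np.sin(2 * np.pi * t / 7) + rng.normal(scale=0.2, size=n)

df = pd.DataFrame({"y": y})
for lag in (1, 2, 7):                 # autocorrelation injected via lag features
    df[f"lag_{lag}"] = df["y"].shift(lag)
df["day_of_week"] = t % 7             # crude seasonality feature
df = df.dropna()

X, target = df.drop(columns="y"), df["y"]
split = int(len(df) * 0.8)            # time-ordered split, no shuffling
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X.iloc[:split], target.iloc[:split])
print("held-out R^2:", model.score(X.iloc[split:], target.iloc[split:]))
```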
“…Gradient boosting models are alternatives to specialized models, such as long short-term memory network (LSTM) and gated recurrent unit (GRU) [31, 32]. Although these models are not ideal for time series forecasting, they are still generally better suited for handling sequential data compared to non-sequential algorithms (such as random forest, SVM, logistic regression, and naive Bayes) [29].…”
Section: Related Work
confidence: 99%
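For contrast with gradient boosting models, a minimal recurrent forecaster of the kind named in this citation (a GRU doing one-step-ahead prediction) might look like the following sketch. The window length, hidden size, and synthetic sine-wave data are assumptions made for illustration, not details from the cited works.

```python
# Minimal sketch of a GRU one-step-ahead forecaster (illustrative only).
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, window, 1)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])    # predict the next value

# Toy training loop on a synthetic sine wave (assumed example data).
series = torch.sin(torch.linspace(0, 20, 400))
window = 24
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = GRUForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```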
“…Stacking is one of the usual approaches to combining models, training an ensemble model on the classification task and enabling convenient experimentation and model comparison. Even though Light Gradient Boosting is faster, its performance is equivalent [71][72][73]. The framework of the developed method is presented in Fig.…”
Section: F Extreme Gradient Boosting
confidence: 99%
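A minimal stacking sketch along the lines described in this citation, assuming scikit-learn plus the lightgbm and xgboost packages are available; the estimator choices, hyperparameters, and synthetic data are illustrative, not taken from the cited work.

```python
# Illustrative stacking sketch: tree-based base learners combined by a
# logistic-regression meta-learner via scikit-learn's StackingClassifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from lightgbm import LGBMClassifier   # assumed installed
from xgboost import XGBClassifier     # assumed installed

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("xgb", XGBClassifier(n_estimators=200, eval_metric="logloss")),
        ("lgbm", LGBMClassifier(n_estimators=200)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),  # meta-learner
    cv=5,
)
stack.fit(X_tr, y_tr)
print("stacked test accuracy:", stack.score(X_te, y_te))
```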