2019
DOI: 10.1007/s00521-019-04387-3

Multi-step time series prediction intervals using neuroevolution

Abstract: Multi-step time series forecasting (TSF) is a crucial element to support tactical decisions (e.g., designing production or marketing plans several months in advance). While most TSF research addresses only single-point prediction, prediction intervals (PIs) are useful to reduce uncertainty related to important decision-making variables. In this paper, we explore a large set of neural network methods for multi-step TSF that directly optimize PIs. This includes multi-step adaptations of recently proposed PI …
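
The abstract refers to neural methods that directly optimize PIs. A common family of objectives in the PI literature is the coverage-width-based criterion (CWC), which trades interval width (PINAW) against coverage (PICP); the sketch below is illustrative and not necessarily the exact loss used in the paper, and the penalty weight eta is an arbitrary assumption.

```python
import numpy as np

def picp(y, lower, upper):
    """Prediction interval coverage probability: fraction of targets inside [lower, upper]."""
    return np.mean((y >= lower) & (y <= upper))

def pinaw(y, lower, upper):
    """Prediction interval normalized average width."""
    return np.mean(upper - lower) / (np.max(y) - np.min(y))

def cwc(y, lower, upper, alpha=0.05, eta=50.0):
    """Coverage-width-based criterion (illustrative variant): rewards narrow
    intervals, with an exponential penalty once coverage drops below 1 - alpha.
    The paper's exact PI objective may differ."""
    cov, width = picp(y, lower, upper), pinaw(y, lower, upper)
    penalty = 1.0 if cov < 1 - alpha else 0.0
    return width * (1 + penalty * np.exp(eta * (1 - alpha - cov)))
```

Variants of CWC differ in the exact penalty form; the common idea is that narrow intervals are rewarded only while coverage stays at or above the nominal 1 - alpha level.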

Cited by 12 publications (4 citation statements)
References 28 publications
“…This can discard redundant features (lasso) or shrink them to a minimal value (elastic net); on the other hand, these methods still detect interactions among features [81,82]. Many studies have compared the performance of different feature selection methods with different classifiers and concluded that there is no 'perfect feature selection method' for all problem types [75,[83][84][85][86][87]. That is why the feature selection method chosen for this study is based on an ensemble approach, where several feature selection methods are combined so that their different strengths complement each other [88].…”
Section: Feature Selection
confidence: 99%
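
The quoted passage describes combining several feature selection methods so their strengths complement one another. Below is a minimal sketch of one such rank-aggregation ensemble, assuming scikit-learn and three illustrative selectors (mutual information, L1-penalized logistic regression, and random forest importances); the specific selectors and the cutoff k are assumptions, not the cited study's exact setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Score features with three heterogeneous selectors (illustrative choices).
mi = mutual_info_classif(X, y, random_state=0)
lasso = np.abs(LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
               .fit(X, y).coef_).ravel()
rf = RandomForestClassifier(random_state=0).fit(X, y).feature_importances_

# Ensemble step: average per-method ranks (rank 0 = best score) so that
# no single method's score scale dominates the final ordering.
ranks = np.mean([np.argsort(np.argsort(-s)) for s in (mi, lasso, rf)], axis=0)
top_k = np.argsort(ranks)[:8]  # keep the 8 best-ranked features (k assumed)
print("selected feature indices:", top_k)
```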
“…XGBoost is an ensemble machine learning algorithm built on decision trees [9,10]. It not only achieves better accuracy, but also adds a regularization term to enhance the generalization ability of the model.…”
Section: XGBoost Model
confidence: 99%
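
As a concrete illustration of the regularization term mentioned above, the xgboost library exposes L2 (reg_lambda) and L1 (reg_alpha) penalties on leaf weights; the hyperparameter values below are illustrative, not those of the cited work.

```python
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Gradient-boosted decision trees; reg_lambda (L2) and reg_alpha (L1)
# regularize the leaf weights, which helps generalization.
model = xgb.XGBRegressor(
    n_estimators=200, max_depth=4, learning_rate=0.1,
    reg_lambda=1.0, reg_alpha=0.1,  # values assumed for illustration
)
model.fit(X_tr, y_tr)
print("test R^2:", model.score(X_te, y_te))
```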
“…We set the number of MLP architectures used in the NN ensemble to N_r = 7, which is also adopted in other multi-step forecasting experiments [69]. Each MLP in the ensemble follows the architecture described in Section 4.3.2 and was trained with 100 epochs of the BFGS algorithm.…”
Section: Modeling Setup
confidence: 99%
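
A minimal sketch of a seven-member MLP ensemble along the lines of the quoted setup, using scikit-learn's MLPRegressor with the L-BFGS solver (a limited-memory BFGS variant) capped at 100 iterations; the hidden-layer sizes and synthetic data are assumptions, not the architecture from Section 4.3.2 of the citing paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=400, n_features=8, noise=0.3, random_state=0)

N_R = 7  # ensemble size taken from the quoted setup
ensemble = [
    MLPRegressor(hidden_layer_sizes=(h,), solver="lbfgs",
                 max_iter=100, random_state=seed).fit(X, y)
    for seed, h in enumerate(range(4, 4 + N_R))  # hidden sizes assumed
]

# Ensemble point forecast: average over members; the spread across members
# can also serve as a rough gauge of predictive uncertainty.
preds = np.stack([m.predict(X) for m in ensemble])
y_hat = preds.mean(axis=0)
print("ensemble train RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```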