2016
DOI: 10.1016/j.apenergy.2015.09.087

Extended forecast methods for day-ahead electricity spot prices applying artificial neural networks

Cited by 238 publications (122 citation statements)
References 44 publications
“…As has been noted in a number of studies, be it statistical or computational intelligence, a key point in EPF is the appropriate choice of explanatory variables [1,4-11]. The typical approach has been to select predictors in an ad hoc fashion, sometimes using expert knowledge, seldom based on some formal validation procedures.…”
Section: Introduction (mentioning; confidence: 99%)
“…Amjady and Keynia [4] proposed a feature selection algorithm that utilized the mutual information technique (for later applications, see, e.g., [11,14,15]). In an econometric setup, Gianfreda and Grossi [5] computed p-values of the coefficients of a regression model with autoregressive fractionally integrated moving average disturbances (Reg-ARFIMA) and in one step eliminated all statistically insignificant variables.…”
Section: Introduction (mentioning; confidence: 99%)
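The mutual-information screening idea mentioned above can be sketched with a plain histogram estimator. This is only an illustration, not Amjady and Keynia's algorithm; the variable names, the bin count, and the 0.1-nat selection threshold are all assumptions:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(x; y) in nats (coarse, biased upward)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # joint distribution
    px = pxy.sum(axis=1, keepdims=True)      # marginal of x
    py = pxy.sum(axis=0, keepdims=True)      # marginal of y
    nz = pxy > 0                             # skip empty bins, avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
price = rng.normal(size=5000)                # synthetic target series
load = price + 0.3 * rng.normal(size=5000)   # informative candidate predictor
noise = rng.normal(size=5000)                # irrelevant candidate predictor

scores = {"load": mutual_information(load, price),
          "noise": mutual_information(noise, price)}
selected = [name for name, s in scores.items() if s > 0.1]
```

Candidates whose score stays near the estimator's bias floor are dropped; in practice the threshold would need calibration (e.g., against shuffled surrogates).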
“…By 'putting big data analytics to work', as they referred to it, Ludwig et al. [10] used random forests and the least absolute shrinkage and selection operator (LASSO or Lasso) as a feature selection algorithm to choose which of the 77 available weather stations were relevant. In parallel, in the machine learning literature, González et al. [11] utilized random forests, while Keles et al. [7] combined the k-Nearest-Neighbor algorithm with backward elimination to select the most appropriate inputs out of more than 50 fundamental parameters or lagged versions of these parameters.…”
Section: Introduction (mentioning; confidence: 99%)
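Lasso-based screening of the kind attributed to Ludwig et al. can be illustrated with a bare cyclic coordinate-descent solver: predictors whose coefficient is shrunk exactly to zero are discarded. This is a numpy sketch, not the tuned solvers used in the cited studies; the synthetic data and the penalty `alpha` are arbitrary choices:

```python
import numpy as np

def lasso_select(X, y, alpha=0.1, n_iter=200):
    """Return indices of features with a nonzero Lasso coefficient."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize columns
    y = y - y.mean()
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]     # partial residual w.r.t. j
            rho = X[:, j] @ r / n
            # soft-thresholding update (columns have unit mean square)
            w[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0)
    return np.flatnonzero(w)

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))                  # six candidate predictors
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=500)
selected = lasso_select(X, y)                  # only features 0 and 3 matter
```

Sweeping `alpha` over a grid (or picking it by cross-validation) trades off how aggressively weak predictors are pruned.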
“…In the machine learning literature, Amjady and Keynia [5] proposed a feature selection algorithm based on mutual information that was later utilized in [6,7], among others. On the other hand, Gianfreda and Grossi [8] used a very simple technique: single-step elimination of insignificant predictors in a regression setting.…”
Section: Introduction (mentioning; confidence: 99%)
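The single-step elimination attributed to Gianfreda and Grossi can be mimicked in ordinary least squares by fitting once and dropping every regressor whose |t|-statistic falls below a critical value. This numpy sketch uses plain OLS t-statistics with a normal-approximation cutoff of 1.96, not the Reg-ARFIMA p-values of the cited work; the data and names are hypothetical:

```python
import numpy as np

def significant_predictors(X, y, t_crit=1.96):
    """One OLS fit; keep regressors with |t| >= t_crit (intercept excluded)."""
    n, p = X.shape
    Xc = np.column_stack([np.ones(n), X])       # prepend intercept column
    XtX_inv = np.linalg.inv(Xc.T @ Xc)
    beta = XtX_inv @ Xc.T @ y                   # OLS coefficients
    resid = y - Xc @ beta
    sigma2 = resid @ resid / (n - p - 1)        # residual variance estimate
    se = np.sqrt(sigma2 * np.diag(XtX_inv))     # coefficient standard errors
    t = beta / se
    return np.flatnonzero(np.abs(t[1:]) >= t_crit)  # skip the intercept

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 5))                   # five candidate predictors
y = 1.2 * X[:, 1] + 0.8 * X[:, 4] + rng.normal(size=400)
kept = significant_predictors(X, y)
```

Because all insignificant variables are removed in a single pass, the result can differ from stepwise backward elimination, which refits after each drop.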