2020
DOI: 10.1007/s00500-020-04888-7
Application and research for electricity price forecasting system based on multi-objective optimization and sub-models selection strategy

Cited by 11 publications (2 citation statements) · References 62 publications
“…where α and α* are the variables linked with the constraints (dual-problem solving as described above), which can exhibit values greater than zero and less than the adjustment hyperparameter C. K(α_i, α_j) refers to the kernel function application (well known as the kernel trick), which satisfies Mercer's conditions and transforms the data with nonlinearity into a higher-dimensional feature space where linear separation is possible [54].…”
Section: Support Vector Machine: Regression (SVR)
Confidence: 99%
“…where α and α* are the dual variables associated with the constraints that can take values greater than zero and less than the penalty hyperparameter C, K(α_i, α_j) corresponds to the application of a kernel function (well known as the kernel trick) that satisfies Mercer's conditions and transforms the nonlinear data into a higher-dimensional feature space to make a linear separation possible [29]. For this case, a Gaussian radial basis function (G-RBF) [30] as expressed in Equation (5) was used:…”
Section: Support Vector Machine: Regression (SVR)
Confidence: 99%
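The quoted passages describe ε-SVR: dual variables α and α* bounded above by the penalty C, and a Gaussian RBF kernel in place of an explicit high-dimensional mapping. A minimal sketch of that setup, assuming scikit-learn and illustrative data and hyperparameter values (C, ε, γ are not taken from the cited papers):

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative 1-D regression data (not from the cited papers).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

# kernel="rbf" applies K(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2),
# the Gaussian radial basis function mentioned in the excerpt: the
# "kernel trick" that makes linear separation possible in feature space.
C = 10.0
model = SVR(kernel="rbf", C=C, epsilon=0.05, gamma=0.5)
model.fit(X, y)

# dual_coef_ holds (alpha - alpha*) for each support vector; as the
# excerpt states, each dual variable lies between 0 and C, so the
# combined coefficient magnitudes cannot exceed C.
assert np.all(np.abs(model.dual_coef_) <= C + 1e-9)
print(f"support vectors: {model.support_vectors_.shape[0]}")
```

The box constraint 0 ≤ α, α* ≤ C is visible directly in the fitted model: scikit-learn exposes α − α* as `dual_coef_`, whose entries are bounded by C in absolute value.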