2017 IEEE Manchester PowerTech
DOI: 10.1109/ptc.2017.7980814

A comparison of artificial neural networks and support vector machines for short-term load forecasting using various load types

Cited by 21 publications (15 citation statements). References 7 publications.
“…The mean prediction error for daily peak load in [24] was 4.65% for weekdays and 7.08% for weekends across three different states of Turkey [35]. This mean prediction error was achieved after smoothing the temperature discrepancies throughout the day.…”
Section: Related Work (mentioning)
confidence: 81%
“…Moreover, every proposed model has incorporated some techniques. For example, regression-based approaches usually comprise Autoregressive Integrated Moving Average (ARIMA) [22], Auto-Regressive Moving Average (ARMA) [23], Support Vector Regression (SVR) [24], and Auto-Regressive Moving Average with Exogenous variables (ARMAX) [25]. Nevertheless, these techniques must learn the process from large amounts of preceding data in order to tune their various parameters.…”
Section: Related Work (mentioning)
confidence: 99%
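As a rough illustration of the regression-based family named in the statement above, the sketch below fits an ARIMA model to a synthetic hourly load series with statsmodels and forecasts the next day. The model order, the data, and every parameter choice are illustrative assumptions, not taken from any of the cited works.

```python
# Hedged sketch: ARIMA on a synthetic hourly load series (illustrative only).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(24 * 28)                      # four weeks of hourly samples
# Placeholder load: a daily sinusoidal cycle plus noise.
load = 500 + 80 * np.sin(2 * np.pi * (t % 24) / 24) + rng.normal(0, 10, t.size)

model = ARIMA(load, order=(2, 1, 2)).fit()  # (p, d, q) chosen arbitrarily here
print(model.forecast(steps=24))             # next 24 hourly load values
```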
“…In the iterative training process of DNN, a reasonable weight coefficient ω_s is multiplied when calculating the parameter variation Δθ(k). In this way, the different effects of different samples on parameter adjustment can be reflected through ω_s, as shown in Equations (17) to (19).…”
Section: Sample Weight Assignment Methods Based On Distance (mentioning)
confidence: 99%
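One way to read the weighting idea quoted above is that each sample's gradient contribution is scaled by its coefficient ω_s before the parameters are updated. The sketch below shows that scaling on a plain linear model with squared loss; the model, the loss, and the distance-based weights are assumptions for illustration only (the cited work applies the weighting inside DNN training, per its Equations (17) to (19)).

```python
# Hedged sketch: per-sample weights omega_s scaling gradient contributions.
import numpy as np

def weighted_gradient_step(theta, X, y, omega, lr=0.01):
    """One gradient step for squared loss with per-sample weights omega."""
    residuals = X @ theta - y                      # shape (m,)
    # Each sample's gradient is scaled by its weight before averaging.
    grad = X.T @ (omega * residuals) / omega.sum()
    return theta - lr * grad

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, 100)
omega = np.exp(-rng.uniform(0, 2, 100))            # hypothetical distance-based weights

theta = np.zeros(3)
for _ in range(500):
    theta = weighted_gradient_step(theta, X, y, omega)
print(theta)                                       # close to the true coefficients
```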
“…List of Symbols and Abbreviations: a_l, the output of the l-th layer; W_l, the weight matrix connecting the (l−1)-th and l-th layers; b_l, the bias of the l-th layer; s(x), the activation function; m, the number of samples; dθ(k), the gradient of θ at the k-th update; θ(k−1), the coefficient before the k-th update; ∇θ(k−1)O, the partial derivative of θ's objective function at the k-th update; r(k), the moving average of the squared gradient of the parameters at the k-th update; ρ, the decay rate; Δθ(k), the change amount of the parameter at the k-th update; α(k), the learning rate at the k-th update; θ(k), the parameter after the k-th update; ε, the decay rate of the learning rate; ω_s, the weight coefficient corresponding to the s-th sample; Δθ(k)s, the variation of parameters with sample weights of the s-th sample; Δθ(k)w, the variation of parameters with sample weights at the k-th update.
[…] moving average [9,10], wavelet analysis [11,12], and so on; intelligent methods, such as support vector machine (SVM) [13,14], random forest [15-17], neural networks [18-20], and other intelligent forecasting methods. Other methods, such as hybrid technologies [21-24], are a combination of more than one technology, that is, a combination of traditional and intelligent methods or a combination of different intelligent methods.…”
Section: Auto-regressive Moving (mentioning)
confidence: 99%
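The symbols in the quoted list (a moving average r(k) of squared gradients, a decay rate ρ, a learning rate α(k), and a parameter change Δθ(k)) describe an RMSProp-style adaptive update. The sketch below shows one common form of that rule; the exact formula, the constants, and the toy objective are my assumptions, not the authors' code.

```python
# Hedged sketch: an RMSProp-style update written with the quoted symbols.
import numpy as np

def rmsprop_step(theta, grad, r, alpha=0.01, rho=0.9, eps=1e-8):
    """Return updated parameters theta(k) and squared-gradient average r(k)."""
    r = rho * r + (1 - rho) * grad ** 2          # r(k) = rho*r(k-1) + (1-rho)*g^2
    delta = -alpha * grad / (np.sqrt(r) + eps)   # Delta_theta(k)
    return theta + delta, r

# Toy usage on f(theta) = ||theta||^2 / 2, whose gradient is theta itself.
theta, r = np.array([5.0, -3.0]), np.zeros(2)
for _ in range(2000):
    theta, r = rmsprop_step(theta, theta, r)
print(theta)  # oscillates near the minimizer at the origin
```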
“…Based on the literature review, one can conclude that time series analyses are not effective on highly volatile data [29], and therefore time series methods such as regression models, ARIMA models, GARCH, and hybrid models such as the combination of ARIMA and GARCH using the wavelet transform are not considered for short-term forecasting when dealing with volatile data [25,27]. Instead, machine learning techniques such as artificial neural networks, support vector machines, or random forests have recently been applied in this area with positive outcomes [30-33].…”
Section: Measure Distance (mentioning)
confidence: 99%
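Since the paper indexed on this page compares artificial neural networks with support vector machines for short-term load forecasting, a hedged scikit-learn sketch of that kind of comparison is shown below: an MLP regressor versus an SVR on a synthetic hourly-load task. The features, data, and hyperparameters are placeholders and do not reproduce any of the cited experiments.

```python
# Hedged sketch: MLP vs. SVR on a synthetic short-term load regression task.
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
hour = rng.integers(0, 24, 2000)
temp = rng.normal(15, 8, 2000)                 # placeholder temperature feature
X = np.column_stack([hour, temp])
y = 500 + 80 * np.sin(2 * np.pi * hour / 24) - 5 * temp + rng.normal(0, 10, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(C=100.0, epsilon=1.0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "test MAPE:", mean_absolute_percentage_error(y_te, model.predict(X_te)))
```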