2014
DOI: 10.5120/16315-5553

A Study of Applications of RBF Network

Abstract: Forecasting is a method of making statements about events whose actual outcomes have not yet been observed. It seems to be an easy process but is actually not: it requires extensive analysis of current and past outcomes in order to give timely and accurate forecasts. The Radial Basis Function (RBF) network is a method proposed in machine learning for making predictions and forecasts. It has been used in various real-time applications such as weather forecasting, load forecasting, forecasting about number…
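As a concrete illustration of the forecasting use case described in the abstract, the sketch below predicts the next value of a series from its recent lagged values. The synthetic data, lag window, and the use of kernel ridge regression with an RBF kernel (a common stand-in for a full RBF network) are assumptions for illustration only, not the method evaluated in the paper.

```python
# Minimal sketch: RBF-kernel regression applied to one-step-ahead forecasting.
# The series, lag window, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Synthetic series standing in for e.g. a load or weather signal
series = np.sin(np.linspace(0, 20, 200)) + 0.1 * np.random.randn(200)

# Build lagged feature windows: predict series[t] from the previous `lags` values
lags = 5
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

# RBF-kernel ridge regression as an RBF-style predictor
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5).fit(X[:150], y[:150])
forecast = model.predict(X[150:])   # one-step-ahead predictions on held-out data
```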

Cited by 19 publications (8 citation statements); references 19 publications.
“…These are all listed in the Supplement (Part II, Section 1). Specifically, we selected 10 predictor algorithms: 1) LOLIMOT (Local Linear Model Trees) [35,36], 2) RBF (Radial basis Function) [37], 3) MLP-BP (Multilayer Perceptron-Back propagation) [38,39], 4) LASSOLAR (Least Absolute Shrinkage and Selection Operator -Least Angle Regression) [40,41], 5) RFA (Random Forest Algorithm) [42,43], 6) RNN (Recurrent Neural Network) [44,45], 7) BRR (Bayesian Ridge Regression) [46][47][48], 8) DTC (Decision Tree Classification) [49][50][51], 9) PAR (Passive Aggressive Regression) [52][53][54], 10) Thiel-Sen Regression [55-57] and 11) ANFIS (Adaptive neuro fuzzy inference system) [58,59]. In this work, we automatically adjusted intrinsic parameters such as the number of neurons and number of layers in the predictor algorithms etc.…”
Section: Predictor Algorithms and Utilizing Automated Machine Learning (mentioning)
confidence: 99%
“…In the experiments conducted, the parameters that improved the performance of the developed CCN-based models were found to be the type of kernel function in the hidden layer and the candidate neurons in the pool. RBF is a method developed by drawing inspiration from the stimulus-response behavior observed in biological nerve cells. Like the general artificial neural network architecture, it is defined as three layers: an input layer, a hidden layer, and an output layer [11]. The most important parameters affecting the performance of the developed RBF-based models were found to be the maximum number of neurons in the hidden layer, the radius of the RBF, and the regularization coefficient ( λ ).…”
Section: Method (unclassified)
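The citation statement above names the three parameters that most affect such an RBF model: the maximum number of hidden neurons, the RBF radius, and the regularization coefficient λ. A minimal sketch of an RBF network exposing exactly those knobs follows; the k-means centre selection, Gaussian kernel, and ridge-style output solve are assumptions for illustration, not the implementation used in the cited work.

```python
# Minimal RBF network sketch with the three parameters named above:
# hidden-layer size, RBF radius, and regularization coefficient lambda.
import numpy as np
from sklearn.cluster import KMeans

class RBFNetwork:
    def __init__(self, n_hidden=10, radius=1.0, lam=1e-3):
        self.n_hidden = n_hidden   # maximum number of neurons in the hidden layer
        self.radius = radius       # RBF radius (Gaussian width)
        self.lam = lam             # regularization coefficient (lambda)

    def _design_matrix(self, X):
        # Gaussian activation of every sample against every hidden-layer centre
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.radius ** 2))

    def fit(self, X, y):
        # Hidden layer: centres chosen by k-means over the training inputs
        self.centers_ = KMeans(n_clusters=self.n_hidden, n_init=10).fit(X).cluster_centers_
        Phi = self._design_matrix(X)
        # Output layer: ridge-regularized least squares for the linear weights
        A = Phi.T @ Phi + self.lam * np.eye(self.n_hidden)
        self.weights_ = np.linalg.solve(A, Phi.T @ y)
        return self

    def predict(self, X):
        return self._design_matrix(X) @ self.weights_
```

With the centres fixed, only the output-layer weights are trained, which is what keeps RBF networks comparatively cheap to fit next to fully backpropagated architectures.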
“…A feedforward neural network trained with the backpropagation method, or BPNN (Ahmad, 2012; Kamboj & Avtar, 2013), is one of the most widely used architectures because of its ability to extract complex, nonlinear relationships from data. Radial Basis Function Neural Networks, or RBFNN, on the other hand, have a special architecture and can be used for prediction (Arora et al., 2014; Jimenez et al., 2017). All of these methods have certain drawbacks, such as difficult parameterization and the possibility of overfitting (Borges et al., 2013).…”
Section: Introduction (unclassified)