2006
DOI: 10.1002/scj.20507

An online learning algorithm with dimension selection using minimal hyper basis function networks

Abstract: SUMMARY: In this study, the authors propose HMRAN (Hyper MRAN), for which dimension selection is possible, by extending the minimal resource allocating network (MRAN) for online learning with minimized resources, applying a growth strategy and a pruning strategy for hidden units in a Gaussian radial basis function (GRBF) network. When the input is a multidimensional pattern, it is often the case that dimensions with an explicit relationship to the output are limited to a single section and it is dif…
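To make the growth-and-pruning idea in the abstract concrete, here is a minimal sketch of a Gaussian RBF network with an MRAN-style growth check, assuming hypothetical names and thresholds (e_min for the error threshold, eps for the distance threshold, kappa for the width overlap factor); it illustrates the general technique, not the paper's exact HMRAN procedure.

```python
import numpy as np

class GRBFNet:
    """Gaussian RBF network grown online, MRAN-style (illustrative sketch)."""

    def __init__(self, in_dim, eps=0.5, e_min=0.1):
        self.centers = np.empty((0, in_dim))   # one row per hidden unit
        self.widths = np.empty(0)              # scalar width per unit
        self.weights = np.empty(0)             # output weight per unit
        self.bias = 0.0
        self.eps, self.e_min = eps, e_min

    def predict(self, x):
        if len(self.widths) == 0:
            return self.bias
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        phi = np.exp(-d2 / self.widths ** 2)   # Gaussian activations
        return self.bias + phi @ self.weights

    def maybe_grow(self, x, y, kappa=0.9):
        """Add a hidden unit when the prediction error is large AND the
        input is far from every existing center (the two MRAN-style
        growth criteria)."""
        err = y - self.predict(x)
        dist = (np.min(np.linalg.norm(self.centers - x, axis=1))
                if len(self.widths) else np.inf)
        if abs(err) > self.e_min and dist > self.eps:
            self.centers = np.vstack([self.centers, x])
            self.widths = np.append(self.widths,
                                    kappa * dist if np.isfinite(dist) else 1.0)
            self.weights = np.append(self.weights, err)
        return err

# Toy usage: two samples, each far from existing centers, so two units grow.
net = GRBFNet(in_dim=2)
for x, y in [(np.array([0.0, 0.0]), 1.0), (np.array([2.0, 2.0]), -1.0)]:
    net.maybe_grow(x, y)
print(len(net.widths), net.predict(np.array([0.0, 0.0])))
```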

Cited by 6 publications (4 citation statements). References 13 publications.
“…To overcome this problem with MRAN, the extended minimal resource allocating network (EMRAN) was proposed in [40], but this algorithm lacks reasonable accuracy. In [16], Hyper MRAN (HMRAN) was formulated, with suitable dimension selection for the input data samples and with the hyper radial basis function (Hyper RBF) as the activation function for SHLFNs, to reduce the time-complexity problem of MRAN by using a localized extended Kalman filter approach; the Hyper RBF activation function also ensures high accuracy.…”
Section: F. Hyper Minimal Resource Allocation Network (HMRAN) Algorithm
confidence: 99%
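To illustrate why a hyper radial basis function supports dimension selection, the sketch below compares a standard Gaussian RBF (one scalar width per unit) with a hyper-RBF unit that, as assumed here, carries a diagonal per-dimension scale; driving a scale toward zero effectively removes that dimension from the distance. The names and the diagonal form are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gaussian_rbf(x, center, width):
    # Standard GRBF: one scalar width shared by all input dimensions.
    return np.exp(-np.sum((x - center) ** 2) / width ** 2)

def hyper_rbf(x, center, scales):
    # Hyper RBF (diagonal form assumed here): a separate scale per
    # dimension; a near-zero scale makes the unit insensitive to that
    # dimension, which is what enables dimension selection.
    d = (x - center) * scales
    return np.exp(-np.sum(d ** 2))

x = np.array([0.3, 5.0, -2.0])
c = np.array([0.0, 0.0, 0.0])
# Only the first dimension matters; the other two are suppressed.
print(gaussian_rbf(x, c, width=1.0))
print(hyper_rbf(x, c, scales=np.array([1.0, 1e-3, 1e-3])))
```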
“…The performance of this online sequential learning algorithm for SHLFNs was improved over the MRAN algorithm. The parameters in MRAN are adjusted at each iteration using the extended Kalman filter (EKF) method, which leads to an increase in the hidden-layer neurons during the parameter update process [16]. Also, the size of the covariance matrix used in the EKF is usually very large, which increases the computational complexity of the network structure.…”
Section: G. Generalized Growing and Pruning Radial Basis Function (GGAP-RBF)
confidence: 99%
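As a rough illustration of the covariance-matrix cost mentioned above, the following is a minimal sketch of a single extended Kalman filter update for a flattened parameter vector: the covariance matrix P has one row and column per parameter, so it grows quadratically as hidden units are added, which is the motivation given for the localized EKF in HMRAN. The function name and the noise value r are hypothetical.

```python
import numpy as np

def ekf_update(theta, P, grad, err, r=1.0):
    """One EKF step for a scalar-output network (illustrative sketch).
    theta : flattened parameter vector (all weights, centers, widths)
    P     : covariance matrix, shape (len(theta), len(theta)); its size
            grows quadratically with the number of hidden units
    grad  : gradient of the network output w.r.t. theta at the current input
    err   : scalar output error for the current sample
    r     : assumed measurement-noise variance (illustrative value)
    """
    k = P @ grad / (r + grad @ P @ grad)   # Kalman gain
    theta = theta + k * err                # parameter update
    P = P - np.outer(k, grad @ P)          # covariance update
    return theta, P

# Toy call with 5 parameters; in MRAN the dimension of theta (and of P)
# grows every time a hidden unit is added.
theta, P = ekf_update(np.zeros(5), np.eye(5), grad=np.ones(5), err=0.2)
print(theta)
```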
“…Ever since the pruning algorithm in MRAN was introduced, the MRAN algorithm has been successfully applied and, furthermore, improved upon to attain optimal neuron configurations in the hidden layer, as in EMRAN and HMRAN developed by Li et al. and Nishida et al. in [9] and [10], respectively. By removing unnecessary neurons in the hidden layer of the network, minimal computational power is required in the actual implementation of the system.…”
Section: Minimal Resource Allocation Network
confidence: 99%
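A minimal sketch of the pruning idea referred to here, assuming the usual MRAN-style criterion of removing hidden units whose normalized output contribution stays below a threshold for a window of consecutive observations; the function name, threshold delta, and window layout are illustrative assumptions.

```python
import numpy as np

def prune_indices(phi_history, delta=0.01):
    """Return indices of hidden units whose normalized output contribution
    stayed below `delta` for every observation in the window.
    phi_history : array of shape (n_obs, n_units) holding
                  |weight * activation| per unit and observation.
    """
    # Normalize each observation by its largest contribution.
    norm = phi_history / (phi_history.max(axis=1, keepdims=True) + 1e-12)
    below = (norm < delta).all(axis=0)   # below threshold for the whole window
    return np.where(below)[0]

# Toy example: unit 2 contributes almost nothing over the window, so it
# would be pruned, reducing the computation needed at run time.
hist = np.array([[0.8, 0.5, 0.001],
                 [0.9, 0.4, 0.002],
                 [0.7, 0.6, 0.001]])
print(prune_indices(hist))   # -> [2]
```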