2000
DOI: 10.1049/ip-cta:20000549

Analysis of minimal radial basis function network algorithm for real-time identification of nonlinear dynamic systems

Abstract: This paper first presents a performance analysis of the recently developed Minimal Resource Allocating Network (MRAN) algorithm for on-line identification of nonlinear dynamic systems. Using nonlinear time-invariant and time-varying identification benchmark problems, MRAN's performance is compared with the recently proposed On-line Structural Adaptive Hybrid Learning (ONSAHL) algorithm of Junge and Unbehauen. The results indicate that MRAN realizes networks using fewer hidden neurons than the ONSAHL algorithm with…


Cited by 97 publications (61 citation statements)
References 17 publications
“…Ever since the pruning algorithm in MRAN was introduced, the MRAN algorithm has been successfully applied and furthermore, improvised to attain optimal neuron configurations in the hidden layer as in EMRAN and HMRAN developed by Li et al and Nishida et al in [9], [10] respectively. By removing unnecessary neurons in the hidden layer of the network, minimal computation power is required in the actual implementation of the system.…”
Section: Minimal Resource Allocation Network (mentioning)
confidence: 99%
“…If the growing criteria in (17) and (18) are not met, only the network parameters of the nearest node to the current inputs are updated using the extended Kalman filter (EKF) as follows [9] …”
Section: Adjusting the Network Parameters With EKF (mentioning)
confidence: 99%
“…To obtain the desired behavior y_i responding to the input x_i, where i is the sample index, the neural network is trained using a sequential algorithm with a Growing and Pruning strategy (GAP) [9] …”
Section: B. Sequential Network Training Algorithm (mentioning)
confidence: 99%
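The growing-and-adjusting mechanism the citation statements refer to can be sketched roughly as follows. This is a hypothetical illustration, not the published algorithm: the class, method names, and threshold values are ours, the pruning step of MRAN is omitted, and a simple LMS correction of the nearest neuron stands in for the extended Kalman filter (EKF) update described in [9].

```python
import numpy as np

def gaussian(x, c, w):
    """Gaussian RBF activation with center c and width w."""
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * w ** 2))

class MinimalRBF:
    """Sketch of an MRAN-style sequentially growing RBF network.

    Growing criteria (per sample): allocate a new hidden neuron when the
    prediction error exceeds e_min AND the input lies farther than eps
    from the nearest existing center. Otherwise, only the nearest
    neuron's output weight is adjusted (LMS here, in place of EKF).
    Pruning of inactive neurons is omitted for brevity.
    """
    def __init__(self, e_min=0.1, eps=0.5, kappa=0.9, lr=0.05):
        self.centers, self.widths, self.alphas = [], [], []
        self.e_min, self.eps, self.kappa, self.lr = e_min, eps, kappa, lr

    def predict(self, x):
        return sum(a * gaussian(x, c, w)
                   for a, c, w in zip(self.alphas, self.centers, self.widths))

    def observe(self, x, y):
        e = y - self.predict(x)
        if self.centers:
            dists = [np.linalg.norm(x - c) for c in self.centers]
            nearest, d = int(np.argmin(dists)), min(dists)
        else:
            nearest, d = None, np.inf
        if abs(e) > self.e_min and d > self.eps:
            # growing criteria met: allocate a new neuron at the input,
            # width proportional to the distance to the nearest center
            self.centers.append(np.array(x, dtype=float))
            self.widths.append(self.kappa * d if np.isfinite(d) else 1.0)
            self.alphas.append(e)
        elif nearest is not None:
            # criteria not met: adjust only the nearest neuron's weight
            # (LMS stand-in for the EKF update of the actual algorithm)
            phi = gaussian(x, self.centers[nearest], self.widths[nearest])
            self.alphas[nearest] += self.lr * e * phi
        return e

# Usage: sequentially identify a 1-D nonlinear map; the hidden layer
# grows only where the novelty criteria are satisfied.
net = MinimalRBF()
for _ in range(3):
    for x in np.linspace(0, 2 * np.pi, 50):
        net.observe(np.array([x]), float(np.sin(x)))
```

Because a new center is only allocated when it is farther than `eps` from all existing ones, the hidden layer stays compact, which is the property the quoted statements highlight about MRAN and its EMRAN/HMRAN refinements.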