2012
DOI: 10.1016/j.neucom.2011.07.013
An adaptive wavelet differential neural networks based identifier and its stability analysis

Cited by 15 publications (8 citation statements)
References 20 publications
“…The optimal weight derived from this stage is used to generate an initial population of 100 for a GA, which was run for 500 generations to obtain the optimal values of the unknown parameters of the WNN. This is a case of a non-linear MIMO system, which is more challenging than the SISO systems most recently reported in the literature [8,18,28,41], and the training approach was adopted in order to avoid the problem of being trapped in a local-minimum solution. Moreover, it was shown after some heuristic study that the proposed approach is faster.…”
Section: The Term δϕ(X)/δX
Mentioning confidence: 99%
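
The two-stage training described above (a gradient-trained weight vector seeding a GA population of 100, evolved for 500 generations) can be illustrated with a minimal sketch. This is not the citing authors' code; the fitness function, mutation scale and selection scheme are assumptions made only for illustration.

```python
# Minimal sketch (assumptions, not the cited authors' code): seed a GA population
# of 100 from a gradient-trained WNN weight vector and evolve it for 500
# generations. The fitness function and mutation/crossover settings are assumed.
import numpy as np

def ga_refine(w_init, fitness, pop_size=100, generations=500,
              sigma=0.05, mutation_rate=0.1, rng=np.random.default_rng(0)):
    """Refine a WNN parameter vector w_init with a simple real-coded GA."""
    dim = w_init.size
    # Initial population: perturbed copies of the gradient-stage solution.
    pop = w_init + sigma * rng.standard_normal((pop_size, dim))
    pop[0] = w_init                                   # keep the seed itself
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        order = np.argsort(fit)                       # lower error is better
        parents = pop[order[:pop_size // 2]]          # truncation selection
        # Uniform crossover between randomly paired parents.
        idx = rng.integers(0, parents.shape[0], size=(pop_size, 2))
        mask = rng.random((pop_size, dim)) < 0.5
        children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
        # Gaussian mutation on a fraction of the genes, plus elitism.
        mut = rng.random((pop_size, dim)) < mutation_rate
        children = children + mut * sigma * rng.standard_normal((pop_size, dim))
        children[0] = parents[0]
        pop = children
    fit = np.array([fitness(ind) for ind in pop])
    return pop[np.argmin(fit)]

# Usage with a stand-in fitness (e.g., summed squared identification error):
w0 = np.zeros(12)
best = ga_refine(w0, fitness=lambda w: np.sum((w - 1.0) ** 2))
```
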
“…Methods based on neural networks are effective when applied to the identification of dynamic nonlinear systems, and they are also effective in combinatorial optimization [15], optimization of non-stationary functions [16], multi-objective optimization [17], and bioinformatics [18]. The WNN, with its merits of local optimization, approximation and self-learning in the time-frequency domain, provides a framework that can be successfully applied to dynamic models [19][20][21][22] and nonlinearity control [23][24][25][26]. In [19], the wavelet representation with high-efficiency coupled map lattices trains the basic structure of the WNN, and a novel two-stage hybrid training scheme is developed for constructing a parsimonious WNN to improve the identification efficiency.…”
Section: Introduction
Mentioning confidence: 99%
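
The wavelet neural network referred to here computes its output as a weighted sum of dilated and translated wavelets. The following is a minimal sketch of a single-output WNN forward pass, using the Mexican-hat mother wavelet as an assumed choice (the cited papers may use other wavelets); the dilations a, translations b and output weights w are the parameters that the training schemes above would tune.

```python
# Minimal sketch (assumed model form, not the paper's exact network):
# y = sum_i w_i * prod_j psi((x_j - b_ij) / a_ij) with a Mexican-hat wavelet.
import numpy as np

def mexican_hat(z):
    """Mexican-hat (Ricker) mother wavelet, applied elementwise."""
    return (1.0 - z ** 2) * np.exp(-0.5 * z ** 2)

def wnn_forward(x, a, b, w):
    """x: (n_inputs,); a, b: (n_wavelons, n_inputs); w: (n_wavelons,)."""
    z = (x - b) / a                        # translate and dilate each input
    psi = np.prod(mexican_hat(z), axis=1)  # product wavelon over input dimensions
    return float(w @ psi)

# Tiny usage example with 3 wavelons and 2 inputs.
rng = np.random.default_rng(1)
a = np.abs(rng.standard_normal((3, 2))) + 0.5
b = rng.standard_normal((3, 2))
w = rng.standard_normal(3)
print(wnn_forward(np.array([0.3, -0.7]), a, b, w))
```
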
“…In [19], the wavelet representation with high-efficiency coupled map lattices trains the basic structure of the WNN, and a novel two-stage hybrid training scheme is developed for constructing a parsimonious WNN to improve the identification efficiency. A differential evolution algorithm in [20] and a storage strategy in [21] are used to optimize the WNN parameters in various ways, which guarantees the performance and robustness of the system in the sense of Lyapunov stability.…”
Section: Introduction
Mentioning confidence: 99%
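
For the differential evolution approach mentioned in connection with [20], the sketch below shows a standard DE/rand/1/bin loop applied to a flat WNN parameter vector. The variant, control parameters (F, CR), bounds and objective are assumptions for illustration, not the cited paper's settings.

```python
# Minimal DE/rand/1/bin sketch for tuning a flat WNN parameter vector, in the
# spirit of the DE-based scheme cited as [20]. All settings here are assumed.
import numpy as np

def de_optimize(objective, dim, pop_size=30, generations=200,
                F=0.5, CR=0.9, bounds=(-2.0, 2.0), rng=np.random.default_rng(0)):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    cost = np.array([objective(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals different from the target i.
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # Binomial crossover with at least one gene taken from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            c = objective(trial)
            if c <= cost[i]:                  # greedy selection
                pop[i], cost[i] = trial, c
    return pop[np.argmin(cost)]

# Usage: recover a target parameter vector through a squared-error objective.
target = np.linspace(-1, 1, 8)
best = de_optimize(lambda p: np.sum((p - target) ** 2), dim=8)
```
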
“…As a result, training algorithms for WNNs typically converge in a smaller number of iterations than those for ANNs; (ii) it has been argued that, for a given quality of approximation, fewer nodes may be required for WNNs than for ANNs using sigmoidal functions. Moreover, many researchers have studied the properties and applications of WNN models for representing nonlinear systems [15][16][17][18][19][20][21].…”
Section: Introduction
Mentioning confidence: 99%