2018
DOI: 10.1155/2018/7353171
Particle Swarm Optimization Iterative Identification Algorithm and Gradient Iterative Identification Algorithm for Wiener Systems with Colored Noise

Abstract: This paper considers the parameter identification of Wiener systems with colored noise. The difficulty in the identification is that the model is nonlinear and the intermediate variable cannot be measured. Particle swarm optimization is an artificial-intelligence evolutionary method that is effective for solving nonlinear optimization problems. In this paper, we obtain the identification model of the Wiener system and then transform the parameter identification problem into an optimization problem. Then, we derive…
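The abstract casts identification as an optimization problem solved by particle swarm optimization. Below is a minimal, generic PSO sketch, not the paper's exact variant; the quadratic cost and the true parameter vector `theta_true` are illustrative stand-ins for the Wiener identification criterion.

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (generic sketch, not the paper's variant)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # per-particle best positions
    pbest_val = np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()        # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + cognitive pull toward pbest + social pull toward g
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Hypothetical quadratic cost standing in for the identification criterion
theta_true = np.array([0.8, -0.4])
cost = lambda th: np.sum((th - theta_true) ** 2)
theta_hat, best = pso(cost, dim=2)
```

In the identification setting, `cost` would instead be the sum of squared output residuals of the Wiener model over the measured data.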

Cited by 16 publications (14 citation statements). References 52 publications.
“…Remark 2. Based on equations (12) and (13), we see that each cost function has different weights, and the latest data have larger weights. Taking the conditional expectation on both sides of (13) yields…”
Section: Weighted LS Algorithm: Rewrite the Cost Function As
confidence: 99%
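The excerpt describes a cost in which newer data carry larger weights. A common way to realize this is an exponential forgetting factor; the sketch below is an illustration on synthetic data, where the forgetting factor λ, the regressor matrix, and the true `theta` are assumptions for demonstration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([1.5, -0.7])                 # assumed true parameters
N = 200
Phi = rng.standard_normal((N, 2))             # regressor matrix
y = Phi @ theta + 0.05 * rng.standard_normal(N)

lam = 0.98                                    # forgetting factor (assumption)
w = lam ** np.arange(N - 1, -1, -1)           # newest sample gets weight 1
W = np.diag(w)

# Weighted LS: minimize sum_t w_t * (y_t - phi_t' theta)^2
theta_hat = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y)
```

With λ < 1 the weights decay geometrically into the past, so the estimate tracks recent data more closely, which is useful for time-varying parameters.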
“…Once the structure is determined, parameter estimation is involved [5][6][7]. Recently, many parameter estimation algorithms have been proposed, including the least squares (LS) algorithm [8,9], the gradient algorithm [10,11], the particle swarm optimization algorithm, and the expectation maximization algorithm [12,13]. Among these algorithms, the LS algorithm finds a vector that is a local minimizer of a sum of squares; it is therefore the simplest and the most widely used.…”
Section: Introduction
confidence: 99%
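The excerpt characterizes LS as finding the minimizer of a sum of squares. For a linear-in-parameters model y = Φθ + v, that minimizer has the familiar closed form; the sketch below uses synthetic data (the regressors and `theta` are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)
theta = np.array([2.0, -1.0, 0.5])            # assumed true parameters
Phi = rng.standard_normal((100, 3))           # regressor matrix
y = Phi @ theta + 0.01 * rng.standard_normal(100)

# LS estimate: argmin_theta ||y - Phi theta||^2
theta_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```

`lstsq` solves the normal equations Φ'Φθ = Φ'y in a numerically stable way via the SVD.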
“…It is difficult to identify the Wiener model because of its nonlinear characteristic. Additionally, the existing studies mostly concern systems without noise or with output noise only [7][8][9]. Lamia et al. described the Wiener model using the polynomial nonlinear state space (PNLSS) model and developed an output-error identification method for the nonlinear block [10].…”
Section: Introduction
confidence: 99%
“…The core idea of adaptive parameter estimation is to adjust the parameters recursively using the residuals, so that the estimates approach the true values. Within this adaptive framework, the recursive least squares algorithms [13,14], the stochastic gradient algorithms [15], and the iterative algorithms [16,17] are well developed and underpin several heuristic or bio-inspired learning algorithms [18][19][20]. For example, adaptive algorithms such as gradient descent are used to learn the weights in neural networks or to evaluate fitness in genetic algorithms.…”
Section: Introduction
confidence: 99%
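The residual-driven recursive adjustment described in the excerpt can be illustrated with a standard recursive least squares (RLS) update, in which each new residual corrects the current estimate through a gain vector. This is a textbook sketch on synthetic data, not the cited algorithms themselves; `theta`, the noise level, and the regressors are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = np.array([0.6, 0.3])      # assumed true parameters
P = 1000.0 * np.eye(2)            # large initial covariance (weak prior)
th = np.zeros(2)                  # current estimate

for t in range(500):
    phi = rng.standard_normal(2)                  # regressor at time t
    y = phi @ theta + 0.05 * rng.standard_normal()  # noisy measurement
    e = y - phi @ th                              # residual (innovation)
    K = P @ phi / (1.0 + phi @ P @ phi)           # gain vector
    th = th + K * e                               # correct estimate with residual
    P = P - np.outer(K, phi) @ P                  # covariance update
```

Each step moves the estimate in proportion to the residual, which is exactly the adaptation principle the passage describes.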
“…In this simulation, the input u(t) is taken as an uncorrelated persistent-excitation signal sequence with zero mean, and v(t) as a zero-mean white noise sequence with variances σ² = 0.10² and σ² = 0.50². Taking the data length L = 1000, we apply the proposed E-NI algorithm in (17)–(20); the parameter estimates are listed in Tables 1–3, and the parameter estimation errors δ := ‖θ̂(t) − θ‖/‖θ‖ and δ := ‖ϑ̂(t) − ϑ‖/‖ϑ‖ versus t for each algorithm are illustrated in Figures 1–3.…”
confidence: 99%
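The relative estimation error δ := ‖θ̂(t) − θ‖/‖θ‖ used in the excerpt is a normalized Euclidean distance; a small computation makes the definition concrete (the parameter values here are made up for illustration).

```python
import numpy as np

theta = np.array([1.0, -0.5, 0.25])        # assumed true parameters
theta_hat = np.array([0.98, -0.52, 0.26])  # assumed estimate at some time t

# Relative estimation error: ||theta_hat - theta|| / ||theta||
delta = np.linalg.norm(theta_hat - theta) / np.linalg.norm(theta)
```

Normalizing by ‖θ‖ makes the error scale-free, so curves for differently scaled parameter vectors can be compared on one plot.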