2019
DOI: 10.1002/acs.3007

Maximum likelihood gradient identification for multivariate equation‐error moving average systems using the multi‐innovation theory

Abstract: For the multivariate equation-error moving average system, a multivariate maximum likelihood multi-innovation extended stochastic gradient (M-ML-MIESG) algorithm is proposed. The key is to decompose the system into several regressive identification subsystems according to the number of system outputs. Then, a multivariate maximum likelihood extended stochastic gradient algorithm is presented to estimate the parameters of these subsystems. The M-ML-MIESG algorithm has higher parameter estimation accuracy t…
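The abstract outlines the algorithmic structure: decompose the multivariate system into single-output subsystems and run a multi-innovation extended stochastic gradient update on each. Below is a minimal sketch of one such update under stated assumptions; the variable names, the innovation length p, and the normalizer r are illustrative and do not reproduce the paper's notation.

import numpy as np

# Hedged sketch of one multi-innovation extended stochastic gradient (MIESG)
# update for a single output subsystem j. Phi stacks the (extended) regressor
# vectors of the last p sampling instants; Y stacks the corresponding outputs.
# With p = 1 this reduces to the ordinary extended stochastic gradient update.
def miesg_update(theta_hat, r, Phi, Y):
    # theta_hat: current parameter estimate, shape (n,)
    # r:         accumulated regressor energy, used as a step-size normalizer
    # Phi:       regressor matrix, shape (n, p); Phi[:, 0] is the newest regressor
    # Y:         output vector, shape (p,)
    E = Y - Phi.T @ theta_hat                 # multi-innovation vector, shape (p,)
    r = r + np.linalg.norm(Phi[:, 0]) ** 2    # update the normalizer
    theta_hat = theta_hat + (Phi @ E) / r     # gradient step using all p innovations
    return theta_hat, r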

Cited by 4 publications (4 citation statements). References 67 publications (98 reference statements).
“…As a comparison, we give the decomposition‐based multivariate maximum likelihood stochastic gradient (D‐M‐ML‐SG) algorithm for the multivariate equation‐error autoregressive system based on the work in Reference 91. Then we can obtain the D‐M‐ML‐SG algorithm for estimating $\boldsymbol{\vartheta}(k)$:
$$\hat{\boldsymbol{\vartheta}}_j(k)=\hat{\boldsymbol{\vartheta}}_j(k-1)+\frac{\hat{\boldsymbol{\varphi}}_{jf}(k)}{r_j(k)}\,e_j(k),$$
$$r_j(k)=r_j(k-1)+\left\Vert \hat{\boldsymbol{\varphi}}_{jf}(k)\right\Vert^2,$$
$$e_j(k)=y_j(k)-\hat{\boldsymbol{\varphi}}_j^{T}(k)\,\hat{\boldsymbol{\vartheta}}\dots$$ …”
Section: The D‐M‐ML‐SG Algorithm
confidence: 99%
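The quoted recursions form a normalized stochastic gradient update for subsystem j. As a rough illustration only, and assuming the filtered regressor $\hat{\boldsymbol{\varphi}}_{jf}(k)$ and the information vector $\hat{\boldsymbol{\varphi}}_j(k)$ have already been formed (their construction is not shown in the excerpt), one D-M-ML-SG step could be written as:

import numpy as np

# Illustrative D-M-ML-SG step for subsystem j; phi_jf and phi_j are assumed to be
# precomputed 1-D arrays with the same length as theta_hat.
def dmmlsg_step(theta_hat, r, phi_jf, phi_j, y_j):
    e_j = y_j - phi_j @ theta_hat        # innovation e_j(k) = y_j(k) - phi_j^T(k) theta_hat_j(k-1)
    r = r + phi_jf @ phi_jf              # r_j(k) = r_j(k-1) + ||phi_jf(k)||^2
    theta_hat = theta_hat + phi_jf * e_j / r
    return theta_hat, r, e_j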
“…The potential features of the designed scheme are listed below. The parameter product terms of the two nonlinear elements and the linear element must be included in synchronous estimation schemes 10–15 . Unlike synchronous estimation, the presented method decouples the parameter estimation of the two nonlinear elements from that of the linear element using designed hybrid signals, which simplifies the parameter estimation process and avoids decomposing the parameter product terms of the Hammerstein‐Wiener model. Compared with the finite impulse response model used in previous work, 32 the autoregressive exogenous model is applied to model the linear element in the current work, which simultaneously takes the input–output data of the model into account and makes full use of the data information. A correlation analysis technique derived from the cross‐covariance of the input and output signals and the covariance of the input signal is developed; thus, the problem that the intermediate-variable information of the Hammerstein‐Wiener model cannot be measured is solved, and the interference of colored process noise in the estimation is handled. …”
Section: Introduction
confidence: 99%
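The correlation-analysis technique mentioned in the excerpt is built on the cross-covariance of the input and output signals and the covariance of the input signal. A minimal sketch of such sample estimates follows; the estimator form and normalization are assumptions here, not the cited paper's exact definitions.

import numpy as np

# Sample (cross-)covariance estimates of the kind used in correlation analysis.
# cross_covariance(u, y, L) returns R_uy(tau) for tau = 0..L; passing y = u
# gives the input covariance R_uu(tau).
def cross_covariance(u, y, max_lag):
    u = np.asarray(u, dtype=float) - np.mean(u)
    y = np.asarray(y, dtype=float) - np.mean(y)
    N = len(u)
    return np.array([np.dot(u[:N - tau], y[tau:]) / N for tau in range(max_lag + 1)])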
“…Over the last decades, a great deal of work on the estimation of the Hammerstein-Wiener model has been developed. The problem of Hammerstein-Wiener model identification has been addressed using different approaches, including over-parameterization algorithms, 10,11 subspace methods, 12 maximum likelihood techniques, 13,14 iterative algorithms, 15,16 blind approaches, 17,18 frequency-type identification methods, 19,20 and exciting-signal-based techniques. 21,22 Modeling the static nonlinear element remains an important research topic for the Hammerstein-Wiener model, and establishing a model with high precision and good extensibility is a major challenge at present.…”
Section: Introduction
confidence: 99%