2018
DOI: 10.1016/j.automatica.2018.05.015
Tensor network subspace identification of polynomial state space models

Abstract: This article introduces a tensor network subspace algorithm for the identification of specific polynomial state space models. The polynomial nonlinearity in the state space model is completely written in terms of a tensor network, thus avoiding the curse of dimensionality. We also prove how the block Hankel data matrices in the subspace method can be exactly represented by low rank tensor networks, reducing the computational and storage complexity significantly. The performance and accuracy of our subspace ide…
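For readers unfamiliar with subspace identification, the block Hankel data matrix mentioned in the abstract is formed by stacking shifted copies of the measured signals. Below is a minimal NumPy sketch of the conventional (explicit) construction; it is not the low rank tensor network representation proposed in the paper, and the function name block_hankel and its arguments are illustrative only.

```python
import numpy as np

def block_hankel(y, num_block_rows):
    # Stack an N-sample, p-channel signal y (shape (N, p)) into a block
    # Hankel matrix with num_block_rows block rows, as used in classical
    # subspace identification. Column j holds samples j, ..., j + num_block_rows - 1.
    N, p = y.shape
    cols = N - num_block_rows + 1
    H = np.empty((num_block_rows * p, cols))
    for i in range(num_block_rows):
        H[i * p:(i + 1) * p, :] = y[i:i + cols, :].T
    return H

# Example: 1000 samples of a 2-channel signal, 10 block rows.
rng = np.random.default_rng(0)
y = rng.standard_normal((1000, 2))
H = block_hankel(y, 10)
print(H.shape)  # (20, 991)
```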

Cited by 17 publications (20 citation statements). References 17 publications.

“…Tensor algebra is the next evolutionary step in linear algebra since it deals primarily with the simultaneous coupling of three or more vector spaces or with vectors of three or more dimensions [34]. Also, tensors can be used in the identification of non-linear systems [35]-[37]. Tensors and their factorizations have a wide array of applications to various engineering fields.…”
Section: Previous Work (mentioning)
confidence: 99%
“…1,6-11 In case the relationship between input/output data exhibits nonlinear phenomena, a nonlinear model approximation is needed. The nonlinearities can be represented by various models such as basis function expansions, 12 Volterra models, 13-15 Gaussian processes, 16,17 neural networks, 18,19 trees, 20,21 and so on. The nonlinear model can also be approximated by a linear model with time-variant parameters via robust Kalman filter.…”
Section: Introduction (mentioning)
confidence: 99%
“…The intense storage requirement may also explain why only few contributions on the MIMO Volterra system identification can be found in the recent years. 14,15,37,38 Tensor network (TN) can relieve the curse of dimensionality by referring to techniques for multidimensional arrays, that is, tensors. For example, the storage cost of a square n^d × n^d matrix is typically n^{2d}, which can be prohibitively large as d increases.…”
Section: Introduction (mentioning)
confidence: 99%
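As a rough back-of-the-envelope illustration of the storage argument in the quotation above (an independent sketch, not code from the cited works): a dense n^d × n^d matrix requires n^{2d} entries, whereas a tensor-train (matrix product operator) representation with all internal ranks bounded by r needs on the order of d·n²·r² entries. The function names and the chosen values of n, d, and r below are purely illustrative.

```python
def full_matrix_entries(n, d):
    # Dense storage of a square n^d x n^d matrix.
    return n ** (2 * d)

def tt_matrix_entries(n, d, r):
    # Tensor-train / matrix product operator storage with every internal
    # rank equal to r: d cores of size (r, n, n, r), except that the first
    # and last core have an outer rank of 1.
    if d == 1:
        return n * n
    return 2 * (1 * n * n * r) + (d - 2) * (r * n * n * r)

n, d, r = 4, 10, 5
print(f"full matrix entries: {full_matrix_entries(n, d):.2e}")   # ~1.10e+12
print(f"TT entries (r = 5):  {tt_matrix_entries(n, d, r):.2e}")  # ~3.36e+03
```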
“…connected low-order tensors [16]. Combined with tensor algebra, tensor network structures can greatly decrease the computational complexity of many applications [17,18,19]. Due to their multilinear nature, multivariate B-splines easily admit a tensor network representation, which we call the Tensor Network B-splines (TNBS) model.…”
Section: Introduction (mentioning)
confidence: 99%