2018
DOI: 10.1002/nla.2135

Approximate decoupling of multivariate polynomials using weighted tensor decomposition

Abstract: Many scientific and engineering disciplines use multivariate polynomials. Decomposing a multivariate polynomial vector function into a sandwiched structure of univariate polynomials surrounded by linear transformations can provide useful insight into the function while reducing the number of parameters. Such a decoupled representation can be realized with techniques based on tensor decomposition methods, but these techniques have only been studied in the exact case. Generalizing the existing techniques to the n…
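The decoupled structure described in the abstract (linear transformation, then branch-wise univariate polynomials, then another linear transformation) can be written as f(u) = W g(Vᵀu). A minimal sketch of evaluating such a representation is below; the matrices V, W and the polynomial coefficients are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Decoupled representation f(u) = W @ g(V.T @ u), where g applies one
# univariate polynomial per internal branch. All sizes and coefficients
# here are made up for illustration.

V = np.array([[1.0, 0.5],
              [0.0, 1.0]])          # m (inputs) x r (branches)
W = np.array([[1.0, -1.0],
              [2.0,  0.5]])         # n (outputs) x r
# Univariate polynomial coefficients per branch (highest degree first).
g_coeffs = [np.array([1.0, 0.0, -1.0]),   # g1(x) = x^2 - 1
            np.array([2.0, 1.0])]         # g2(x) = 2x + 1

def f(u):
    x = V.T @ u                                                  # mix inputs
    gx = np.array([np.polyval(c, xi) for c, xi in zip(g_coeffs, x)])
    return W @ gx                                                # mix branches

print(f(np.array([1.0, 2.0])))   # -> [-6.  3.]
```

The parameter count is m·r + r·(d+1) + n·r for degree-d branches, instead of the number of coefficients of a general degree-d polynomial map from m to n variables, which is the reduction the abstract refers to.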


Cited by 4 publications (2 citation statements)
References 38 publications
“…In both cases, an interesting open question remains how to impose the structure directly on the rank-one components, without resorting to coupled tensor factorizations. Another important question is how to address the approximate decoupling problem, i.e., when we are dealing with noise (see [44] for results on the unstructured case).…”
Section: Discussion
confidence: 99%
“…This paper reflects only the authors' views and the Union is not liable for any use that may be made of the contained information; (3) KU Leuven Internal Funds C16/15/059. admit general weight tensors [10,11], it can be beneficial to use a low-rank weight tensor. First, low-rank structure can be exploited to make the decomposition less demanding, both in storage and computation, compared to using a full weight tensor [12].…”
Section: Introduction
confidence: 99%
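The quoted statement notes that a low-rank weight tensor is cheaper to store and work with than a full weight tensor. A quick back-of-the-envelope comparison of storage costs illustrates the point; the tensor dimensions and rank below are illustrative assumptions, not values from the cited works.

```python
# Storage for a full weight tensor vs. a rank-R CPD (canonical polyadic)
# representation, which keeps only R rank-one factor vectors per mode.

def full_storage(dims):
    """Number of entries in the full tensor."""
    n = 1
    for d in dims:
        n *= d
    return n

def cpd_storage(dims, rank):
    """Number of entries in the CPD factor matrices."""
    return rank * sum(dims)

dims, rank = (50, 50, 50), 5
print(full_storage(dims))        # full tensor: 125000 entries
print(cpd_storage(dims, rank))   # rank-5 CPD factors: 750 entries
```

The gap grows with the tensor order and dimensions, which is why exploiting low-rank structure in the weight tensor reduces both storage and computation.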