2012
DOI: 10.1007/s00791-014-0218-7
A note on tensor chain approximation

Cited by 24 publications (21 citation statements)
References 24 publications
“…The partial trace operator can be used to describe the tensor chain (TC) decomposition [12,26] simply by slightly modifying the suggested TT representations. Properties of the TC decomposition should be investigated further in future work.…”
Section: Discussion (mentioning)
confidence: 99%
“…Algorithm 1 computes the quantity δ := ε‖T‖_F/√d and requires a manual input of a divisor r_0 of rank_δ(T_⟨1⟩). The choice of r_0 by Zhao et al. (and also for a related algorithm based on the skeleton/cross approximation) is to minimize |r_0 − rank_δ(T_⟨1⟩)/r_0|, but examples (see Section 5.1) show that this can lead to suboptimal compression ratios.…”
Section: Conversion From Full Format To TR‐format (mentioning)
confidence: 99%
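The balanced choice of r_0 described above can be sketched in a few lines: among all divisors of the δ-rank of the first unfolding, pick the one minimizing |r_0 − rank_δ/r_0|. This is a minimal illustration, not the cited algorithm itself; the √d normalization in the threshold is an assumption inferred from the garbled excerpt.

```python
import math

def delta_threshold(frobenius_norm, eps, d):
    # Truncation threshold delta := eps * ||T||_F / sqrt(d).
    # (The sqrt(d) normalization is an assumption; the exact factor
    # depends on the error analysis in the cited paper.)
    return eps * frobenius_norm / math.sqrt(d)

def balanced_r0(rank_delta):
    # Choose a divisor r0 of rank_delta minimizing |r0 - rank_delta / r0|,
    # i.e. the most balanced factorization rank_delta = r0 * (rank_delta // r0).
    divisors = [r for r in range(1, rank_delta + 1) if rank_delta % r == 0]
    return min(divisors, key=lambda r: abs(r - rank_delta // r))

print(balanced_r0(12))  # divisors 1,2,3,4,6,12 -> 3 (|3 - 4| = 1 is minimal)
```

The excerpt's point is that this balanced choice, while natural, can still yield a suboptimal compression ratio compared to other splittings of the δ-rank.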
“…We convert a tensor given in full format into a TR‐representation and compute its storage cost. We compare the TT‐representation with r_0 = 1, Algorithm 1 using a balanced representation with r_0 = argmin |r_0 − rank_δ(T_⟨1⟩)/r_0|, and Algorithm 2, to Algorithm 3. We do not compare to other algorithms for TR‐decompositions, since these have already been compared to the TR‐SVD algorithm in the literature.…”
Section: Computational Experiments (mentioning)
confidence: 99%
“…The canonical polyadic decomposition (CPD) [2,15,16] and the Tucker decomposition [2,27] both generalize the notion of the matrix singular value decomposition (SVD) to higher-order tensors and have, therefore, received a lot of attention. More recent tensor decompositions are the tensor train (TT) [8,9,18,21] and the hierarchical Tucker decomposition [12,13]. It turns out that the latter two decompositions were already known in the quantum mechanics and condensed matter physics communities as the matrix product state (MPS) [23] and the tensor tree network (TTN) [25], respectively.…”
Section: Introduction (mentioning)
confidence: 99%