2021
DOI: 10.48550/arxiv.2110.04393
Preprint

Randomized algorithms for rounding in the Tensor-Train format

Abstract: The Tensor-Train (TT) format is a highly compact low-rank representation for high-dimensional tensors. TT is particularly useful when representing approximations to the solutions of certain types of parametrized partial differential equations. For many of these problems, computing the solution explicitly would require an infeasible amount of memory and computational time. While the TT format makes these problems tractable, iterative techniques for solving the PDEs must be adapted to perform arithmetic while maintaining the implicit structure. […]
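
For intuition, here is a minimal NumPy sketch of the idea behind randomized TT rounding: store the tensor as a chain of 3-way cores, right-orthogonalize the chain, then truncate each internal rank with a Gaussian-sketch range finder in place of a truncated SVD. This is an illustrative sketch under our own naming and conventions, not the paper's exact algorithm (the paper proposes and analyzes several randomized variants).

```python
import numpy as np

def tt_round_randomized(cores, target_rank, oversample=5, seed=0):
    # cores[k] has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1.
    # Right-to-left sweep makes the trailing cores right-orthogonal;
    # the left-to-right sweep then truncates each internal rank using
    # a Gaussian-sketch range finder instead of a truncated SVD.
    rng = np.random.default_rng(seed)
    d = len(cores)
    cores = [c.copy() for c in cores]

    # Right-to-left orthogonalization (QR on each core's transposed unfolding).
    for k in range(d - 1, 0, -1):
        r0, n, r1 = cores[k].shape
        Q, R = np.linalg.qr(cores[k].reshape(r0, n * r1).T)
        cores[k] = Q.T.reshape(-1, n, r1)
        # Absorb the triangular factor into the neighboring core.
        cores[k - 1] = np.einsum('abc,cd->abd', cores[k - 1], R.T)

    # Left-to-right truncation via randomized range finding.
    for k in range(d - 1):
        r0, n, r1 = cores[k].shape
        mat = cores[k].reshape(r0 * n, r1)
        G = rng.standard_normal((r1, min(target_rank + oversample, r1)))
        Q, _ = np.linalg.qr(mat @ G)   # orthonormal basis for range(mat @ G)
        Q = Q[:, :target_rank]         # keep at most the target rank
        cores[k] = Q.reshape(r0, n, -1)
        # Carry the remainder into the next core.
        cores[k + 1] = np.einsum('ab,bcd->acd', Q.T @ mat, cores[k + 1])
    return cores
```

Rounding a sum of TT tensors, where the internal ranks of the summands add up, is the regime in which the abstract reports the largest speedups for the randomized variants.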

Cited by 2 publications (16 citation statements); citing works published in 2022. Citation statements: 0 supporting, 16 mentioning, 0 contrasting. References 49 publications.
“…Compared to existing sketching-based ALS algorithms, this algorithm yields better asymptotic computational cost under several regimes, such as when the CP rank is much lower than each dimension size of the input tensor. We also provide an analysis of the recently introduced randomized tensor train rounding algorithm [10]. We show that the tensor train embedding used in that algorithm satisfies the accuracy sufficient condition in Section 3 and yields the optimal asymptotic sketching cost, implying that this is an efficient algorithm and that embeddings with other structures cannot achieve lower asymptotic cost.…”
Section: Introduction (mentioning)
confidence: 95%
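
To see why a TT-structured embedding sketches cheaply, the following illustrative NumPy snippet contracts a TT tensor against a random Gaussian TT test tensor one core at a time, producing one small sketch matrix per internal rank; each core is touched once, so the cost is linear in the number of cores rather than in the full tensor size. The function name, the Gaussian cores, and the returned interface sketches are our assumptions for illustration, not the cited analysis's exact construction.

```python
import numpy as np

def tt_partial_sketches(cores, sketch_ranks, seed=0):
    # cores[k] has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1.
    # Draw a Gaussian TT test tensor with internal ranks sketch_ranks
    # and accumulate right-to-left partial contractions W[k] of shape
    # (r_k, l_k), touching each core exactly once (cost linear in d).
    rng = np.random.default_rng(seed)
    d = len(cores)
    l = [1] + list(sketch_ranks) + [1]
    R = [rng.standard_normal((l[k], cores[k].shape[1], l[k + 1]))
         for k in range(d)]
    W = [None] * (d + 1)
    W[d] = np.ones((1, 1))
    for k in range(d - 1, -1, -1):
        # W[k][a,b] = sum_{n,A,B} cores[k][a,n,A] * R[k][b,n,B] * W[k+1][A,B]
        W[k] = np.einsum('anA,bnB,AB->ab', cores[k], R[k], W[k + 1])
    return W[1:d]  # the d-1 sketches of the internal interfaces

# Example: a random TT tensor with 4 cores of mode size 10 and rank 3.
rng = np.random.default_rng(1)
r = [1, 3, 3, 3, 1]
cores = [rng.standard_normal((r[k], 10, r[k + 1])) for k in range(4)]
Ws = tt_partial_sketches(cores, sketch_ranks=(5, 5, 5))
print([W.shape for W in Ws])  # [(3, 5), (3, 5), (3, 5)]
```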
“…While we allow the data tensor network to be a hypergraph, we consider only graph embeddings (defined in detail in Section 2), which include the tree embeddings that have been studied previously [1,10,4]. Each of these embeddings, consisting of N_E tensors, can be reduced to a sequence of N_E sketches (random sketching matrices).…”
Section: Introduction (mentioning)
confidence: 99%
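
One concrete way to read "reduced to a sequence of N_E sketches" (our illustrative assumption, not the cited paper's construction): a tree-structured embedding applied to a dense tensor can be realized as one independent Gaussian sketching matrix per mode, applied in sequence, as in this small NumPy example.

```python
import numpy as np

def modewise_sketch(tensor, sketch_sizes, seed=0):
    # Apply one independent Gaussian sketch per mode: the embedding
    # factors into a sequence of N_E sketching matrices.
    rng = np.random.default_rng(seed)
    out = tensor
    for mode, (n, m) in enumerate(zip(tensor.shape, sketch_sizes)):
        S = rng.standard_normal((m, n)) / np.sqrt(m)    # m x n Gaussian sketch
        out = np.tensordot(S, out, axes=([1], [mode]))  # contract this mode
        out = np.moveaxis(out, 0, mode)                 # restore mode order
    return out

# Example: sketch a 20x20x20 tensor down to 4x4x4.
X = np.random.default_rng(1).standard_normal((20, 20, 20))
Y = modewise_sketch(X, (4, 4, 4))
print(Y.shape)  # (4, 4, 4)
```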