2011
DOI: 10.1137/090764827

Best Low Multilinear Rank Approximation of Higher-Order Tensors, Based on the Riemannian Trust-Region Scheme

Cited by 112 publications (88 citation statements)
References 34 publications
“…Algorithms for solving problem (2.2) have been proposed in [35,36,37,38,31]. In the higher-order power method of [36], each iteration rotates mass to the target subtensor by using the singular value decomposition (SVD) of the columns of one of the three matrix unfoldings of the rotated tensor corresponding to the subtensor.…”
Section: Problem (2.2) Is Equivalent To Maximizing (S…
confidence: 99%
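
The procedure this excerpt describes, refreshing one mode at a time from the SVD of a matrix unfolding, is the pattern behind the higher-order orthogonal iteration (HOOI) family to which the higher-order power method belongs. The NumPy sketch below is a generic reconstruction of that pattern, not the exact algorithm of [36]; the names `hooi` and `unfold`, the fixed third-order setting, and the iteration cap are illustrative assumptions.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, n_iter=50):
    """Higher-order orthogonal iteration for a rank-(r1, r2, r3)
    Tucker approximation of a third-order tensor T."""
    # Initialize each factor from the leading left singular vectors (HOSVD).
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(3):
            # Project T onto the current subspaces of the other two modes ...
            G = T
            for m in range(3):
                if m != n:
                    G = np.moveaxis(np.tensordot(U[m].T, G, axes=(1, m)), 0, m)
            # ... and refresh the mode-n factor from the SVD of the unfolding.
            U[n] = np.linalg.svd(unfold(G, n),
                                 full_matrices=False)[0][:, :ranks[n]]
    # Core tensor: T contracted with all three factor transposes.
    core = T
    for m in range(3):
        core = np.moveaxis(np.tensordot(U[m].T, core, axes=(1, m)), 0, m)
    return core, U
```

Each inner step keeps two subspaces fixed and maximizes the captured mass along the remaining mode, which is exactly the "rotate mass to the target subtensor via an SVD of an unfolding" step the quote refers to.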
“…However, the theoretical justification of such methods, even for relatively simple low-rank matrix cases, is still an open problem. Recently, in order to achieve better global convergence, an alternative version of the Riemannian Newton method, called the trust-region scheme, was developed by Absil et al. [2007], Boumal and Absil [2011], and Ishteva et al. [2011].…”
Section: Riemannian Newton Methods
confidence: 99%
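
At each outer iteration, a Riemannian trust-region scheme minimizes a quadratic model of the cost on the tangent space, subject to a trust-region radius, and the standard inner solver is truncated conjugate gradients. The sketch below is a plain-Euclidean Steihaug-Toint truncated CG, a simplification of the Riemannian case in which the inner products and the Hessian would act on the tangent space; `truncated_cg` and `_to_boundary` are illustrative names, not the cited papers' code.

```python
import numpy as np

def truncated_cg(grad, hess_vec, delta, tol=1e-8, max_iter=100):
    """Steihaug-Toint truncated CG for the trust-region subproblem
        min_s  <grad, s> + 0.5 <s, H s>   s.t.  ||s|| <= delta,
    with vectors as 1-D NumPy arrays and `hess_vec` applying the
    (model) Hessian to a vector."""
    s = np.zeros_like(grad)
    r = grad.copy()          # gradient of the model at the current s
    p = -r                   # CG search direction
    r2 = r @ r
    for _ in range(max_iter):
        Hp = hess_vec(p)
        pHp = p @ Hp
        if pHp <= 0:
            # Negative curvature: follow p to the trust-region boundary.
            return s + _to_boundary(s, p, delta) * p
        alpha = r2 / pHp
        if np.linalg.norm(s + alpha * p) >= delta:
            # Full step would leave the region: stop on the boundary.
            return s + _to_boundary(s, p, delta) * p
        s = s + alpha * p
        r = r + alpha * Hp
        r2_new = r @ r
        if np.sqrt(r2_new) < tol:
            return s
        p = -r + (r2_new / r2) * p
        r2 = r2_new
    return s

def _to_boundary(s, p, delta):
    """Positive tau with ||s + tau * p|| = delta."""
    a, b, c = p @ p, 2 * (s @ p), s @ s - delta**2
    return (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
```

In the Riemannian setting, the returned step is mapped back to the manifold by a retraction, after which the trust-region radius is grown or shrunk depending on how well the model predicted the actual cost decrease.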
“…Moreover, from a Riemannian geometry point of view, constrained optimization problems can often be viewed as unconstrained ones. It is therefore natural to ask whether Riemannian optimization can also help open up new research directions in conjunction with tensor networks; see for example [Ishteva et al., 2011, Kasai and Mishra, 2015, Sato et al., 2017, Steinlechner, 2016a]. Riemannian optimization for tensors can be formulated in the following generalized form…”
Section: Riemannian Optimization For Low-Rank Tensor Manifolds
confidence: 99%
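
The excerpt is cut off before the formula it announces. The generalized form it refers to is, in standard notation, an unconstrained problem over a smooth manifold of fixed-rank tensors; the following is a reconstruction from the common Riemannian-optimization literature, not the source's own equation:

```latex
\min_{\mathcal{X} \in \mathcal{M}} f(\mathcal{X}),
\qquad
\mathcal{M} = \bigl\{ \mathcal{X} \in \mathbb{R}^{n_1 \times \cdots \times n_d} :
\operatorname{rank}_{\mathrm{ML}}(\mathcal{X}) = (r_1, \ldots, r_d) \bigr\},
```

where $f$ is a smooth cost function, typically the squared distance to a given data tensor, and $\operatorname{rank}_{\mathrm{ML}}$ denotes the multilinear rank.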
“…First, using the rankest function of Tensorlab, upper bounds of R = 4 and R = 9 were obtained for HH tensors and W tensors, respectively. Afterwards, these values were refined as follows: CPD and the low multilinear rank approximation (LMLRA) [12] were performed with increasing R, starting from R = 1, until the desired approximation was achieved. The desired approximation was defined in terms of the relative error (relerr), which was chosen here as 0.2.…”
Section: Tensor Decomposition
confidence: 99%
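
Tensorlab is a MATLAB toolbox, so the refinement loop from this excerpt is sketched here in Python with TensorLy as a stand-in. The function name `smallest_adequate_rank`, the `max_rank` cap, and the use of equal mode ranks (R, R, R) for the LMLRA step are assumptions; the stopping threshold of 0.2 comes from the excerpt.

```python
import tensorly as tl
from tensorly.decomposition import parafac, tucker

def smallest_adequate_rank(T, max_rank=20, relerr_target=0.2):
    """Sweep R upward from 1 until both the CPD and the Tucker
    (LMLRA-style) approximations reach the target relative error.
    Returns None if no rank up to max_rank suffices."""
    nrm = tl.norm(T)
    for R in range(1, max_rank + 1):
        # CPD with R rank-one terms.
        cp = parafac(T, rank=R)
        cp_err = tl.norm(T - tl.cp_to_tensor(cp)) / nrm
        # Low multilinear rank approximation: Tucker with an (R, ..., R) core.
        tk = tucker(T, rank=[R] * T.ndim)
        tk_err = tl.norm(T - tl.tucker_to_tensor(tk)) / nrm
        if max(cp_err, tk_err) <= relerr_target:
            return R, cp_err, tk_err
    return None
```

Stopping at the first R whose relative error falls below the threshold mirrors the described refinement of the upper bound produced by rankest.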