2015
DOI: 10.1016/j.laa.2015.04.015
Optimization on the Hierarchical Tucker manifold – Applications to tensor completion

Abstract: In this work, we develop an optimization framework for problems whose solutions are well-approximated by Hierarchical Tucker (HT) tensors, an efficient structured tensor format based on recursive subspace factorizations. By exploiting the smooth manifold structure of these tensors, we construct standard optimization algorithms such as Steepest Descent and Conjugate Gradient for completing tensors from missing entries. Our algorithmic framework is fast and scalable to large problem sizes as we do not requir…
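As a rough illustration of the Riemannian steepest-descent idea sketched in the abstract, the snippet below works on the much simpler manifold of fixed-rank matrices rather than HT tensors; the function names, spectral initialization, and fixed step size are assumptions for illustration and do not reproduce the paper's HT implementation.

```python
# Minimal sketch: Riemannian steepest descent on the manifold of rank-r
# matrices for completion from a 0/1 sampling mask. The paper's framework
# generalizes this idea to Hierarchical Tucker tensors.
import numpy as np

def truncated_svd(A, r):
    """Rank-r truncated SVD, used here as a retraction back onto the manifold."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r], s[:r], Vt[:r, :]

def complete_fixed_rank(M_obs, mask, r, steps=200, lr=1.0):
    """Recover a rank-r matrix from the entries of M_obs where mask == 1."""
    U, s, Vt = truncated_svd(mask * M_obs, r)        # simple spectral start (assumption)
    X = U @ np.diag(s) @ Vt
    for _ in range(steps):
        # Euclidean gradient of 0.5 * ||mask * (X - M_obs)||_F^2
        G = mask * (X - M_obs)
        # Project onto the tangent space at X = U S V^T
        PU = U @ (U.T @ G)
        PV = (G @ Vt.T) @ Vt
        xi = PU + PV - U @ ((U.T @ G) @ Vt.T) @ Vt
        # Step along the negative Riemannian gradient and retract by truncated SVD
        U, s, Vt = truncated_svd(X - lr * xi, r)
        X = U @ np.diag(s) @ Vt
    return X
```

A fixed step size is used only for brevity; in practice a line search along the manifold, as in standard Riemannian Steepest Descent or Conjugate Gradient schemes, would replace it.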

Cited by 79 publications (94 citation statements)
References 41 publications
“…For example, [Uschmajew and Vandereycken, 2013] developed the manifold structure for HT tensors, while Lubich et al. [2013] developed the concept of dynamical low-rank approximation for both the HT and TT formats. Moreover, Riemannian optimization in the Tucker and TT/HT formats has been successfully applied to large-scale tensor completion problems [Da Silva and Herrmann, 2013, Kasai and Mishra, 2015].…”
Section: Riemannian Optimization For Low-Rank Tensor Manifolds
confidence: 99%
“…Examples are methods based on the alternating least squares approach and the density matrix renormalization group [31,51], as well as Riemannian optimization on fixed-rank tensor manifolds [13,36]. For further details and references concerning such methods, see also [28, §10] and [25].…”
Section: Relation To Previous Work
confidence: 99%
“…Using this, several algorithms have been considered in the literature [12], [13], [14] for numerically efficient inference using optimization on smooth manifolds. Similarly, under the t-SVD and using the notion of tubal-rank as defined below, we see that 3-D tensors of fixed tubal-rank also form a smooth manifold.…”
Section: Manifold Structure Of Fixed Rank Tensors
confidence: 99%
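To make the tubal-rank notion referenced in this excerpt concrete, here is a short sketch (not taken from the cited works) that computes the tubal rank of a third-order tensor via slice-wise SVDs in the Fourier domain, which is the usual way the t-SVD is defined; the tolerance and function name are assumptions.

```python
# Hedged sketch of the tubal rank under the t-product / t-SVD:
# FFT along the third (tube) dimension, SVD of each frontal slice,
# and count the singular tubes that are nonzero in at least one slice.
import numpy as np

def tubal_rank(X, tol=1e-10):
    """Tubal rank of a tensor X of shape (n1, n2, n3)."""
    Xf = np.fft.fft(X, axis=2)                      # transform each tube
    n3 = X.shape[2]
    s = np.stack([np.linalg.svd(Xf[:, :, k], compute_uv=False)
                  for k in range(n3)], axis=1)      # singular values per slice
    return int(np.sum(s.max(axis=1) > tol))
```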
“…For example, such scenarios typically arise in seismics, where at a given location one records a temporal signal of a certain length [14], [10]. In this case one is given observations X(i, j, :) at indices (i, j) ∈ Ω, i ∈ {1, 2, ..., n_1}, j ∈ {1, 2, ..., n_2}, where Ω denotes the sampling set.…”
Section: A Tubal-Sampling
confidence: 99%
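As a small illustration of the tubal-sampling pattern described in this excerpt, the snippet below builds a mask that exposes entire tubes X(i, j, :) at a random subset Ω of (i, j) pairs; the tensor sizes, sampling fraction, and random seed are arbitrary assumptions.

```python
# Tubal sampling: whole tubes X(i, j, :) are observed for (i, j) in Omega,
# mimicking traces recorded over time at a subset of spatial locations.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3 = 20, 30, 50
X = rng.standard_normal((n1, n2, n3))      # full data cube (synthetic)

p = 0.3                                    # fraction of observed tubes (assumption)
omega = rng.random((n1, n2)) < p           # sampling set Omega over (i, j)
mask = np.repeat(omega[:, :, None], n3, axis=2)

X_obs = mask * X                           # observed tubes; zeros elsewhere
```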