We consider the problem of fitting a low rank tensor A ∈ R^I, I = {1, …, n}^d, to a given set of data points {M_i ∈ R | i ∈ P}, P ⊂ I. The low rank format under consideration is the hierarchical, tensor train (TT), or matrix product states (MPS) format. It is characterized by rank bounds r on certain matricizations of the tensor, and the number of degrees of freedom is in O(r^2 dn). For fixed rank and mode size n we observe that it is possible to reconstruct random (but rank structured) tensors, as well as certain discretized multivariate (but rank structured) functions, from a number of samples that is in O(log N) for a tensor having N = n^d entries. We compare an alternating least squares fit (ALS) to an overrelaxation scheme (ADF) inspired by the LMaFit method for matrix completion. Both approaches aim at finding a tensor A that fulfils the first order optimality conditions by a nonlinear Gauss-Seidel type solver consisting of an alternating fit that cycles through the directions µ = 1, …, d. The least squares fit has complexity O(r^4 d #P) per step, whereas each step of ADF is in O(r^2 d #P), albeit with a slightly higher number of necessary steps. In the numerical experiments we observe robustness of the completion algorithm with respect to noise and good reconstruction capability. Our tests provide evidence that the algorithm is suitable in higher dimensions (d > 10) as well as for moderate ranks.
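As a rough illustration of one microstep of the alternating least squares fit described above, the following sketch (our own minimal example, not the authors' implementation) refits a single TT core by least squares over the observed entries while all other cores are held fixed. It assumes the TT cores are stored as numpy arrays of shape (r_{µ-1}, n, r_µ) with r_0 = r_d = 1; the names tt_entry and als_microstep are illustrative only.

```python
import numpy as np

def tt_entry(cores, idx):
    """Evaluate a single entry A[idx] of the TT tensor given by `cores`."""
    v = cores[0][:, idx[0], :]                    # shape (1, r_1)
    for mu in range(1, len(cores)):
        v = v @ cores[mu][:, idx[mu], :]          # contract with the next core slice
    return v[0, 0]

def als_microstep(cores, mu, samples, values):
    """Refit core `mu` by least squares on the sample set P, other cores fixed.

    samples: list of index tuples i in P; values: the observed entries M_i.
    Since a sample with i_mu = k only touches slice k of the core, the
    least squares problem decouples slice by slice.
    """
    d = len(cores)
    r_left, n, r_right = cores[mu].shape
    new_core = cores[mu].copy()
    for k in range(n):
        rows, rhs = [], []
        for idx, val in zip(samples, values):
            if idx[mu] != k:
                continue
            L = np.ones((1, 1))                   # left interface: cores 1..mu-1
            for nu in range(mu):
                L = L @ cores[nu][:, idx[nu], :]
            R = np.ones((1, 1))                   # right interface: cores mu+1..d
            for nu in range(d - 1, mu, -1):
                R = cores[nu][:, idx[nu], :] @ R
            # the entry L @ X @ R is linear in the unknown core slice X
            rows.append(np.kron(L.ravel(), R.ravel()))
            rhs.append(val)
        if rows:
            sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
            new_core[:, k, :] = sol.reshape(r_left, r_right)
    cores[mu] = new_core
    return cores
```

Cycling such microsteps through the directions µ = 1, …, d yields one sweep of the nonlinear Gauss-Seidel type solver mentioned above.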
Low rank tensor completion is a highly ill-posed inverse problem, particularly when the data model is not accurate, and some sort of regularization is required in order to solve it. In this article we focus on the calibration of the data model. For alternating optimization, we observe that existing rank adaption methods do not enable a continuous transition between manifolds of different ranks. We denote this characteristic as instability (under truncation). As a consequence of this property, arbitrarily small changes in the iterate can have an arbitrarily large influence on the further reconstruction. We therefore introduce a singular value based regularization of the standard alternating least squares (ALS) method, motivated by averaging in microsteps. We prove its stability and derive a natural semi-implicit rank adaption strategy. We further prove that the standard ALS microsteps for completion problems are stable only on manifolds of fixed rank, and only around points that have what we define as the internal tensor restricted isometry property (iTRIP). Finally, numerical experiments show improvements of the reconstruction quality by up to orders of magnitude for the new Stable ALS Approximation (SALSA) compared to standard ALS and the well-known Riemannian optimization method RTTC.
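As a generic illustration of the idea behind a singular value based regularization (this is not the SALSA update itself, whose filter, averaging, and rank adaption are derived in the article), the following sketch applies a standard Tikhonov-type filter to a linear least squares problem, so that directions belonging to small singular values are damped rather than inverted exactly.

```python
import numpy as np

def filtered_lstsq(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam^2 ||x||^2 via the SVD of A.

    The naive inverse 1/sigma is replaced by sigma / (sigma^2 + lam^2),
    i.e. each direction is scaled by the filter factor sigma^2 / (sigma^2 + lam^2).
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ ((s / (s**2 + lam**2)) * (U.T @ b))

# Usage on an ill-conditioned design matrix with noisy observations:
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 10)) @ np.diag(np.logspace(0, -6, 10))
x_true = rng.standard_normal(10)
b = A @ x_true + 1e-3 * rng.standard_normal(100)
x_plain = np.linalg.lstsq(A, b, rcond=None)[0]
x_reg = filtered_lstsq(A, b, lam=1e-3)
print(np.linalg.norm(x_plain - x_true), np.linalg.norm(x_reg - x_true))
```

Unlike the exact inversion, the filtered solution depends continuously on the singular values of the local system, which loosely illustrates why damping small singular values can prevent arbitrarily small changes from having an arbitrarily large effect.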
Tree tensor networks such as the tensor train format are a common tool for high dimensional problems. The associated multivariate rank and the corresponding tuples of singular values are based on different matricizations of the same tensor. While the behavior of these singular values is as essential as in the matrix case, the question of the feasibility of specific constellations arises here: which prescribed tuples can be realized as singular values of a tensor, and what is this feasible set? We first show the equivalence of the tensor feasibility problem (TFP) to the quantum marginal problem (QMP). In higher dimensions, in the case of the tensor train (TT) format, the conditions for feasibility can be decoupled. By existing results for the three-dimensional QMP, it then follows that the tuples of squared, feasible TT-singular values form polyhedral cones. We further establish a connection to eigenvalue relations for sums of Hermitian matrices, which in turn are described by sets of interlinked, so-called honeycombs, as introduced by Knutson and Tao. Besides a large class of universal, necessary inequalities and the vertex description of a special, simpler instance, we present a linear programming algorithm to check feasibility and a simple, heuristic algorithm to construct, in parallel, representations of tensors with prescribed, feasible TT-singular values.
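For concreteness, the tuples in question are the singular values of the successive matricizations of one and the same tensor. The short sketch below (our own illustration, not the linear programming or construction algorithm from the paper) computes these TT-singular-value tuples for a given full tensor.

```python
import numpy as np

def tt_singular_values(A):
    """Singular values of the unfoldings A^(1..mu), mu = 1, ..., d-1,
    i.e. of reshape(A, (n_1*...*n_mu, n_{mu+1}*...*n_d))."""
    dims = A.shape
    tuples = []
    for mu in range(1, len(dims)):
        unfolding = A.reshape(int(np.prod(dims[:mu])), -1)
        tuples.append(np.linalg.svd(unfolding, compute_uv=False))
    return tuples

# Example: the d-1 = 3 singular value tuples of a random 2 x 3 x 4 x 2 tensor.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3, 4, 2))
for mu, s in enumerate(tt_singular_values(A), start=1):
    print(f"matricization after mode {mu}:", np.round(s, 3))
```

The feasibility question asks which such tuples can be prescribed in advance, i.e. which tuples arise in this way from some tensor.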