2019
DOI: 10.1007/s00211-019-01072-4

Stable ALS approximation in the TT-format for rank-adaptive tensor completion

Abstract: Low-rank tensor completion is a highly ill-posed inverse problem, particularly when the data model is not accurate, and some sort of regularization is required in order to solve it. In this article we focus on the calibration of the data model. For alternating optimization, we observe that existing rank adaptation methods do not enable a continuous transition between manifolds of different ranks. We denote this characteristic as instability (under truncation). As a consequence of this property, arbitrarily small…
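To make the setting concrete, here is a minimal sketch of a plain ALS sweep for tensor completion in the TT format. This illustrates the baseline method the paper builds on, not the stabilized SALSA scheme itself; the function names, the Tikhonov parameter `reg`, and the dense per-sample assembly are our own illustrative choices.

```python
import numpy as np

def tt_eval(cores, idx):
    """Evaluate a TT tensor at one multi-index."""
    v = np.ones((1, 1))
    for G, i in zip(cores, idx):
        v = v @ G[:, i, :]
    return v[0, 0]

def als_sweep(cores, indices, values, reg=1e-8):
    """One left-to-right ALS sweep for tensor completion in TT format.

    cores   : list of d arrays, cores[k] of shape (r_{k-1}, n_k, r_k),
              with r_0 = r_d = 1
    indices : (m, d) integer array of observed multi-indices
    values  : (m,) array of the observed entries
    reg     : small Tikhonov term keeping the local systems well posed
    """
    d = len(cores)
    for k in range(d):
        rl, n, rr = cores[k].shape
        for j in range(n):
            mask = indices[:, k] == j
            if not mask.any():
                continue
            rows = []
            for idx in indices[mask]:
                left = np.ones((1, 1))           # product of cores 0 .. k-1
                for p in range(k):
                    left = left @ cores[p][:, idx[p], :]
                right = np.ones((1, 1))          # product of cores k+1 .. d-1
                for p in range(d - 1, k, -1):
                    right = cores[p][:, idx[p], :] @ right
                # entry is linear in core k: coefficients L[a] * R[b]
                rows.append(np.kron(left.ravel(), right.ravel()))
            A = np.asarray(rows)                 # shape (m_j, rl * rr)
            b = values[mask]
            sol = np.linalg.solve(A.T @ A + reg * np.eye(rl * rr), A.T @ b)
            cores[k][:, j, :] = sol.reshape(rl, rr)
    return cores
```

In practice one keeps the inactive cores orthogonalized and caches the interface products instead of recomputing them per sample; the paper's observation is that combining such sweeps with naive rank truncation is unstable, which is what motivates the stabilized scheme (SALSA).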

Cited by 27 publications (27 citation statements). References 42 publications.

Citation statements:
“…When |v|_y = |v(y)| is used, u_{M,n} is known as the nonlinear least-squares estimator of u. The extensive interest in machine learning in recent years has led to the investigation of this estimator for special model classes like sparse vectors [3, 7, 8], low-rank tensors [5, 9–12] and neural networks [13, 14]. However, to the knowledge of the authors, no investigation for general model classes has been published so far.…”
Section: Convergence Bounds for Empirical Nonlinear Least-Squares (mentioning)
confidence: 99%
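For context, the estimator this excerpt refers to is the standard empirical (nonlinear) least-squares estimator. In the usual notation (our reconstruction from context, with M the model class, y_1, …, y_n the samples, and u the sought function):

```latex
u_{M,n} \in \operatorname*{arg\,min}_{v \in M} \; \frac{1}{n} \sum_{i=1}^{n} \bigl| u(y_i) - v(y_i) \bigr|^2 .
```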
“…Although not directly suitable for recovery tasks, it became apparent that DMRG and ALS can be adapted to work in this context. Two of these extensions to the ALS algorithm are the stabilized ALS approximation (SALSA) [21] and the block alternating steepest descent for recovery (bASD) algorithm [13]. Both adapt the tensor network ranks and are better suited to the problem of data identification.…”
Section: Introduction (mentioning)
confidence: 99%
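Rank adaptivity as in SALSA or bASD can be realized in several ways; one common mechanism is to enlarge a single TT bond by padding the two adjacent cores with small entries, which leaves the represented tensor essentially unchanged while giving the next sweep new directions to explore. The sketch below shows only this generic padding step (the function name and the parameters `delta` and `eps` are hypothetical), not the specific update rules of SALSA [21] or bASD [13].

```python
import numpy as np

def increase_tt_rank(core_left, core_right, delta=1, eps=1e-6, rng=None):
    """Grow the bond dimension between two neighboring TT cores by `delta`.

    Pads the shared bond with small random slices; the padded parts
    contribute only O(eps^2) to the reconstructed tensor, so the
    represented tensor is almost unchanged while ALS gains new
    degrees of freedom.
    """
    rng = np.random.default_rng() if rng is None else rng
    rl, n1, r = core_left.shape
    r2, n2, rr = core_right.shape
    assert r == r2, "cores must share the bond dimension"
    pad_l = eps * rng.standard_normal((rl, n1, delta))
    pad_r = eps * rng.standard_normal((delta, n2, rr))
    return (np.concatenate([core_left, pad_l], axis=2),
            np.concatenate([core_right, pad_r], axis=0))
```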
“…It is possible to include rank adaptivity as in SALSA [21] or bASD [13] and we have noted this in the relevant places. [Footnote 2: The orthogonality comes from the symmetry of L, which results in orthogonal eigenspaces.]…”
(mentioning)
confidence: 99%
“…Tensor completion has been widely studied in the literature. Various decomposition-based approaches [29–34] are available, and further methods can be found in other works.…”
Section: Introduction (mentioning)
confidence: 99%