Optimization landscape of Tucker decomposition (2020)
DOI: 10.1007/s10107-020-01531-z

Cited by 8 publications (6 citation statements); references 18 publications.
“…Low-rank tensor estimation with Tucker decomposition. [FG20] analyzed the landscape of Tucker decomposition for tensor factorization and showed benign landscape properties under suitable regularization. [GRY11, MHWG14] developed convex relaxation algorithms based on minimizing the nuclear norms of unfolded tensors for tensor regression, and similar approaches were developed in [HMGW15] for robust tensor completion.…”
Section: Additional Related Work
Confidence: 99%
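The statements above concern factoring a tensor into a small core and per-mode factor matrices. As a concrete point of reference (not the regularized landscape analysis of [FG20] itself), the following is a minimal numpy sketch of a Tucker factorization via truncated higher-order SVD; the function names `hosvd` and `reconstruct` are illustrative, not from any cited work.

```python
import numpy as np

def hosvd(T, ranks):
    """Truncated higher-order SVD: a simple (non-optimal) Tucker factorization.

    Returns a core tensor of shape `ranks` and one orthonormal factor per mode.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Mode-n unfolding: bring `mode` to the front, flatten the other axes.
        unfolded = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(U[:, :r])  # leading r left singular vectors
    # Core tensor: contract T with U^T along each mode.
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(core, U, axes=(mode, 0)), -1, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core by each factor matrix along its mode."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(T, U, axes=(mode, 1)), -1, mode)
    return T
```

If the input tensor has an exact Tucker decomposition of the given multilinear rank (the setting in which [FG20] and Frandsen & Ge prove benign landscape results), this truncated HOSVD recovers it exactly; for general tensors it is only a quasi-optimal approximation.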
“…Song et al (2019) gave polynomial-time (1 + ε)-approximation algorithms for many types of low-rank tensor decompositions with respect to the Frobenius norm, including CP and Tucker decompositions. Frandsen & Ge (2022) showed that if a third-order tensor has an exact Tucker decomposition, then all local minima of an appropriately regularized loss landscape are globally optimal. Several works recently studied Tucker decomposition in streaming models (Traoré et al, 2019; Sun et al, 2020) and a sliding window model (Jang & Kang, 2021).…”
Section: Related Work
Confidence: 99%
“…It is also important to keep in mind that, since the optimization problem is nonconvex, we cannot guarantee that a local optimum is a global one; see Reference 32.…”
Section: Tensor Concepts and Preliminaries
Confidence: 99%