2019
DOI: 10.1016/j.cam.2019.03.002

Greedy low-rank approximation in Tucker format of solutions of tensor linear systems

Cited by 15 publications (10 citation statements)
References 37 publications
“…Different from these works, our proposed ISLET is a one-step procedure that only involves solving a simple least squares regression after performing dimension reduction on covariates by importance sketching (see Steps 1 and 2 in Section 2.2). Moreover, many prior works mainly focused on computational aspects of their proposed methods [7,13,42,48,51], while we show that ISLET is not only computationally efficient (see more discussion and comparison on computation complexity in Section 2.2 Computation and Implementation part) but also has optimal theoretical guarantees in terms of mean square error under the statistical setting.…”
Section: Related Literature (mentioning, confidence: 82%)
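The ISLET pipeline quoted above has two stages: dimension-reduce the covariates by sketching, then solve a simple least squares problem on the compressed data. The source gives no further detail on how the importance sketches are built, so the following is only a loose illustration of the generic sketch-then-solve pattern, using a plain Gaussian sketch in place of ISLET's data-driven importance sketching; the sketch size `k` and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdetermined regression: n samples, p covariates (p < k << n).
n, p, k = 2000, 50, 200
A = rng.standard_normal((n, p))      # covariate matrix
x_true = rng.standard_normal(p)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Step 1: dimension reduction -- compress the n rows down to k with a
# Gaussian sketch (a stand-in for the data-driven importance sketches).
S = rng.standard_normal((k, n)) / np.sqrt(k)

# Step 2: solve the small k-by-p least squares problem.
x_hat, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

The point of the pattern is that the expensive n-row problem is touched only once, in the sketching pass; the regression itself runs on a k-by-p matrix.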
“…[7,10] proposed iterative projection methods to solve large-scale linear systems with Kronecker-product-type design matrices. [48] introduced a greedy approach. [69,70] considered Riemannian optimization methods and tensor Krylov subspace methods, respectively.…”
Section: Related Literature (mentioning, confidence: 99%)
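The Kronecker-product structure mentioned in this excerpt is what makes such large systems tractable: with column-major vec, (A1 ⊗ A2) vec(X) = vec(A2 X A1ᵀ), so a system whose matrix is the n1n2 × n1n2 Kronecker product reduces to two small solves and the big matrix is never formed. A minimal sketch of that identity (matrix sizes and names are illustrative, not taken from the cited works):

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2 = 6, 5
A1 = rng.standard_normal((n1, n1)) + n1 * np.eye(n1)  # well-conditioned factors
A2 = rng.standard_normal((n2, n2)) + n2 * np.eye(n2)
B = rng.standard_normal((n2, n1))

# (A1 kron A2) vec(X) = vec(B)  <=>  A2 X A1^T = B   (column-major vec)
Y = np.linalg.solve(A2, B)       # first small solve:  A2 Y = B
X = np.linalg.solve(A1, Y.T).T   # second small solve: X A1^T = Y

# Check against the explicitly formed Kronecker system.
err = np.linalg.norm(np.kron(A1, A2) @ X.ravel(order="F") - B.ravel(order="F"))
print(err)
```

The two factor solves cost O(n1³ + n2³ + n1 n2 (n1 + n2)), versus O((n1 n2)³) for a dense solve with the assembled Kronecker matrix.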
“…For large-scale problems sparse direct solvers eventually fail due to memory limitations. The methods described in [7] can then be considered.…”
Section: Numerical Results (mentioning, confidence: 99%)
“…For example, Boussé et al (2018) developed the algebraic method and Gauss-Newton method to solve the linear system with a CP low-rank tensor solution. Georgieva and Hofreither (2019) and Kressner et al (2016) respectively introduced a greedy approach and an approximate Riemannian Newton method to approximate the linear system by a low Tucker rank tensor. The readers are also referred to Grasedyck et al (2013) for a recent survey.…”
Section: Related Literature (mentioning, confidence: 99%)
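The greedy approach this excerpt attributes to Georgieva and Hofreither builds the Tucker-format solution term by term. Its classical two-dimensional analogue is the greedy rank-one (PGD-style) iteration for a Sylvester equation A1 X + X A2 = B, in which each new rank-one term u vᵀ is computed by alternating least squares on the current residual. Below is a minimal sketch of that 2-D analogue, not the authors' actual Tucker algorithm; it assumes A1 and A2 symmetric positive definite, and all names are illustrative.

```python
import numpy as np

def lap(n):
    """1-D Laplacian stencil (SPD tridiagonal matrix)."""
    return 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

n1, n2 = 12, 10
A1, A2 = lap(n1), lap(n2)
B = np.outer(np.ones(n1), np.ones(n2))   # smooth right-hand side

X = np.zeros((n1, n2))
rng = np.random.default_rng(0)
res0 = np.linalg.norm(B)

for _ in range(8):                        # add 8 greedy rank-one terms
    R = B - (A1 @ X + X @ A2)             # residual (A2 symmetric, so A2.T = A2)
    v = rng.standard_normal(n2)
    for _ in range(10):                   # ALS sweeps for one term u v^T
        # fix v, minimize the energy over u:
        u = np.linalg.solve((v @ v) * A1 + (v @ A2 @ v) * np.eye(n1), R @ v)
        # fix u, minimize the energy over v:
        v = np.linalg.solve((u @ u) * A2 + (u @ A1 @ u) * np.eye(n2), R.T @ u)
    X += np.outer(u, v)

ratio = np.linalg.norm(B - (A1 @ X + X @ A2)) / res0
print(ratio)
```

Each greedy term only requires small n1- and n2-sized solves, which is the property that lets such methods scale to tensor-structured systems where the assembled operator would be far too large to store.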