2022
DOI: 10.1137/20m1314653
Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format

Abstract: Low-rank tensor approximations have shown great potential for uncertainty quantification in high dimensions, for example, to build surrogate models that can be used to speed up large-scale inference problems (Eigel et al., Inverse Problems 34, 2018; Dolgov et al., Statistics & Computing 30, 2020). The feasibility and efficiency of such approaches depend critically on the rank that is necessary to represent or approximate the underlying distribution. In this paper, a priori rank bounds for approximations in th…
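The rank in question can be illustrated with a small numerical sketch (not taken from the paper, and all grid sizes, correlations, and thresholds below are illustrative choices): for a correlated two-dimensional Gaussian, the numerical rank of the discretized density, which for d = 2 coincides with the single tensor-train rank, stays far below the grid size.

```python
import numpy as np

# Hypothetical sketch: discretize a correlated 2D Gaussian density on a
# tensor grid and inspect its numerical rank via the SVD of the value matrix.
n = 200
x = np.linspace(-5.0, 5.0, n)
X1, X2 = np.meshgrid(x, x, indexing="ij")

rho = 0.5                                   # correlation; rank grows as rho -> 1
C = np.array([[1.0, rho], [rho, 1.0]])      # covariance matrix
P = np.linalg.inv(C)                        # precision matrix
density = np.exp(-0.5 * (P[0, 0] * X1**2 + 2 * P[0, 1] * X1 * X2 + P[1, 1] * X2**2))

s = np.linalg.svd(density, compute_uv=False)
eps_rank = int(np.sum(s > 1e-8 * s[0]))     # rank needed for ~1e-8 relative accuracy
print(eps_rank, "of", n)                    # far fewer than n singular values survive
```

By the Mehler expansion, the singular values of this kernel decay roughly like powers of the correlation, so the truncation rank depends on rho but not on the grid resolution.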

Cited by 6 publications (5 citation statements). References 45 publications.
“…Some theoretical results exist that provide rank bounds. For example, the work of [24] establishes specific bounds for certain multivariate Gaussian densities that depend poly-logarithmically on m, while the work of [13] proves dimension-independent bounds for general functions in weighted spaces with dominating mixed smoothness.…”
Section: 3
confidence: 99%
“…We sketch three possibilities for arriving at such a low-rank representation of the density. One is via tensor function representations 4,8–11 of the pdf, 12,13 another via an analogous representation of the probabilistic characteristic function (pcf), 14–16 and a third one is via a low-rank representation of the RV itself, for example, References 17 and 18.…”
Section: Motivation and Main Idea
confidence: 99%
“…The first starting point is the assumption that one has or may obtain a low-rank tensor function representation 4,8–11 of the pdf. 12,13 The second possible starting point is that one has such a low-rank tensor function of the pcf. 14–16,22,23 Then via the Fourier transform one may come back to the pdf.…”
Section: Motivation and Main Idea
confidence: 99%
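The Fourier route mentioned in the statement above can be sketched for a 1D Gaussian, whose characteristic function is phi(t) = exp(i·mu·t − 0.5·sigma²·t²); the pdf is then recovered through the inverse Fourier integral p(x) = (1/2π) ∫ exp(−i·t·x) phi(t) dt. All parameters and grid choices below are illustrative, not from the cited works.

```python
import numpy as np

# Recover a Gaussian pdf from its characteristic function (pcf) by a
# simple Riemann-sum approximation of the inverse Fourier integral.
mu, sigma = 0.5, 1.2
t = np.linspace(-40.0, 40.0, 4001)   # truncated frequency grid
dt = t[1] - t[0]
phi = np.exp(1j * mu * t - 0.5 * sigma**2 * t**2)

def pdf_from_pcf(x):
    """Evaluate the recovered pdf at a single point x."""
    return (np.exp(-1j * t * x) * phi).sum().real * dt / (2.0 * np.pi)

x0 = 1.0
approx = pdf_from_pcf(x0)
exact = np.exp(-0.5 * ((x0 - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
print(approx, exact)   # the two values agree to quadrature accuracy
```

Because phi decays like a Gaussian in t, the truncation to a finite t-interval introduces only negligible error, which is what makes the pcf route numerically practical.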