2010
DOI: 10.1016/j.jco.2010.02.002
Multi-level Monte Carlo algorithms for infinite-dimensional integration on R^N

Abstract: We study randomized algorithms for numerical integration with respect to a product probability measure on the sequence space R^N. We consider integrands from reproducing kernel Hilbert spaces whose kernels are superpositions of weighted tensor products. We combine tractability results for finite-dimensional integration with the multi-level technique to construct new algorithms for infinite-dimensional integration. These algorithms use variable subspace sampling, and we compare the power of var…
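The multi-level idea summarized in the abstract can be sketched in a few lines: approximate an integrand of infinitely many variables by truncating to finitely many coordinates, and combine several truncation levels through a telescoping sum so that most samples are drawn in the cheap low-dimensional subspaces. The sketch below is illustrative only — the toy integrand, the Gaussian product measure, and the truncation schedule d_l = 2^l are assumptions for the example, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_truncated(y, d):
    """Toy integrand using only the first d coordinates of y.
    Weighted sum of squares with weights 0.5^k, so E[f] is known in
    closed form: sum_{k=1}^{d} 0.5^k = 1 - 0.5^d."""
    weights = 0.5 ** np.arange(1, d + 1)
    return np.sum(weights * y[:, :d] ** 2, axis=1)

def mlmc_estimate(max_level, n_samples):
    """Multi-level Monte Carlo with dimension truncation d_l = 2^l.
    Telescoping sum: E[f_L] = E[f_0] + sum_{l=1}^{L} E[f_l - f_{l-1}].
    Each level samples only its own subspace (variable subspace sampling)."""
    estimate = 0.0
    for level in range(max_level + 1):
        d = 2 ** level
        n = n_samples[level]
        y = rng.standard_normal((n, d))        # product Gaussian measure, truncated to d coords
        fine = f_truncated(y, d)
        if level == 0:
            estimate += fine.mean()
        else:
            coarse = f_truncated(y, d // 2)    # same samples couple the two levels
            estimate += (fine - coarse).mean()
    return estimate

# Decreasing sample counts: the level differences have shrinking variance,
# so fewer samples are needed in the expensive high-dimensional subspaces.
est = mlmc_estimate(4, [4000, 2000, 1000, 500, 250])
```

Because the weights decay geometrically, the variance of each level difference f_l − f_{l−1} shrinks with l, which is what lets the higher (more expensive) levels get by with fewer samples.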

Cited by 43 publications (4 citation statements) · References 17 publications
“…Integration in the infinite-dimensional setting has been analysed in a number of recent papers [10,15,34,38,44], mostly in a Hilbert space setting. In a formal sense, there is little difficulty in considering integration with an infinite number of variables; for a function F of infinitely many variables y = (y_1, y_2, …”
Section: Infinite-dimensional Integration
confidence: 99%
“…Mitigating or circumventing the CoD is the primary goal when it comes to constructing efficient high-dimensional numerical quadrature rules. A lot of progress has been made in this direction in the past 50 years, including sparse grid (SG) methods [1,5-7], Monte Carlo (MC) methods [8,9], quasi-Monte Carlo (QMC) methods [10-14], and deep neural network (DNN) methods [15-19]. To a certain extent, these methods are effective for computing integrals in low and medium dimensions (i.e., d ≲ 100), but it is still a challenge for them to compute integrals in very high dimensions (i.e., d ≈ 1000).…”
Section: Introduction
confidence: 99%