2021
DOI: 10.1090/mcom/3615
Quasi-Monte Carlo Bayesian estimation under Besov priors in elliptic inverse problems

Cited by 14 publications (9 citation statements)
References 40 publications
“…We remark that in [88], MC and QMC integration has been analyzed by real-variable arguments for such Gaussian priors. In [61], corresponding results have also been obtained for so-called Besov priors, again by real-variable arguments for the parametric posterior. Since the presently developed, quantified parametric holomorphy results are independent of the particular measure placed upon the parameter domain R∞.…”
Section: Parametric Posterior Analyticity and Sparsity in BIPs
confidence: 78%
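The Besov priors referred to in [61] are typically built from a series expansion with heavy-tailed (e.g. Laplace-distributed) coefficients. The following is a minimal sketch of drawing one realization of such a prior in one dimension; the hat-function basis, decay exponent, and truncation level are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

# Hedged sketch of a Besov-type prior draw:
#   u = sum_{j,k} 2^{-j*s} * xi_{j,k} * psi_{j,k},
# with i.i.d. Laplace-distributed coefficients xi_{j,k} (the p = 1 case).
# Basis, decay exponent s, and truncation level J are illustrative choices.
rng = np.random.default_rng(0)

def besov_prior_sample(x, s=1.5, J=6):
    u = np.zeros_like(x)
    for j in range(J):
        for k in range(2 ** j):
            xi = rng.laplace(scale=1.0)                  # Laplace coefficient
            center = (k + 0.5) / 2 ** j                  # dyadic interval midpoint
            width = 0.5 / 2 ** j
            psi = np.maximum(0.0, 1.0 - np.abs(x - center) / width)  # hat "wavelet"
            u += 2.0 ** (-j * s) * xi * psi
    return u

x = np.linspace(0.0, 1.0, 201)
print(besov_prior_sample(x)[:5])
```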
“…Inserting into Θ(a) in (5.1), (5.2) this results in a countably-parametric density U ∋ y → Θ(a(y)), for y ∈ U = R∞, and the Gaussian reference measure π₀ on E in (5.1) is pushed forward into a countable Gaussian product measure on U: using (5.1) and choosing a Gaussian prior (e.g. [43, Section 2.4] or [61, 79])…”
Section: Formulation and Well-posedness
confidence: 99%
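As a concrete illustration of the pushforward described above (an assumed construction, not the cited paper's): writing the Gaussian random coefficient as an affine series in i.i.d. standard normal coordinates y_j turns the Gaussian reference measure into a countable Gaussian product measure on the parameter domain. The sine basis and the j^{-2} decay below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: a(x; y) = 1 + sum_j y_j * j^{-decay} * sin(j*pi*x),
# with y_j ~ N(0,1) i.i.d., so the law of a is the pushforward of a
# countable Gaussian product measure on the coordinates y.
rng = np.random.default_rng(1)

def gaussian_coefficient(x, y, decay=2.0):
    a = np.ones_like(x)
    for j, yj in enumerate(y, start=1):
        a += yj * j ** (-decay) * np.sin(j * np.pi * x)
    return a

x = np.linspace(0.0, 1.0, 101)
y = rng.standard_normal(32)        # truncate to 32 Gaussian coordinates
a = gaussian_coefficient(x, y)
print(a.min(), a.max())
```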
“…Moreover, for sufficiently smooth integrands it is possible to construct QMC rules with error bounds that do not depend on the number of stochastic variables while attaining faster convergence rates than Monte Carlo methods. For these reasons QMC methods have been very successful in applications to PDEs with random coefficients (see, e.g., [2,9,14,16,17,21,22,23,30,31,32,36,39,40]) and especially in PDE-constrained optimization under uncertainty, see [19,20]. In [29] the authors derive regularity results for the saddle point operator, which fall within the same framework as the QMC approximation of the affine parametric operator equation setting considered in [40].…”
Section: Introduction
confidence: 99%
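To make the dimension-robustness claim concrete, here is a minimal sketch of a randomly shifted rank-1 lattice rule, one standard QMC construction of the kind referenced above. The Korobov-style generating vector and the toy integrand are illustrative assumptions; in practice the generating vector is obtained from a component-by-component construction with problem-dependent weights.

```python
import numpy as np

# Randomly shifted rank-1 lattice rule: x_i = frac(i*z/n + shift), i = 0..n-1.
# The Korobov generating vector z_j = g^j mod n is a simple illustrative choice.
rng = np.random.default_rng(2)

def qmc_estimate(f, n, d, n_shifts=8, g=76):
    z = np.array([pow(g, j, n) for j in range(d)])
    i = np.arange(n).reshape(-1, 1)
    estimates = []
    for _ in range(n_shifts):
        pts = np.mod(i * z / n + rng.random(d), 1.0)   # shifted lattice points in [0,1)^d
        estimates.append(np.mean(f(pts)))
    return np.mean(estimates)

# Toy integrand over [0,1]^16 whose exact integral is 1; the j^{-2} weights
# mimic the decaying influence of higher coordinates.
d = 16
f = lambda x: np.prod(1.0 + (x - 0.5) / np.arange(1, d + 1) ** 2, axis=1)
print(qmc_estimate(f, n=1021, d=d))
```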
“…These features make QMC methods ideal for heavy-duty uncertainty quantification compared to regular Monte Carlo methods (slow convergence rate), sparse grids (non-parallelizable) or approximations based on lower order moments (poor accuracy). QMC methods have been applied successfully to many problems such as Bayesian inverse problems [11,28,45,46], spectral eigenvalue problems [19], optimization under uncertainty [25,26], the Schrödinger equation [53], the wave equation [16], problems arising in quantum physics [27], and various others.…”
confidence: 99%
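For the Bayesian inverse problem application mentioned above, a common formulation estimates a posterior expectation as a ratio of two prior expectations, each approximated with the same QMC point set. The sketch below uses a toy forward map, a uniform prior, and a scrambled Sobol' sequence from SciPy purely for illustration; the paper's actual setting is an elliptic PDE forward model under a Besov prior.

```python
import numpy as np
from scipy.stats import qmc

# Ratio estimator E[q | data] ≈ (sum_i q_i * w_i) / (sum_i w_i), where w_i is
# the unnormalized likelihood at prior samples generated from QMC points.
# The forward map below is a toy stand-in for an elliptic PDE solve.

def forward_map(theta):
    return np.sin(theta[:, 0]) + 0.5 * theta[:, 1]

def posterior_mean_qmc(points, data, noise_std=0.1):
    theta = 2.0 * points - 1.0                        # map [0,1)^2 to a uniform prior on [-1,1]^2
    misfit = (forward_map(theta) - data) ** 2
    weights = np.exp(-0.5 * misfit / noise_std ** 2)  # unnormalized Gaussian likelihood
    return np.sum(theta[:, 0] * weights) / np.sum(weights)

pts = qmc.Sobol(d=2, scramble=True, seed=0).random(2 ** 12)
print(posterior_mean_qmc(pts, data=0.3))
```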