2022
DOI: 10.48550/arxiv.2201.07925
Preprint

Large-scale Bayesian optimal experimental design with derivative-informed projected neural network

Abstract: We address the solution of large-scale Bayesian optimal experimental design (OED) problems governed by partial differential equations (PDEs) with infinite-dimensional parameter fields. The OED problem seeks to find sensor locations that maximize the expected information gain (EIG) in the solution of the underlying Bayesian inverse problem. Computation of the EIG is usually prohibitive for PDE-based OED problems. To make the evaluation of the EIG tractable, we approximate the (PDE-based) parameter-to-observable…
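For context on the design criterion named in the abstract: the expected information gain of a design \(\xi\) (here, a candidate set of sensor locations) is conventionally defined as the expected Kullback-Leibler divergence from the prior to the posterior over the parameter \(m\). The notation below is the standard form of this definition, supplied for orientation rather than quoted from the (truncated) abstract:

\[
\mathrm{EIG}(\xi)
= \mathbb{E}_{y \mid \xi}\!\left[ D_{\mathrm{KL}}\!\big( \pi_{\mathrm{post}}(\,\cdot \mid y, \xi) \,\big\|\, \pi_{\mathrm{prior}} \big) \right]
= \iint \log \frac{p(y \mid m, \xi)}{p(y \mid \xi)} \, p(y \mid m, \xi)\, \pi_{\mathrm{prior}}(m) \,\mathrm{d}m \,\mathrm{d}y ,
\]

where \(p(y \mid \xi) = \int p(y \mid m, \xi)\, \pi_{\mathrm{prior}}(m)\,\mathrm{d}m\) is the model evidence. The OED problem maximizes \(\mathrm{EIG}(\xi)\) over admissible sensor configurations; the nested integrals over the data and an infinite-dimensional parameter are what make direct evaluation prohibitive for PDE-governed problems.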

Cited by 1 publication (2 citation statements)
References 51 publications

“…The scalable solution of high-dimensional outer-loop problems often requires derivative information, since this information detects map sensitivities that can make problems effectively low-dimensional. This property has been observed and used in many outer-loop problems such as model reduction for sampling and deep learning [4,5,13,34,36], optimization under uncertainty [3,15,17], Bayesian inverse problems [6,9,10,11,12,14,16,18,21,23,26,44], and Bayesian optimal experimental design [2,20,39,40,41].…”
Section: State Model
Citation type: mentioning (confidence: 99%)
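As background for the statement above: in the derivative-informed dimension reduction this snippet alludes to, one common construction (a sketch using standard conventions from this literature, not taken verbatim from the citing paper) averages the Jacobian \(J(m)\) of the parameter-to-observable map over the prior and retains the dominant generalized eigenvectors of the resulting sensitivity operator:

\[
H \;=\; \mathbb{E}_{m \sim \pi_{\mathrm{prior}}}\!\left[ J(m)^{*}\, \Gamma_{\mathrm{noise}}^{-1}\, J(m) \right],
\qquad
H \psi_i \;=\; \lambda_i\, \mathcal{C}_{\mathrm{prior}}^{-1} \psi_i .
\]

The eigenvectors \(\psi_1, \dots, \psi_r\) associated with the largest eigenvalues span the subspace in which the map is most sensitive; projecting the parameter onto this subspace is what makes the problem effectively low-dimensional, for example as the input reduction of a projected neural network surrogate.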
“…In this case there is less variance in the prior distribution, but more sensitivity to the diffusion operator due to the forcing terms and the cubic nonlinearity. The third test case is a convection-reaction-diffusion (CRD) problem where the parameter shows up in a nonlinear reaction term [36,41]. In this case the parameters for the distribution are δ = 1.0, γ = 0.1.…”
Section: Poisson Problem
Citation type: mentioning (confidence: 99%)
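A note on the \(\delta\) and \(\gamma\) appearing in this snippet: in this line of work such parameters typically define a Matérn-type Gaussian prior through an elliptic differential operator, for instance (an assumed convention, not spelled out in the quoted statement)

\[
\mathcal{A} \;=\; -\gamma\,\Delta + \delta\, I,
\qquad
\mathcal{C}_{\mathrm{prior}} \;=\; \mathcal{A}^{-2},
\]

with the ratio \(\gamma/\delta\) governing the prior correlation length and the two parameters together setting the overall scale of the pointwise marginal variance.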