2024
DOI: 10.3934/naco.2022038

The efficacy and generalizability of conditional GANs for posterior inference in physics-based inverse problems

Abstract: In this work, we train conditional Wasserstein generative adversarial networks to effectively sample from the posterior of physics-based Bayesian inference problems. The generator is constructed using a U-Net architecture, with the latent information injected using conditional instance normalization. The former facilitates a multiscale inverse map, while the latter enables the decoupling of the latent space dimension from the dimension of the measurement, and introduces stochasticity at all scales of the U-Net…
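The abstract's claim that conditional instance normalization decouples the latent dimension from the measurement dimension can be illustrated with a small sketch. The function below is a hypothetical NumPy illustration under assumed shapes, not the paper's implementation: each channel's scale and shift are affine functions of a latent vector z, so z may have any dimension regardless of the spatial size of the feature maps.

```python
import numpy as np

def conditional_instance_norm(x, z, W_gamma, b_gamma, W_beta, b_beta, eps=1e-5):
    """Illustrative conditional instance normalization (hypothetical shapes).

    x : (N, C, H, W) feature map from some U-Net layer
    z : (N, D) latent vector; per-channel scale/shift are affine maps of z
    """
    # Instance norm: normalize each (sample, channel) slice over spatial dims
    mu = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)
    # Latent-conditioned scale and shift (one pair per channel)
    gamma = z @ W_gamma + b_gamma          # (N, C)
    beta = z @ W_beta + b_beta             # (N, C)
    return gamma[:, :, None, None] * x_hat + beta[:, :, None, None]

# The latent dimension D = 3 is independent of the spatial size (8, 8):
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4, 8, 8))   # features: N=2, C=4
z = rng.normal(size=(2, 3))         # latent:   N=2, D=3
Wg, bg = rng.normal(size=(3, 4)), np.ones(4)
Wb, bb = rng.normal(size=(3, 4)), np.zeros(4)
y = conditional_instance_norm(x, z, Wg, bg, Wb, bb)
print(y.shape)  # (2, 4, 8, 8)
```

Because only the affine maps `W_gamma`, `W_beta` depend on the latent dimension, the same mechanism can inject stochasticity at every scale of the U-Net.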

Cited by 9 publications (14 citation statements). References 28 publications.
“…where W₁ is the Wasserstein-1 metric [33]. The Lipschitz constraint on the critic can be weakly imposed using a gradient penalty term while training D [4,26] (also see Supplementary Note 3).…”
Section: Methods
confidence: 99%
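The gradient penalty mentioned in this excerpt (the WGAN-GP approach to weakly enforcing the Lipschitz constraint) can be sketched in a few lines. The snippet below is an illustrative NumPy toy, not the cited implementation: it assumes a hypothetical linear critic D(x) = w·x, whose gradient with respect to its input is simply w, so the penalty λ·E[(‖∇D(x̂)‖₂ − 1)²] on interpolated samples can be evaluated without automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16

# Hypothetical linear critic D(x) = w . x; its input gradient is just w,
# which lets us evaluate the penalty without autodiff machinery.
w = rng.normal(size=dim)

def gradient_penalty(x_real, x_fake, lam=10.0):
    # Interpolate between real and generated samples, as in WGAN-GP
    eps = rng.uniform(size=(x_real.shape[0], 1))
    x_hat = eps * x_real + (1.0 - eps) * x_fake
    # For the linear critic, the gradient at every x_hat equals w
    grads = np.broadcast_to(w, x_hat.shape)
    grad_norm = np.linalg.norm(grads, axis=1)
    # Penalize deviation of the gradient norm from 1 (soft Lipschitz constraint)
    return lam * np.mean((grad_norm - 1.0) ** 2)

x_real = rng.normal(size=(8, dim))
x_fake = rng.normal(size=(8, dim))
print(gradient_penalty(x_real, x_fake))
```

In an actual training loop the critic is a deep network and the gradient at each interpolate is obtained by automatic differentiation; the penalty term here would simply be added to the critic's loss.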
“…It is typical to use residual blocks to introduce non-linearity in the U-Net, which is what was also done in [26]. However, in the present work, we make use of dense blocks since they lead to superior performance compared to residual blocks while reducing the number of trainable parameters [13].…”
Section: cGAN Architecture
confidence: 99%
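The residual-versus-dense distinction in this excerpt can be made concrete at the level of tensor shapes. The functions below are hypothetical NumPy illustrations, not the paper's blocks: a residual block adds the transformed features back to its input and so preserves the channel count, while a dense block concatenates them, so each subsequent layer sees all earlier features with only a small growth in width per added transform.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W):
    # Residual connection: output keeps the same feature count as the input
    return x + relu(x @ W)                             # (N, C) -> (N, C)

def dense_block(x, W):
    # Dense connection: new features are concatenated onto the input,
    # so later layers see all earlier features (DenseNet-style reuse)
    return np.concatenate([x, relu(x @ W)], axis=1)    # (N, C) -> (N, C + G)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
print(residual_block(x, rng.normal(size=(8, 8))).shape)  # (5, 8)
print(dense_block(x, rng.normal(size=(8, 4))).shape)     # (5, 12)
```

The feature reuse is what allows a dense block to match the expressiveness of a wider residual block with a smaller weight matrix, consistent with the excerpt's point about reducing trainable parameters.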