2019
DOI: 10.48550/arxiv.1912.04212
Preprint
Solving Bayesian Inverse Problems via Variational Autoencoders

Abstract: This work develops model-aware autoencoder networks as a new method for solving scientific forward and inverse problems. Autoencoders are unsupervised neural networks that learn new representations of data through appropriately selected architecture and regularization. The resulting mappings to and from the latent representation can be used to encode and decode the data. In our work, we set the data space to be the parameter space of a parameter of interest we wish to invert for. Further, as a wa…

Cited by 7 publications (6 citation statements)
References 44 publications
“…we seek a reduced order approximation in an optimal linear subspace [50,90,63,68,91,92]. If the problem does not allow such a representation, nonlinear variants (e.g., autoencoders) could be considered as data compression tools [98,99,100,101]. Here, we prefer to employ POD because it is generally faster than the nonlinear variants.…”
Section: Proper Orthogonal Decomposition (POD)
confidence: 99%
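The statement above contrasts POD, which seeks an optimal linear subspace, with nonlinear compressors such as autoencoders. As a minimal sketch (not from the cited works; the snapshot data and rank are illustrative), the POD basis can be computed from a snapshot matrix via the SVD, and encoding/decoding is just projection onto and lifting from that subspace:

```python
import numpy as np

def pod_basis(snapshots, rank):
    """Rank-r POD basis of a snapshot matrix (n_dof x n_snapshots) via the SVD.
    The columns of U span the optimal linear subspace of dimension `rank`."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :rank], s

# Toy snapshot matrix: data lying near a 2-D subspace, plus small noise.
rng = np.random.default_rng(0)
basis_true = rng.standard_normal((50, 2))
coeffs = rng.standard_normal((2, 200))
X = basis_true @ coeffs + 1e-3 * rng.standard_normal((50, 200))

U, s = pod_basis(X, rank=2)
X_reduced = U.T @ X        # encode: project snapshots onto the subspace
X_approx = U @ X_reduced   # decode: lift reduced coordinates back
rel_err = np.linalg.norm(X - X_approx) / np.linalg.norm(X)
```

Because the toy data are (nearly) rank-2, the relative reconstruction error is tiny; this is the sense in which the linear subspace is "optimal" and why POD is preferred when the data admit such a representation.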
“…Because they are efficient, flexible, and versatile, autoencoders are widely used to compress input data. They have shown successful applications in the field of parameter estimation for PDEs [32,33,34,35,36,37,38]. Therefore, a convolutional AE is designed to compress the input power maps of the heat equations into compact latent vectors.…”
Section: The Hybrid Framework of AE and IG Based Network
confidence: 99%
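The encode/decode structure described above can be illustrated with a deliberately minimal sketch (not the cited architecture — a dense rather than convolutional map, with untrained random weights and illustrative sizes), showing how a flattened input map is compressed to a compact latent vector and lifted back:

```python
import numpy as np

rng = np.random.default_rng(1)

def encoder(x, W1, b1):
    # Nonlinear map from the input space to a compact latent vector.
    return np.tanh(W1 @ x + b1)

def decoder(z, W2, b2):
    # Map the latent vector back to the input space.
    return W2 @ z + b2

n_in, n_latent = 64 * 64, 32  # e.g. a 64x64 map flattened; 32-dim latent
W1 = rng.standard_normal((n_latent, n_in)) / np.sqrt(n_in)
b1 = np.zeros(n_latent)
W2 = rng.standard_normal((n_in, n_latent)) / np.sqrt(n_latent)
b2 = np.zeros(n_in)

x = rng.standard_normal(n_in)   # synthetic flattened input map
z = encoder(x, W1, b1)          # compact latent representation, shape (32,)
x_hat = decoder(z, W2, b2)      # reconstruction, shape (4096,)
```

In practice the weights would be trained to minimize reconstruction error, and a convolutional encoder/decoder would exploit the 2-D structure of the maps; the point here is only the compression of a high-dimensional input to a low-dimensional latent code.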
“…In addition to showing that a trained autoencoder has memory and computational advantages over existing methods, we develop an efficient data generation and training procedure based on the Bayesian inversion formulation that enables our approach to maintain its advantages, even when including the cost of training the autoencoder. While variations of autoencoders are often used for generative modeling (Kingma and Welling 2013), and others assign a physical interpretation to the latent space (Goh et al 2019), we use the autoencoding capabilities of the autoencoder architecture to allow the training process to find the most effective compressed latent representation for seismic input data. We provide a mathematical description of the FWI problem and the inverse problem in section 2.…”
Section: Introduction
confidence: 99%