2022
DOI: 10.1109/tnnls.2020.3042395
BayesFlow: Learning Complex Stochastic Models With Invertible Neural Networks

Abstract: Estimating the parameters of mathematical models is a common problem in almost all branches of science. However, this problem can prove notably difficult when processes and model descriptions become increasingly complex and an explicit likelihood function is not available. With this work, we propose a novel method for globally amortized Bayesian inference based on invertible neural networks which we call BayesFlow. The method uses simulation to learn a global estimator for the probabilistic mapping from observ…

Cited by 105 publications (159 citation statements)
References 38 publications
“…This loss guarantees that the networks recover the true posterior under perfect convergence [47].…”
Section: INN-Inference and BayesFlow (mentioning)
confidence: 95%
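The guarantee referenced in this citation statement comes from the maximum-likelihood objective used to train the invertible network: pushing the posterior through the network toward a standard Gaussian latent, so that inverting the network on Gaussian samples recovers the true posterior under perfect convergence. A minimal NumPy sketch of a Monte Carlo estimate of that loss (function name and array shapes are illustrative, not from the paper's code):

```python
import numpy as np

def bayesflow_loss(z, log_det_J):
    """Monte Carlo estimate of the negative log-posterior objective.

    z         : (batch, dim) latent outputs z = f(theta; x) of the
                invertible network for each simulated (theta, x) pair
    log_det_J : (batch,) log |det Jacobian| of f at each sample

    Minimizing E[ 0.5 * ||z||^2 - log|det J| ] drives f to map the true
    posterior onto a standard Gaussian, so sampling z ~ N(0, I) and
    applying f^{-1}(z; x) samples the posterior (exact in the limit of
    perfect convergence).
    """
    return np.mean(0.5 * np.sum(z**2, axis=1) - log_det_J)
```

For a single latent sample z = (1, 1) with a zero log-determinant, the loss evaluates to 0.5 * 2 = 1.0, matching the Gaussian negative log-density up to the constant term, which is dropped because it does not affect the optimum.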
“…Second, since the number of measurements can vary in practice, we introduce a relatively small summary network h_ψ with trainable parameters ψ. It reduces measurements of variable size to fixed-size vectors, x̄ = h_ψ(x), by respecting the probabilistic symmetry of the measurements [47]. For independent measurements we use a permutation-invariant summary network such that its output is invariant under the ordering in x [47].…”
Section: INN-Inference and BayesFlow (mentioning)
confidence: 99%
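The permutation-invariant summary network described in this citation statement can be sketched very simply: embed each measurement independently, then pool over the measurement axis so the output neither depends on the ordering of x nor on the number of measurements. The following is a minimal NumPy sketch with mean pooling; the function name and parameters W, b stand in for the trainable parameters ψ and are illustrative, not taken from the paper:

```python
import numpy as np

def summary_network(x, W, b):
    """Hypothetical permutation-invariant summary network h_psi.

    x : (n_measurements, d) variable-size set of measurements
    W : (d, k) and b : (k,) stand-ins for trainable parameters psi

    Embedding each row independently and mean-pooling over the
    measurement axis makes the output invariant to the ordering of x,
    and yields a fixed-size vector of length k for any n_measurements.
    """
    hidden = np.tanh(x @ W + b)   # embed each measurement independently
    return hidden.mean(axis=0)    # pool over measurements -> fixed size
```

Mean pooling is one of several symmetric pooling choices (sum and max are common alternatives); any of them preserves the permutation invariance required for exchangeable, independent measurements.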