2020
DOI: 10.48550/arxiv.2006.10571
Preprint

Likelihood-Free Inference with Deep Gaussian Processes

Cited by 3 publications (3 citation statements)
References 0 publications
“…Our benchmark, in its current form, has several limitations. First, the algorithms considered here do not cover the entire spectrum of SBI algorithms: We did not include sequential algorithms using active learning or Bayesian Optimization (Aushev et al., 2020), or 'gray-box' algorithms, which use additional information about or from the simulator (e.g., …). We focused on approaches using neural networks for density estimation and did not compare to alternatives using Gaussian Processes (e.g., Meeds and Welling, 2014; Wilkinson, 2014).…”
Section: Limitations (mentioning)
confidence: 99%
“…The price is that they require significantly more simulations than Bayesian optimisation (BO) approaches, as seen in Sect. 4.3 of Aushev et al. (2020).…”
Section: Sequential Neural Estimation (mentioning)
confidence: 98%
“…BO finds applications in various scientific and industrial domains, e.g., machine learning for hyperparameter optimization [37], [38], modeling of population genetics [39], spreading of pathogens [40], atomic structure of materials [41], [42], as well as cosmology [43], and has established itself as a state-of-the-art method in lower-dimensional problems [17], [44]. However, BO's performance and computational efficiency decline as the dimensionality of a problem increases [37], [45]–[47], which is the case with the calibration of large-scale ABMs that feature a large number of behavioral parameters to be tuned.…”
Section: B. Bayesian Optimization (mentioning)
confidence: 99%