2018
DOI: 10.5705/ss.202015.0340
Learning Summary Statistic for Approximate Bayesian Computation via Deep Neural Network

Cited by 22 publications (26 citation statements) · References: 0 publications
“…For a given set of summary statistics, several computational methods, such as machine learning (Beaumont, 2019), have been proposed to automate the choice of informative summary statistics. ABC-DL relies on DL to define the informative summary statistics to be used in ABC (Wong et al., 2018). The joint site frequency spectrum (jSFS) has previously been used as input to the DL for comparing complex demographic models (Lorente-Galdos et al., 2019; Mondal et al., 2019).…”
Section: ABC-DL
confidence: 99%
“…Note that if the Z_j follow a non-Gaussian distribution, the likelihood P(X | θ) is intractable. However, here we can still draw samples from the model. The likelihood and prior are identical to those in (Wong et al., 2018). The Z_i are standard Gaussian, and θ_1, θ_2 are uniform over a specific triangular region such that they are identifiable.…”
Section: Order Two Moving Average Example
confidence: 99%
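The setup quoted above is the standard MA(2) benchmark for ABC. Below is a minimal sketch of it, assuming the usual parameterisation x_t = z_t + θ_1 z_{t-1} + θ_2 z_{t-2} with z_t i.i.d. standard Gaussian and the prior uniform over the identifiability triangle; the function names, summary choice, and acceptance threshold are illustrative, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_prior():
    """Draw (theta1, theta2) uniformly over the identifiability triangle:
    -2 < theta1 < 2, theta1 + theta2 > -1, theta1 - theta2 < 1."""
    while True:
        t1 = rng.uniform(-2.0, 2.0)
        t2 = rng.uniform(-1.0, 1.0)
        if t1 + t2 > -1.0 and t1 - t2 < 1.0:
            return t1, t2

def simulate_ma2(theta1, theta2, n=100):
    """Simulate n observations from the MA(2) model."""
    z = rng.standard_normal(n + 2)
    return z[2:] + theta1 * z[1:-1] + theta2 * z[:-2]

def autocov(x, lag):
    """Empirical autocovariance, a common hand-picked summary for MA(2)."""
    return np.mean(x[lag:] * x[:len(x) - lag])

# Pseudo-observed data and its summaries.
x_obs = simulate_ma2(0.6, 0.2)
s_obs = np.array([autocov(x_obs, 1), autocov(x_obs, 2)])

# Plain rejection ABC: keep prior draws whose summaries land near s_obs.
accepted = []
for _ in range(20000):
    theta = sample_prior()
    x = simulate_ma2(*theta)
    s = np.array([autocov(x, 1), autocov(x, 2)])
    if np.linalg.norm(s - s_obs) < 0.1:
        accepted.append(theta)
```

With non-Gaussian z_t the likelihood becomes intractable, as the snippet notes, but this simulation step still works, which is exactly what makes the model a natural ABC test case.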
“…The idea of using RNNs as meta-learners was first introduced by Hochreiter et al. (2001). Wong et al. (2018) use deep neural networks to approximate posterior summaries: in particular, they consider approximating the posterior mean of a functional in high-dimensional problems. Creel (2017) builds on this work and applies the approach to particular econometric models, using feedforward neural networks to estimate posterior means.…”
Section: Introduction
confidence: 99%
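A minimal sketch, not the authors' implementation, of the idea this snippet attributes to Wong et al. (2018): train a feed-forward network on simulated (θ, x) pairs with an L2 loss, so the fitted network approximates the posterior mean E[θ | x] and can serve as a low-dimensional summary statistic for ABC. It reuses sample_prior and simulate_ma2 from the MA(2) sketch above; the architecture and training settings are illustrative.

```python
import numpy as np
import torch
import torch.nn as nn

# Training set simulated from the prior and the model.
thetas = np.array([sample_prior() for _ in range(5000)], dtype=np.float32)
xs = np.stack([simulate_ma2(t1, t2) for t1, t2 in thetas]).astype(np.float32)

net = nn.Sequential(
    nn.Linear(xs.shape[1], 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),          # outputs an estimate of (theta1, theta2)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

X, T = torch.from_numpy(xs), torch.from_numpy(thetas)
for epoch in range(200):
    opt.zero_grad()
    # Minimising the L2 loss pushes net(x) toward E[theta | x].
    loss = loss_fn(net(X), T)
    loss.backward()
    opt.step()

# The trained network is then used as the summary statistic s(x) = net(x)
# inside a standard rejection-ABC loop.
s_obs = net(torch.from_numpy(x_obs.astype(np.float32))).detach().numpy()
```

The design choice worth noting is that the regression target (the posterior mean) is itself a near-optimal summary for ABC under quadratic loss, which is why fitting it with a flexible network sidesteps hand-picking statistics.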
“…the shortest path length distance between the subgraphs induced by the infected nodes at each time step). The subjectivity of these decisions can be removed through automatic summary selection for ABC, described in [42-45], where an informative linear or nonlinear combination of the summaries is chosen. To keep our summary statistics interpretable, we stick to the intuitive, domain-driven summary statistics and discrepancy measures described next.…”
Section: (b) Approximate Bayesian Computation
confidence: 99%
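For the linear variant of automatic summary selection mentioned in this snippet, here is a minimal sketch in the spirit of the regression-based approaches cited as [42]-[45]: regress the parameter on a pool of candidate summaries over simulated pairs, and use the fitted linear combination as the single learned summary. The candidate pool and the placeholder Gaussian location model below are illustrative assumptions used only to make the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(1)

def candidate_summaries(x):
    """A hand-picked pool of candidate summaries; the regression decides
    which linear combination of them is informative."""
    return np.array([x.mean(), x.var(),
                     np.mean(x[1:] * x[:-1]),    # lag-1 autocovariance
                     np.mean(x[2:] * x[:-2])])   # lag-2 autocovariance

# Simulated (theta, x) pairs from a placeholder model.
thetas = rng.uniform(-2.0, 2.0, size=2000)
S = np.stack([candidate_summaries(rng.normal(t, 1.0, size=100))
              for t in thetas])

# Least-squares fit of theta ~ b0 + b's(x); the fitted map is the
# learned one-dimensional summary statistic.
A = np.column_stack([np.ones(len(S)), S])
coef, *_ = np.linalg.lstsq(A, thetas, rcond=None)

def learned_summary(x):
    return coef[0] + candidate_summaries(x) @ coef[1:]
```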
“…The different contributions in this literature vary depending on what type of spreading process is assumed to operate on the network. Processes that have been studied to date include the independent cascade model [16], the SI process [17-19, 49-51], which coincides with our simple contagion process, the SIR model [52], the heterogeneous SIR model [53], for which the approximate likelihood has been demonstrated to be asymptotically valid in geometric trees [17], and spreading processes on infinite trees [52]. All approaches we are aware of use spreading-process-specific approximations of the likelihood function and apply them to specific network topologies.…”
Section: (d) Comparison with the NetSleuth Algorithm
confidence: 99%