2009
DOI: 10.2202/1557-4679.1171

Inference in Epidemic Models without Likelihoods

Abstract: Likelihood-based inference for epidemic models can be challenging, in part due to difficulties in evaluating the likelihood. The problem is particularly acute in models of large-scale outbreaks, and unobserved or partially observed data further complicate this process. Here we investigate the performance of Markov Chain Monte Carlo and Sequential Monte Carlo algorithms for parameter inference, where the routines are based on approximate likelihoods generated from model simulations. We compare our results to a…
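
The abstract describes MCMC routines driven by approximate, simulation-based likelihoods. As an illustration only, here is a minimal sketch of ABC-MCMC (in the style of Marjoram et al., 2003) for a toy Reed-Frost epidemic; the simulator, the flat prior, the final-size summary statistic, and the tolerance are all assumptions of this sketch, not the setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def reed_frost(p, n=100, i0=1):
    """Reed-Frost chain-binomial epidemic; returns the final size."""
    s, i, total = n - i0, i0, i0
    while i > 0 and s > 0:
        i = rng.binomial(s, 1.0 - (1.0 - p) ** i)  # new cases this generation
        s -= i
        total += i
    return total

def abc_mcmc(size_obs, n_iter=5000, tol=10, step=0.005):
    """ABC-MCMC: a random-walk Metropolis chain in which evaluating the
    likelihood is replaced by simulating from the model and accepting only
    when the simulated final size falls within tol of the observed one.
    With a flat prior on (0, 1) and a symmetric proposal, the acceptance
    probability reduces to this indicator."""
    p = 0.03
    chain = np.empty(n_iter)
    for k in range(n_iter):
        prop = p + rng.normal(0.0, step)
        if 0.0 < prop < 1.0 and abs(reed_frost(prop) - size_obs) <= tol:
            p = prop  # accept the proposal
        chain[k] = p
    return chain

size_obs = reed_frost(0.03)   # synthetic "observed" outbreak
chain = abc_mcmc(size_obs)
print(chain[1000:].mean())    # posterior mean after burn-in
```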

Cited by 164 publications (182 citation statements)
References 39 publications
“…On the other hand, simulating realisations from a stochastic epidemic model is relatively straightforward. ABC algorithms are therefore well suited to inference for the parameters of epidemic models from partially observed data, as has been illustrated when both temporal (McKinley et al., 2009) and non-temporal (Neal, 2012) data are available.…”
Section: Approximate Bayesian Computation
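
To make the statement above concrete, here is a minimal sketch of ABC rejection sampling for a stochastic SIR model with temporal (removal-time) data. The Gillespie simulator, the flat priors, the final-size summary, and the tolerance are illustrative assumptions, not the choices of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sir(beta, gamma, n=100, i0=1, t_max=60.0):
    """Exact (Gillespie) simulation of a stochastic SIR model.
    Returns the removal times, mimicking temporal removal data."""
    s, i, t = n - i0, i0, 0.0
    removals = []
    while i > 0 and t < t_max:
        rate_inf = beta * s * i / n
        rate_rem = gamma * i
        total = rate_inf + rate_rem
        t += rng.exponential(1.0 / total)       # time to the next event
        if rng.random() < rate_inf / total:
            s, i = s - 1, i + 1                 # infection
        else:
            i -= 1                              # removal
            removals.append(t)
    return np.asarray(removals)

def abc_rejection(obs_removals, n_keep=200, tol=10):
    """Keep prior draws whose simulated final size is within tol of the data."""
    s_obs = len(obs_removals)                   # summary: total removals
    kept = []
    while len(kept) < n_keep:
        beta = rng.uniform(0.0, 3.0)            # assumed flat priors
        gamma = rng.uniform(0.1, 1.0)
        if abs(len(simulate_sir(beta, gamma)) - s_obs) <= tol:
            kept.append((beta, gamma))
    return np.asarray(kept)

obs = simulate_sir(1.5, 0.5)                    # synthetic observed data
print(abc_rejection(obs).mean(axis=0))          # approximate posterior means
```

Richer summaries (e.g. removal counts per time bin) plug into the same scheme; the next statement discusses how to measure the distance between such summaries.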
“…An intuitive distance metric is the sum of squared differences between observed and simulated counts (L2-norm), or perhaps a sum of absolute differences (L1-norm). Another option is a distance metric based on a chi-squared goodness-of-fit criterion (see McKinley et al., 2009). This is very similar to an L2-norm, but with the contribution at each bin scaled by the observed data (the number of removals in each bin).…”
Section: ABC
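
The three candidate distances mentioned above are easy to state in code. A small sketch for binned removal counts follows; the unit weight given to empty bins in the chi-squared variant is an assumption of this sketch, not necessarily the convention of McKinley et al. (2009).

```python
import numpy as np

def binned_distances(obs, sim):
    """Candidate ABC distances between observed and simulated removal
    counts per time bin, as described in the statement above."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    l1 = np.sum(np.abs(obs - sim))        # sum of absolute differences
    l2 = np.sum((obs - sim) ** 2)         # sum of squared differences
    # chi-squared style: each bin's squared difference scaled by the
    # observed count; empty bins get unit weight (one possible convention)
    w = np.where(obs > 0, obs, 1.0)
    chi2 = np.sum((obs - sim) ** 2 / w)
    return l1, l2, chi2

print(binned_distances([0, 3, 8, 12, 7, 2], [1, 2, 9, 10, 9, 1]))
```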
“…Joyce and Marjoram (2008) have proposed an inclusion-exclusion scheme, assuming we have at our disposal a large set of univariate summary statistics. This scheme is related to the simulation experiments performed by McKinley et al. (2009). A review discussing dimension-reduction methods in ABC has been written by Blum et al. (2012).…”
Section: Summary Statistics

“…Moreover, this problem is compounded whenever insight requires sweeping large regions of plausible parameter space, or when the results of simulation are required for parameter inference, e.g. when using an Approximate Bayesian Computation (ABC) method [25,26,27]. The requirement to perform very many independent realisations puts great emphasis on the development of highly efficient algorithms for stochastic simulation, even at the cost of mild inexactness (an example of such an approach is [28]).…”
Section: Introduction
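
Reference [28] is not identified here, but tau-leaping is one well-known example of trading mild inexactness for simulation speed. A minimal sketch for an SIR model, assuming a fixed step size tau:

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_tau_leap(beta, gamma, n=100, i0=1, tau=0.1, t_max=60.0):
    """Tau-leaping SIR simulation: event counts over each step of length
    tau are drawn as Poisson variates with the rates frozen at the start
    of the step. Faster than exact Gillespie, but mildly inexact."""
    s, i, r, t = n - i0, i0, 0, 0.0
    while i > 0 and t < t_max:
        new_inf = min(rng.poisson(beta * s * i / n * tau), s)  # clip at s
        new_rem = min(rng.poisson(gamma * i * tau), i + new_inf)
        s -= new_inf
        i += new_inf - new_rem
        r += new_rem
        t += tau
    return s, i, r

print(sir_tau_leap(1.5, 0.5))   # final (S, I, R) counts
```

Larger tau gives faster but cruder runs; the exact dynamics are recovered in the limit tau → 0.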