2023
DOI: 10.1214/23-ss144

Nested sampling methods


Cited by 40 publications (21 citation statements)
References 74 publications
“…The algorithm returns a set of nested samples, with corresponding prior volumes and likelihoods, and an evidence estimate with a corresponding error. The stopping criterion is typically related to the fractional change in the evidence between iterations [7].…”
Section: Nested Sampling (mentioning)
confidence: 99%
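
The loop this excerpt summarizes can be made concrete with a short sketch. Everything below is illustrative rather than taken from the review or the citing paper: the Gaussian toy likelihood, the uniform box prior, the live-point count, and the tolerance are assumptions, and the replacement step uses naive rejection sampling, which is only workable for toy problems.

```python
import numpy as np

# Toy log-likelihood standing in for an expensive model evaluation (assumed).
def log_likelihood(theta):
    return -0.5 * np.sum(theta**2, axis=-1)

rng = np.random.default_rng(0)
n_dim, n_live, tol = 2, 100, 1e-2      # illustrative settings
lo, hi = -5.0, 5.0                     # uniform box prior (assumed)

live = rng.uniform(lo, hi, size=(n_live, n_dim))
live_logl = log_likelihood(live)

log_z = -np.inf      # running log-evidence estimate
log_x = 0.0          # log prior volume above the current threshold, X_0 = 1
i = 0
while True:
    worst = np.argmin(live_logl)       # lowest-likelihood live point
    logl_star = live_logl[worst]       # becomes the new threshold L*

    # Prior volume shrinks roughly geometrically: X_i ~ exp(-i / n_live).
    log_x_next = -(i + 1) / n_live
    log_w = np.log(np.exp(log_x) - np.exp(log_x_next))   # quadrature weight
    log_z = np.logaddexp(log_z, logl_star + log_w)        # accumulate evidence

    # Replace the discarded point with a prior draw satisfying L > L*.
    # Plain rejection sampling suffices here; real implementations use the
    # constrained-MCMC or region-based strategies discussed below.
    while True:
        candidate = rng.uniform(lo, hi, size=n_dim)
        cand_logl = log_likelihood(candidate)
        if cand_logl > logl_star:
            break
    live[worst], live_logl[worst] = candidate, cand_logl

    # Stopping criterion: the live points' largest possible remaining
    # contribution is a small fraction of the evidence accumulated so far.
    if live_logl.max() + log_x_next - log_z < np.log(tol):
        break
    log_x, i = log_x_next, i + 1

print(f"estimated log Z = {log_z:.3f} after {i + 1} iterations")
```

The deterministic assignment X_i = exp(-i / n_live) is the usual approximation to the stochastic volume shrinkage; propagating its scatter is what yields the evidence error the excerpt mentions.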
“…In the original paper [2], Skilling proposes using MCMC over the prior and accepting only those points for which L(θ) > L* until the correlation with the starting point (one of the existing samples) has been lost. This method requires a random walk that can adapt to the continuously shrinking likelihood-constrained prior and a method for determining the number of steps to take [7]. Further modifications are often needed to handle multi-modality and complex correlations between parameters, for example, as implemented in Veitch et al [3].…”
Section: Nested Sampling (mentioning)
confidence: 99%
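
A bare-bones version of that constrained random walk is sketched below, assuming a uniform box prior so that a Metropolis move over the prior reduces to a bounds check plus the hard likelihood threshold. The function name, step size, and step count are placeholders; in practice both tuning parameters must adapt as the constrained prior shrinks, which is exactly the difficulty the excerpt points to.

```python
import numpy as np

def constrained_walk(theta0, logl_star, log_likelihood, lo, hi,
                     n_steps=25, step=0.1, rng=None):
    """Random walk over a uniform box prior, accepting only moves with
    log L(theta) > log L* (hypothetical helper, not a library API)."""
    rng = rng or np.random.default_rng()
    theta = np.array(theta0, dtype=float)
    for _ in range(n_steps):
        proposal = theta + step * rng.standard_normal(theta.shape)
        inside = np.all((proposal >= lo) & (proposal <= hi))
        if inside and log_likelihood(proposal) > logl_star:
            theta = proposal      # accepted: still inside the constrained prior
        # otherwise reject and stay at the current position
    return theta
```

Starting the walk from a randomly chosen surviving live point and substituting it for the rejection step in the earlier sketch gives the Skilling-style constrained-MCMC variant this excerpt describes.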
“…We employ the Bayesian inference method of nested sampling (Skilling 2004; for reviews see Ashton et al 2022; Buchner 2023). Conceptually, nested sampling algorithms converge on the best parameter estimates by iteratively removing regions of the prior volume with lower likelihood.…”
Section: Nested Sampling (mentioning)
confidence: 99%
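
The picture of progressively discarding low-likelihood prior volume corresponds to the standard nested sampling identities, written here for context rather than quoted from the citing paper:

```latex
Z = \int L(\theta)\,\pi(\theta)\,\mathrm{d}\theta = \int_0^1 L(X)\,\mathrm{d}X ,
\qquad
X_i \approx e^{-i/N} ,
\qquad
Z \approx \sum_i L_i \,(X_{i-1} - X_i) ,
```

where X is the prior mass enclosed by a likelihood contour, N is the number of live points, and L_i is the likelihood of the point discarded at iteration i.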
“…In general, Monte Carlo methods like nested sampling and MCMC are computationally expensive because they require many likelihood evaluations to sample the converged posterior distributions. We choose nested sampling instead of MCMC because the former is designed to better constrain complex parameter spaces with banana-shaped curved degeneracies and/or multimodal distributions (Buchner 2023). Another likelihood-based method that has been applied to parameter estimation of the global 21 cm signal is Fisher matrix analysis (Liu et al 2013; Muñoz et al 2020; Hibbard et al 2022; Mason et al 2023a), which assumes multivariate Gaussian posterior distributions and requires only O(N) likelihood evaluations for N parameters being sampled.…”
Section: Nested Sampling (mentioning)
confidence: 99%
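
For contrast with the sampling-based methods, a generic Gaussian-noise Fisher forecast can be sketched as follows. The toy absorption-trough model, frequency grid, and noise level are invented for illustration and are not taken from the cited 21 cm analyses; the point is that the Jacobian needs only N + 1 model evaluations (one baseline plus one per parameter), which is the O(N) scaling quoted above.

```python
import numpy as np

def fisher_matrix(model, theta, noise_var, eps=1e-4):
    """Gaussian-noise Fisher matrix F = J^T C^-1 J via one-sided finite
    differences of the model: N + 1 model evaluations for N parameters."""
    theta = np.asarray(theta, dtype=float)
    m0 = model(theta)
    jac = np.empty((m0.size, theta.size))
    for i in range(theta.size):
        shifted = theta.copy()
        shifted[i] += eps
        jac[:, i] = (model(shifted) - m0) / eps
    return jac.T @ (jac / noise_var[:, None])

# Hypothetical toy signal: a Gaussian absorption trough in brightness
# temperature versus frequency (amplitude [K], centre [MHz], width [MHz]).
nu = np.linspace(50.0, 100.0, 256)
toy_signal = lambda p: -p[0] * np.exp(-0.5 * ((nu - p[1]) / p[2]) ** 2)

F = fisher_matrix(toy_signal, theta=[0.5, 78.0, 10.0],
                  noise_var=np.full(nu.size, 1e-4))
cov = np.linalg.inv(F)       # multivariate Gaussian posterior approximation
print(np.sqrt(np.diag(cov))) # forecast 1-sigma parameter uncertainties
```

The inverse Fisher matrix is the posterior covariance only under the multivariate Gaussian assumption the excerpt notes, which is precisely what breaks down for the curved or multimodal posteriors that motivate nested sampling.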