2013
DOI: 10.1016/j.ascom.2013.06.003

CosmoHammer: Cosmological parameter estimation with the MCMC Hammer

Abstract: We study the benefits and limits of parallelised Markov chain Monte Carlo (MCMC) sampling in cosmology. MCMC methods are widely used for the estimation of cosmological parameters from a given set of observations and are typically based on the Metropolis-Hastings algorithm. Some of the required calculations can however be computationally intensive, meaning that a single long chain can take several hours or days to calculate. In practice, this can be limiting, since the MCMC process needs to be performed many ti…
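
For orientation, the sketch below illustrates the kind of serial Metropolis-Hastings update the abstract refers to; the toy Gaussian posterior, step size, and chain length are illustrative assumptions, not anything from the paper. Because each step depends on the previous one, a single chain with an expensive likelihood is hard to speed up, which is the bottleneck the parallelised approach targets.

```python
import numpy as np

def log_posterior(theta):
    """Toy log-posterior: an uncorrelated 2-D Gaussian (illustrative only)."""
    return -0.5 * np.sum(theta**2)

def metropolis_hastings(log_post, theta0, n_steps=10_000, step=0.5, seed=0):
    """Plain serial Metropolis-Hastings: each step depends on the previous one,
    which is why a single long chain with an expensive likelihood is slow to produce."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    logp = log_post(theta)
    chain = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        proposal = theta + step * rng.standard_normal(theta.size)
        logp_prop = log_post(proposal)
        if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept/reject
            theta, logp = proposal, logp_prop
        chain[i] = theta
    return chain

chain = metropolis_hastings(log_posterior, np.zeros(2))
print(chain.mean(axis=0), chain.std(axis=0))
```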

Cited by 96 publications (96 citation statements)
References 14 publications

“…Convergence of the samples was ensured by letting the random walks proceed for multiple integrated auto-correlation times τ int after removing all samples drawn during the initial burn-in period. This procedure was advocated by Akeret et al (2013) and Allison & Dunkley (2014), who provided a detailed discussion of convergence diagnostics with the auto-correlation times described above.…”
Section: Efficient Statistical Sampling
Citation type: mentioning, confidence: 99%
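
As a hedged illustration of the convergence check described in this quote, the sketch below estimates the integrated auto-correlation time with emcee 3 (the ensemble sampler CosmoHammer builds on) and discards a burn-in of a few tau_int; the toy Gaussian posterior, walker count, and chain length are assumptions made for the example.

```python
import numpy as np
import emcee

def log_prob(theta):
    # Toy 3-D Gaussian posterior (illustrative stand-in for a cosmological likelihood).
    return -0.5 * np.sum(theta**2)

ndim, nwalkers, nsteps = 3, 32, 5000
p0 = 1e-3 * np.random.randn(nwalkers, ndim)      # walkers start in a tight ball

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, nsteps, progress=False)

# Integrated auto-correlation time per parameter (raises if the chain is much
# shorter than ~50 tau_int, i.e. not yet long enough to trust the estimate).
tau = sampler.get_autocorr_time()
burn = int(5 * tau.max())                        # discard several tau_int as burn-in
thin = max(1, int(0.5 * tau.min()))
flat = sampler.get_chain(discard=burn, thin=thin, flat=True)

print("tau_int:", tau)
print("effective chain length:", nsteps / tau.max(), "samples kept:", flat.shape[0])
```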
“…The time evolution of this ensemble can easily be parallelised, which greatly reduces the required computing time (wall-clock time) on multi-core machines or large computing clusters (Akeret et al 2013). Foreman-Mackey et al (2013) provided an excellent discussion of the stretch-move technique and described a parallelised implementation in detail.…”
Section: Efficient Statistical Sampling
Citation type: mentioning, confidence: 99%
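
The sketch below shows one way the ensemble update can be parallelised in practice, using emcee's pool interface with a local process pool; CosmoHammer extends the same idea to MPI on clusters. The toy likelihood, walker count, and chain length are illustrative assumptions.

```python
import numpy as np
import emcee
from multiprocessing import Pool

def log_prob(theta):
    # Stand-in for an expensive cosmological likelihood evaluation.
    return -0.5 * np.sum(theta**2)

if __name__ == "__main__":
    ndim, nwalkers = 4, 64
    p0 = np.random.randn(nwalkers, ndim)

    # Each stretch-move half-step evaluates the likelihood for many walkers at once,
    # so those evaluations can be farmed out to a process pool (or, on a cluster,
    # to MPI ranks).
    with Pool() as pool:
        sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob, pool=pool)
        sampler.run_mcmc(p0, 2000, progress=False)

    print(sampler.get_chain(flat=True).shape)
```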
“…This choice enables us to fully characterise any potential model degeneracies between our model parameters, while also providing the individual probability distribution functions (PDFs) for each model parameter. In this work, we utilise the publicly available MCMC python code CosmoHammer (Akeret et al 2013) built upon EMCEE (Foreman-Mackey et al 2013) which is based on the affine invariant MCMC sampler (Goodman & Weare 2010).…”
Section: MCMC Sampling the QSO Template
Citation type: mentioning, confidence: 99%
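
To make the point about degeneracies and individual PDFs concrete, here is a minimal sketch of summarising a flattened chain: 1-D marginals via percentiles and pairwise degeneracies via the sample correlation matrix. The synthetic samples and parameter names are assumptions; in practice the array would come from a run like the ones sketched above, and a corner plot would typically show the full 2-D marginals.

```python
import numpy as np

# `flat_samples` stands in for a converged, flattened chain of shape (n_samples, n_params),
# e.g. sampler.get_chain(discard=burn, flat=True); the Gaussian mock and the parameter
# names below are purely illustrative assumptions.
rng = np.random.default_rng(0)
flat_samples = rng.multivariate_normal(
    mean=[0.3, 0.8], cov=[[0.010, 0.006], [0.006, 0.020]], size=20_000)

# 1-D marginal PDFs summarised as median and 16th/84th percentiles.
for i, name in enumerate(["param_1", "param_2"]):          # hypothetical names
    lo, med, hi = np.percentile(flat_samples[:, i], [16, 50, 84])
    print(f"{name} = {med:.3f} +{hi - med:.3f} / -{med - lo:.3f}")

# Pairwise degeneracies: the sample correlation matrix (a corner plot, e.g. with the
# `corner` package, would show the same information as 2-D marginals).
print("correlation matrix:\n", np.corrcoef(flat_samples, rowvar=False))
```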
“…The standard approach to this is the Metropolis-Hastings algorithm (Metropolis et al 1953; Hastings 1970). In the next section we will describe the application of a powerful new parallel sampling technique (Goodman & Weare 2010; Foreman-Mackey et al 2013; Akeret et al 2013) that we can apply to this problem. If we are only interested in the set of weights that maximize the posterior, we take the negative log of the posterior, discard all constant terms, and minimize…”
Section: The Linear Problem
Citation type: mentioning, confidence: 99%
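
As a hedged worked example of the step described in this quote: for a linear model with Gaussian noise and a flat prior, the negative log posterior with constants discarded is one half of the chi-square, which can be minimised directly. The data, design matrix, and noise level below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
A = np.vstack([np.ones_like(x), x]).T            # design matrix for y = w0 + w1 * x
w_true, sigma = np.array([1.0, 2.0]), 0.1
y = A @ w_true + sigma * rng.standard_normal(x.size)

def neg_log_posterior(w):
    # Gaussian likelihood with a flat prior: after discarding constant terms,
    # -log p(w | y) reduces to 0.5 * chi^2.
    residual = (y - A @ w) / sigma
    return 0.5 * np.dot(residual, residual)

result = minimize(neg_log_posterior, x0=np.zeros(2))
print("MAP weights:", result.x)                  # should be close to w_true
```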
“…Unfortunately, finding global extrema in a high dimensional space is a formidable problem. Our approach is to begin by sampling the posterior using the method introduced by Goodman & Weare (2010) and implemented by Foreman-Mackey et al (2013) and Akeret et al (2013), among others. This method differs from the traditional single Monte Carlo Markov chain random walk method for sampling the posterior by utilizing an ensemble of walkers, potentially thousands of them, that can be updated in parallel.…”
Section: The Linear Problem
Citation type: mentioning, confidence: 99%
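
As a self-contained follow-on to the previous sketch, the snippet below samples the same kind of linear-model posterior with an ensemble of walkers (here 256; the cited method scales to thousands) instead of a single random-walk chain. All concrete numbers are illustrative assumptions.

```python
import numpy as np
import emcee

# Toy version of the linear problem above, sampled with an ensemble of walkers
# rather than a single chain. Data, noise level, and walker count are illustrative.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
A = np.vstack([np.ones_like(x), x]).T                  # y = w0 + w1 * x
y = A @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(x.size)

def log_posterior(w, A=A, y=y, sigma=0.1):
    r = (y - A @ w) / sigma
    return -0.5 * np.dot(r, r)                         # flat prior, constants dropped

nwalkers, ndim = 256, 2
p0 = np.array([1.0, 2.0]) + 1e-3 * rng.standard_normal((nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior)
sampler.run_mcmc(p0, 2000)
print("posterior mean:", sampler.get_chain(discard=200, flat=True).mean(axis=0))
```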