2008
DOI: 10.1098/rsta.2008.0192

Creating synthetic universes in a computer

Abstract: Cosmologists regularly generate synthetic universes of galaxies using computer simulations. Such catalogues have an essential role to play in the analysis and exploitation of current and forthcoming galaxy surveys. I review the different ways in which synthetic or 'mock' catalogues are produced and discuss the different physical processes that the models attempt to follow, explaining why it is important to be as realistic as possible when trying to forge the Universe.

Cited by 13 publications (15 citation statements); references 37 publications (44 reference statements).

“…For a clustering analysis of an observed galaxy catalogue, the demands on the Monte Carlo approach are even greater as the N ‐body simulations need to be populated with galaxies, according to some prescription, with the goal of statistically reproducing the galaxy sample as faithfully as possible (see e.g. Baugh 2008 for a review of the techniques used to build synthetic galaxy catalogues).…”
Section: Review of Error Estimation Methods
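The "prescription" referred to in the excerpt above is often a halo occupation distribution (HOD), which gives the expected number of galaxies hosted by a halo of a given mass. Below is a minimal sketch of populating a halo catalogue this way, assuming a standard five-parameter HOD form; the parameter values and the toy halo masses are illustrative placeholders, not values from Baugh (2008).

```python
import numpy as np
from scipy.special import erf

def mean_occupation(M, logMmin=12.0, sigma_logM=0.2, logM0=11.5, logM1=13.3, alpha=1.0):
    """Mean central and satellite occupation for haloes of mass M (M_sun/h).
    Widely used five-parameter HOD form; parameter values are illustrative only."""
    n_cen = 0.5 * (1.0 + erf((np.log10(M) - logMmin) / sigma_logM))
    n_sat = n_cen * np.clip((M - 10**logM0) / 10**logM1, 0.0, None) ** alpha
    return n_cen, n_sat

def populate_haloes(halo_masses, seed=42):
    """Draw an integer galaxy count per halo: Bernoulli centrals, Poisson satellites."""
    rng = np.random.default_rng(seed)
    n_cen, n_sat = mean_occupation(halo_masses)
    centrals = (rng.random(halo_masses.size) < n_cen).astype(int)
    satellites = rng.poisson(n_sat)
    return centrals + satellites

# Toy halo masses standing in for the output of a halo finder run on an N-body box
masses = 10.0 ** np.random.default_rng(1).uniform(11.0, 15.0, size=100_000)
n_gal = populate_haloes(masses)
print("mean galaxies per halo:", n_gal.mean())
```

The split into centrals and satellites mirrors how such prescriptions are usually tuned: the parameters are adjusted until the populated simulation statistically reproduces the abundance and clustering of the observed galaxy sample.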
“…This approach will inevitably involve N ‐body simulations, in order to accurately model the underlying dark matter, and so becomes computationally expensive. Various techniques have been adopted in the literature to populate such simulations with galaxies (for a survey see Baugh 2008). The number of simulations required is large, running into several tens to get reasonable estimates of the variance and hundreds or even thousands to get accurate covariance matrices.…”
Section: Introduction
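To make the covariance requirement in the excerpt concrete, the sketch below turns an ensemble of mock clustering measurements into a covariance matrix and a bias-corrected inverse. The mock correlation functions here are simulated placeholders; in practice each one would be measured from a separately populated N-body realisation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder ensemble: xi_mocks[i, j] is the correlation function of mock i in bin j.
# In a real analysis each row would come from one independent mock catalogue; here the
# rows are smooth curves plus Gaussian noise, purely for illustration.
n_mocks, n_bins = 1000, 20
true_xi = 0.5 * np.exp(-np.arange(n_bins) / 5.0)
xi_mocks = true_xi + 0.05 * rng.standard_normal((n_mocks, n_bins))

mean_xi = xi_mocks.mean(axis=0)
delta = xi_mocks - mean_xi
cov = delta.T @ delta / (n_mocks - 1)   # unbiased sample covariance over the ensemble

# Inverting a noisy covariance estimate is biased unless n_mocks >> n_bins; the
# Hartlap et al. (2007) factor is the correction commonly applied in this situation.
hartlap = (n_mocks - n_bins - 2) / (n_mocks - 1)
inv_cov = hartlap * np.linalg.inv(cov)
print("condition number of the covariance:", np.linalg.cond(cov))
```

The need for the correction factor is one reason the quoted passage stresses that hundreds or thousands of mocks are required for accurate covariance matrices, while a few tens suffice for the variance alone.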
“…As a result, the RS method must be tested and calibrated. At low redshift this can be done with spectroscopic data sets, but at higher redshifts, where the spectroscopic data are sparse, one must turn to using synthetic 'mock' catalogues based upon the latest galaxy formation models (Baugh 2008).…”
Section: Introduction
“…The principal way to study errors on clustering measurements from galaxy surveys is through an accurate model of the experiment itself (Baugh 2008). For the case of relevance here (the spatial distribution of galaxies), this is optimally achieved in a three step process.…”
Section: Introduction
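The excerpt does not enumerate the three steps of its process, but one standard ingredient of "modelling the experiment itself" is imposing the survey's selection on the mock galaxies. The sketch below applies a toy apparent-magnitude (flux) limit; the magnitude limit, Hubble parameter and low-redshift distance approximation are assumptions for illustration, not the cited authors' pipeline.

```python
import numpy as np

def apply_flux_limit(redshift, abs_mag, mag_limit=19.8, h=0.7):
    """Keep mock galaxies brighter than an apparent-magnitude limit.
    A real pipeline would use the survey's measured selection function and a proper
    cosmological distance calculation; this is a crude low-redshift approximation."""
    c = 299792.458                       # speed of light, km/s
    d_L = c * redshift / (100.0 * h)     # luminosity distance in Mpc (low-z approximation)
    app_mag = abs_mag + 5.0 * np.log10(d_L * 1.0e6 / 10.0)  # distance modulus
    return app_mag < mag_limit

# Toy mock galaxies: uniform redshifts and Gaussian absolute magnitudes
rng = np.random.default_rng(3)
z = rng.uniform(0.01, 0.3, size=10_000)
M = rng.normal(-20.5, 1.0, size=10_000)
keep = apply_flux_limit(z, M)
print("fraction of mock galaxies passing the flux limit:", keep.mean())
```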