2022
DOI: 10.1093/mnras/stac3326
matryoshka II: accelerating effective field theory analyses of the galaxy power spectrum

Abstract: In this paper we present an extension to the matryoshka suite of neural-network-based emulators. The new editions have been developed to accelerate EFTofLSS analyses of galaxy power spectrum multipoles in redshift space. They are collectively referred to as the EFTEMU. We test the EFTEMU at the power spectrum level and achieve a prediction accuracy of better than 1% with BOSS-like bias parameters and counterterms on scales 0.001 h Mpc−1 ≤ k ≤ 0.19 h Mpc−1. We also run a series of mock full shape analyses to te…

Cited by 8 publications (3 citation statements)
References 27 publications
“…This is ideal for sampling posterior probabilities in Bayesian parameter estimation. Our method can be straightforwardly generalised to the multipoles of the spectra, and could also be combined with emulators that make predictions for the true clustering signal (including galaxy biasing) based on perturbation theory (e.g., Donald-McCann et al 2023;DeRose et al 2022;Eggemeier et al 2022). Additional corrections due to binning the theory predictions in exactly the same way as done for the measurements (see e.g., Sect.…”
Section: Discussion
confidence: 99%
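The binning correction mentioned in the statement above can be sketched as averaging the theory prediction over the same k-bins used for the measurement. The grid, bin edges, and toy power spectrum below are illustrative stand-ins, not the cited works' actual setup.

```python
import numpy as np

k_fine = np.linspace(0.001, 0.19, 2000)       # fine theory grid [h/Mpc]
p_theory = 1e4 * (k_fine / 0.05) ** -1.5      # toy power spectrum, illustrative only

bin_edges = np.arange(0.0, 0.2, 0.01)         # assumed measurement k-bins
idx = np.digitize(k_fine, bin_edges) - 1      # bin index of each fine-grid point
p_binned = np.array([p_theory[idx == i].mean()
                     for i in range(len(bin_edges) - 1)])
# p_binned[i] is the bin-averaged theory prediction, directly comparable to
# the measured power in bin i (a plain mean here; weighted averages over the
# bin's mode distribution are often used in practice).
```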
“…X_lie = X and y_lie = y
[11] repeat M times
[12]     find x_add = argmax[a(x)], starting from n_r,acq starting locations
[13]     X_lie append x_add and X_new append x_add
[14]     y_lie append µ(x_add)                    ▷ Kriging believer
[15]     GP_fit(X_lie, y_lie)
[16] end
[17] y_true = log L(X_new) + log π(X_new)         ▷ parallelizable
[18] X append X_new
[19] y append y_true
[20] if is_converged (e.g. equation (4.4)) then break
[21] end
[22] Sample µ(x) with MC sampler
[23] return MC sample
[24] Function GP_fit(X, y)
[25]     Compute K⁻¹ = k(X, X | θ_MAP)⁻¹          ▷ matrix inversion
[26]     µ(x) = µ_GP+SVM(x)                       ▷ equations (2.5) and (3.4)
[27]     σ(x) = Σ_GP+SVM(x)                       ▷ equations (2.6) and (3.5)…”
Section: JCAP10(2023)021
confidence: 99%
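The quoted listing is a Kriging-believer batch acquisition loop: the GP is temporarily conditioned on its own mean prediction ("a lie") at each newly selected point, so a whole batch can be chosen before the expensive true log-posterior is evaluated in parallel. The sketch below illustrates that loop with a minimal squared-exponential GP; the toy target, UCB acquisition, kernel width, and batch sizes are all assumptions, not the cited paper's implementation.

```python
import numpy as np

def kernel(A, B, ell=0.5):
    """Squared-exponential kernel matrix between point sets A and B."""
    d2 = (A[:, None, :] - B[None, :, :]) ** 2
    return np.exp(-0.5 * d2.sum(-1) / ell**2)

def gp_fit(X, y, jitter=1e-6):
    """Condition a zero-mean GP on (X, y); return posterior mean/std functions."""
    K_inv = np.linalg.inv(kernel(X, X) + jitter * np.eye(len(X)))
    def mu(x):
        return kernel(x, X) @ K_inv @ y
    def sigma(x):
        k_star = kernel(x, X)
        var = np.clip(1.0 - np.einsum("ij,jk,ik->i", k_star, K_inv, k_star), 0, None)
        return np.sqrt(var)
    return mu, sigma

def log_posterior(x):
    """Toy target: standard-normal log-density (stand-in for log L + log pi)."""
    return -0.5 * (x**2).sum(-1)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (8, 1))               # initial design
y = log_posterior(X)

for _ in range(5):                           # outer acquisition rounds
    X_lie, y_lie = X.copy(), y.copy()        # the believer's "lies"
    X_new = []
    for _ in range(3):                       # M points per batch
        mu, sigma = gp_fit(X_lie, y_lie)
        cand = rng.uniform(-3, 3, (256, 1))  # random acquisition restarts
        a = mu(cand) + 2.0 * sigma(cand)     # UCB-style a(x), an assumption
        x_add = cand[np.argmax(a)][None, :]
        X_new.append(x_add)
        X_lie = np.vstack([X_lie, x_add])
        y_lie = np.append(y_lie, mu(x_add))  # believe the GP mean at x_add
    X_new = np.vstack(X_new)
    y_true = log_posterior(X_new)            # parallelizable true evaluations
    X = np.vstack([X, X_new])
    y = np.append(y, y_true)
```

The convergence check and the SVM-augmented mean/variance of the quoted algorithm are omitted here for brevity.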
“…On an average single-core CPU, the model can produce ∼5000 predictions in a second. The prediction speed of emulators is key to enabling fast cosmological inference with Monte Carlo techniques, which require 10^5–10^6 evaluations [66–68].…”
Section: Emulator
confidence: 99%
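The throughput argument above can be checked with a simple timing loop. The two-layer matrix-multiply "emulator" below is a toy stand-in (assumed shapes, not the actual EFTEMU architecture), and the measured rate is machine-dependent; the point is only that total inference cost scales as n_evals / rate when the emulator dominates.

```python
import time
import numpy as np

rng = np.random.default_rng(1)
W1, W2 = rng.normal(size=(8, 128)), rng.normal(size=(128, 40))

def emulate(theta):
    """Toy two-layer prediction: 8 input parameters -> 40 output bins."""
    return np.tanh(theta @ W1) @ W2

n_calls = 5000
t0 = time.perf_counter()
for _ in range(n_calls):
    emulate(rng.normal(size=8))
elapsed = time.perf_counter() - t0
rate = n_calls / elapsed  # predictions per second on this machine
# A sampler needing 1e5-1e6 likelihood evaluations would take roughly
# n_evals / rate seconds if the emulator call dominates each evaluation.
```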