Ordinary baryonic particles (such as protons and neutrons) account for only one-sixth of the total matter in the Universe. The remainder is a mysterious "dark matter" component, which does not interact via electromagnetism and thus neither emits nor reflects light. As dark matter cannot be seen directly using traditional observations, very little is currently known about its properties. It does interact via gravity, and is most effectively probed through gravitational lensing: the deflection of light from distant galaxies by the gravitational attraction of foreground mass concentrations. This is a purely geometrical effect that is free of astrophysical assumptions and sensitive to all matter, whether baryonic or dark. Here we show high-fidelity maps of the large-scale distribution of dark matter, resolved in both angle and depth. We find a loose network of filaments, growing over time, which intersect in massive structures at the locations of clusters of galaxies. Our results are consistent with predictions of gravitationally induced structure formation, in which the initial, smooth distribution of dark matter collapses first into filaments and then into clusters, forming a gravitational scaffold into which gas can accumulate and stars can be built.
Comment: 10 pages, 5 figures, plus 6 pages, 7 figures of supplementary information. An officially typeset version is available from the Nature website at http://www.nature.com/nature/journal/v445/n7125/abs/nature05497.html and the front cover of Nature featuring the mass map is available in PDF format from http://www.astro.caltech.edu/~rjm/publicity/Nature180107_cover.pd
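For background (standard weak-lensing formalism, not a result specific to this paper), the lensing convergence inferred from galaxy shapes is a weighted projection of the total matter overdensity along the line of sight; for sources at comoving distance χ_s in a flat universe,

\[
\kappa(\boldsymbol{\theta}) \;=\; \frac{3 H_0^2 \Omega_{\rm m}}{2 c^2}
\int_0^{\chi_{\rm s}} \mathrm{d}\chi \;
\frac{\chi\,(\chi_{\rm s} - \chi)}{\chi_{\rm s}} \;
\frac{\delta\!\left(\chi\boldsymbol{\theta}, \chi\right)}{a(\chi)} ,
\]

where δ is the matter density contrast and a the scale factor. Because δ includes both baryonic and dark matter, lensing maps trace the total mass distribution, which is why convergence maps resolved in both angle and depth can be turned into the three-dimensional dark-matter maps described above.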
The first half of this paper explores the origin of systematic biases in the measurement of weak gravitational lensing. Compared to previous work, we expand the investigation of PSF instability and fold in for the first time the effects of non-idealities in electronic imaging detectors and imperfect galaxy shape measurement algorithms. Together, these now explain the additive A(ℓ) and multiplicative M(ℓ) systematics typically reported in current lensing measurements. We find that overall performance is driven by the product of a telescope/camera's absolute performance and our knowledge of that performance. The second half of this paper propagates any residual shear measurement biases through to their effect on cosmological parameter constraints. Fully exploiting the statistical power of Stage IV weak lensing surveys will require additive biases A ≲ 1.8 × 10⁻¹² and multiplicative biases M ≲ 4.0 × 10⁻³. These can be allocated between individual budgets in hardware, calibration data, and software, using results from the first half of the paper. If instrumentation is stable and well calibrated, we find that extant shear measurement software from GREAT10 already meets requirements on galaxies detected at S/N = 40. Averaged over a population of galaxies with a realistic distribution of sizes, it also meets requirements for a 2D cosmic shear analysis from space. If used on fainter galaxies or for 3D cosmic shear tomography, existing algorithms would need calibration on simulations to avoid introducing bias at a level similar to the statistical error. Requirements on hardware and calibration data are discussed in more detail in a companion paper. Our analysis is intentionally general, but is specifically being used to drive the hardware and ground-segment performance budget for the design of the European Space Agency's recently selected Euclid mission.
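For reference (our notation, not a quotation from the paper), shear measurement biases are commonly parametrized at the catalogue level and propagated to the shear power spectrum as

\[
\hat{\gamma} \;=\; (1 + m)\,\gamma^{\rm true} + c ,
\qquad
\hat{C}_\ell \;\simeq\; \bigl[1 + M(\ell)\bigr]\, C_\ell + A(\ell) ,
\]

where, for scale-independent biases, M(ℓ) ≈ 2m and A(ℓ) is set by the angular power spectrum of the additive term c; this is the sense in which power-spectrum-level requirements on A and M are usually quoted.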
We have performed an N-body simulation of 70 billion dark-matter particles in a 2 h⁻¹ Gpc periodic box, using the concordance cosmological model favored by the latest WMAP3 results. We have computed a full-sky convergence map with a resolution of Δθ ≃ 0.74 arcmin², spanning 4 orders of magnitude in angular dynamical range. Using various high-order statistics on a realistic cut sky, we have characterized the transition from the linear to the nonlinear regime at ℓ ≈ 1000 and shown that realistic galactic masking affects high-order moments only below ℓ < 200. Each domain (Gaussian and non-Gaussian) spans two decades in angular scale. This map is therefore an ideal tool for testing map-making algorithms on the sphere. As a first step in addressing the full map-reconstruction problem, we have benchmarked in this paper two denoising methods: (1) Wiener filtering applied to the spherical-harmonic decomposition of the map, and (2) a new method, called MRLens, based on a modification of the Maximum Entropy Method on a wavelet decomposition. While the former is optimal on large spatial scales, where the signal is Gaussian, MRLens outperforms the Wiener method on small spatial scales, where the signal is highly non-Gaussian. The simulated full-sky convergence map is freely available to the community to help the development of new map-making algorithms dedicated to the next generation of weak-lensing surveys.
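As an illustration of the first denoising method (Wiener filtering of the spherical-harmonic decomposition), the following Python sketch uses the healpy library on a toy full-sky convergence map; the resolution, signal spectrum, and noise level are placeholder assumptions, not values from the simulation described above.

```python
# Minimal sketch (not the paper's pipeline): harmonic-space Wiener filtering
# of a noisy full-sky convergence map, assuming healpy is installed.
import numpy as np
import healpy as hp

nside, lmax = 512, 1000                    # illustrative map resolution and band limit
ell = np.arange(lmax + 1)

# Placeholder convergence power spectrum and white shape-noise spectrum.
cl_kappa = 1e-9 * (ell + 1.0) ** -1.5      # toy signal spectrum, NOT a fit to the simulation
cl_kappa[:2] = 0.0                         # remove monopole and dipole
n_ell = np.full(lmax + 1, 1e-10)           # toy white-noise spectrum

# Simulate a noisy observation: Gaussian signal plus Gaussian noise.
kappa_true = hp.synfast(cl_kappa, nside, lmax=lmax)
noise = hp.synfast(n_ell, nside, lmax=lmax)
kappa_obs = kappa_true + noise

# Wiener filter in spherical-harmonic space: multiply each a_lm by
# W_ell = C_ell / (C_ell + N_ell), the minimum-variance weight for Gaussian fields.
alm = hp.map2alm(kappa_obs, lmax=lmax)
wiener = np.where(cl_kappa + n_ell > 0, cl_kappa / (cl_kappa + n_ell), 0.0)
kappa_wiener = hp.alm2map(hp.almxfl(alm, wiener), nside, lmax=lmax)
```

The weight W_ℓ = C_ℓ / (C_ℓ + N_ℓ) is optimal only for a Gaussian signal, which is consistent with the finding above that a wavelet-based method such as MRLens can do better on the small, highly non-Gaussian scales.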