The rate at which pure initial states deteriorate into mixtures is computed for a harmonic oscillator interacting with an environment in thermal equilibrium. The decoherence process resulting from this interaction selects a set of states characterized by maximal stability (or minimal loss of predictive power), which can be quantified by the rate of increase of either the linear or the statistical entropy. In the weak coupling ...

It was shown for various idealized models (invented primarily to study the "reduction of the state vector" in the context of idealized measurements) that, when the environment (that is, the degrees of freedom which interact with the record-keeping "pointer" of a quantum apparatus) is taken into account, the vast majority of pure states become in effect inaccessible [7-9]. This is because continuous interaction with the environment destroys, on a very short decoherence time scale, the purity of nearly all initial superpositions. Thus, only observations which refer to a preferred set of stable states, or the associated set of observables, will exhibit one of the key attributes of "classical reality": the predictive power of the associated records. Such an interaction can be thought of as a continuous monitoring of the macroscopic quantum system by the environment. Neglecting the self-Hamiltonian of the system ...
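The linear entropy referred to above is conventionally defined as S_lin = 1 − Tr ρ²; it vanishes for pure states and grows as decoherence converts a superposition into a mixture. A minimal numerical illustration (the two-level example and the code are ours, not part of the original paper):

```python
import numpy as np

def linear_entropy(rho):
    """S_lin = 1 - Tr(rho^2); zero for pure states, positive for mixtures."""
    return 1.0 - np.trace(rho @ rho).real

# Pure superposition (|0> + |1>)/sqrt(2): a "cat-like" state.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# Fully decohered counterpart: environmental monitoring has destroyed
# the off-diagonal coherences, leaving a classical mixture.
rho_mixed = np.diag(np.diag(rho_pure))

print(round(linear_entropy(rho_pure), 6))   # 0.0  (pure state)
print(round(linear_entropy(rho_mixed), 6))  # 0.5  (maximally mixed qubit)
```

The growth of S_lin under the environmental interaction is precisely the "loss of predictive power" that singles out the preferred, stable states.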
Prepared by the LSST Science Collaborations, with contributions from the LSST Project.

Preface

Major advances in our understanding of the Universe over the history of astronomy have often arisen from dramatic improvements in our ability to observe the sky to greater depth, in previously unexplored wavebands, with higher precision, or with improved spatial, spectral, or temporal resolution. Aided by rapid progress in information technology, current sky surveys are again changing the way we view and study the Universe, and the next-generation instruments, and the surveys that will be made with them, will maintain this revolutionary progress. Substantial progress on the important scientific problems of the next decade (determining the nature of dark energy and dark matter, studying the evolution of galaxies and the structure of our own Milky Way, opening up the time domain to discover faint variable objects, and mapping both the inner and outer Solar System) requires wide-field, repeated, deep imaging of the sky in optical bands.

The wide-fast-deep science requirement leads to a single wide-field telescope and camera which can repeatedly survey the sky with deep, short exposures. The Large Synoptic Survey Telescope (LSST), a dedicated telescope with an effective aperture of 6.7 meters and a field of view of 9.6 deg², will make major contributions to all these scientific areas and more. It will carry out a survey of 20,000 deg² of the sky in six broad photometric bands, imaging each region of sky roughly 2000 times (1000 pairs of back-to-back 15-sec exposures) over a ten-year survey lifetime.

The LSST project will deliver fully calibrated survey data to the United States scientific community and the public with no proprietary period. Near-real-time alerts for transients will also be provided worldwide. A goal is worldwide participation in all data products.
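The visit arithmetic quoted above can be checked directly; the figures below are taken from the text, and the script is only illustrative bookkeeping, not an official LSST tool:

```python
# Per-region exposure budget implied by the survey description:
# 1000 back-to-back pairs of 15-second exposures over ten years.
pairs = 1000
exposures = pairs * 2            # individual exposures per sky region
exp_time_s = 15                  # seconds per exposure
total_s = exposures * exp_time_s

print(exposures)                 # 2000 exposures ("roughly 2000 times")
print(total_s)                   # 30000 s of integration per region
print(round(total_s / 3600, 1))  # ~8.3 hours accumulated per region
```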
The survey will enable comprehensive exploration of the Solar System beyond the Kuiper Belt, new understanding of the structure of our Galaxy and that of the Local Group, and vast opportunities in cosmology and galaxy evolution using data for billions of distant galaxies. Since many of these science programs will involve the use of the world's largest non-proprietary database, a key goal is maximizing the usability of the data. Experience with previous surveys shows that their most exciting scientific results were often unanticipated at the time the survey was designed; we fully expect this to be the case for the LSST as well.

The purpose of this Science Book is to examine and document in detail the science goals, opportunities, and capabilities that will be provided by the LSST. The book addresses key questions that will be confronted by the LSST survey, and it poses new questions to be addressed by future study. It contains previously available material (including a number of White Papers submitted to the ASTRO2010 Decadal Survey) as well as new results from a year-long campaign of study and evaluation. This book does not attempt to be complete; there are many ...
This document on the CMB-S4 Science Case, Reference Design, and Project Plan is the product of a global community of scientists who are united in support of advancing CMB-S4 to cross key thresholds in our understanding of the fundamental nature of space and time and the evolution of the Universe. CMB-S4 is planned to be a joint National Science Foundation (NSF) and Department of Energy (DOE) project, with the construction phase to be funded as an NSF Major Research Equipment and Facilities Construction (MREFC) project and a DOE High Energy Physics (HEP) Major Item of Equipment (MIE) project. At the time of this writing, an interim project office has been constituted and tasked with advancing the CMB-S4 project in the NSF MREFC Preliminary Design Phase and toward DOE Critical Decision CD-1. DOE CD-0 is expected imminently.

CMB-S4 has been in development for six years. Through the Snowmass Cosmic Frontier planning process, experimental groups in the cosmic microwave background (CMB) and broader cosmology communities came together to produce two influential CMB planning papers, endorsed by over 90 scientists, that outlined the science case as well as the CMB-S4 instrumental concept [1, 2]. It immediately became clear that an enormous increase in the scale of ground-based CMB experiments would be needed to achieve the exciting threshold-crossing scientific goals, necessitating a phase change in the ground-based CMB experimental program. To realize CMB-S4, a partnership of the university-based CMB groups, the broader cosmology community, and the national laboratories would be needed.

The community proposed CMB-S4 to the 2014 Particle Physics Project Prioritization Process (P5) as a single, community-wide experiment, jointly supported by DOE and NSF. Following P5's recommendation of CMB-S4 under all budget scenarios, the CMB community began in early 2015 to hold biannual workshops, open to CMB scientists from around the world, to develop and refine the concept.
Nine workshops have been held to date, typically with 150 to 200 participants. The workshops have focused on developing the unique and vital role of the future ground-based CMB program. This growing CMB-S4 community produced a detailed and influential CMB-S4 Science Book [3] and a CMB-S4 Technology Book [4]. Over 200 scientists contributed to these documents. These and numerous other reports, workshop and working-group wiki pages, email lists, and much more may be found at the website http://CMB-S4.org.

Soon after the CMB-S4 Science Book was completed in August 2016, DOE and NSF requested that the Astronomy and Astrophysics Advisory Committee (AAAC) convene a Concept Definition Taskforce (CDT) to conduct a CMB-S4 concept study. The resulting report was unanimously accepted in late 2017. One recommendation of the CDT report was that the community should organize itself into a formal collaboration. An Interim Collaboration Coordination Committee was elected by the community to coordinate this process. The resulting draft bylaws were refined at the Spring 2018 CMB-S4 ...
We present a precise estimate of the bulk virial scaling relation of halos formed via hierarchical clustering in an ensemble of simulated cold dark matter cosmologies. The result is insensitive to cosmological parameters, the presence of a trace, dissipationless gas component, and numerical resolution down to a limit of ∼1000 particles. The dark matter velocity dispersion scales with total mass as log σ_DM(M, z) = log(1082.9 ± 4.0 km s⁻¹) + (0.3361 ± 0.0026) log[h(z) M_200 / 10¹⁵ M_⊙], with h(z) the dimensionless Hubble parameter. At fixed mass, the velocity dispersion likelihood is nearly log-normal, with scatter σ_ln σ = 0.0426 ± 0.015, except for a tail to higher dispersions containing the 10% of the population that are merger transients. We combine this relation with the halo mass function in ΛCDM models, and show that a low normalization condition, S_8 = σ_8 (Ω_m/0.3)^0.35 = 0.69, favored by recent WMAP and SDSS analyses, requires that the galaxy and gas specific energies in rich clusters be 50% larger than that of the underlying dark matter. Such large energetic biases are in conflict with the current generation of direct simulations of cluster formation. A higher normalization, S_8 = 0.80, alleviates this tension and implies that the hot gas fraction within r_500 is (0.71 ± 0.09) h_70^(−3/2) Ω_b/Ω_m, a value consistent with recent Sunyaev-Zel'dovich observations.
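The quoted scaling relation is straightforward to evaluate; the sketch below uses the best-fit normalization and slope from the abstract and assumes h(z) = 1 purely for illustration:

```python
# Evaluate the dark-matter velocity-dispersion scaling quoted above.
# A and alpha are the best-fit values from the abstract; h(z) = 1 is
# an illustrative assumption, not part of the fit.
A = 1082.9      # km/s, normalization at h(z) * M_200 = 1e15 Msun
alpha = 0.3361  # logarithmic slope

def sigma_dm(m200_msun, hz=1.0):
    """Velocity dispersion in km/s for a halo of mass M_200 (in Msun)."""
    return A * (hz * m200_msun / 1e15) ** alpha

print(round(sigma_dm(1e15), 1))  # 1082.9 km/s at the pivot mass
print(round(sigma_dm(1e14), 1))  # ~500 km/s for a 1e14 Msun cluster
```

The ±0.0026 slope and 4.3% log-normal scatter quoted in the abstract set the systematic and intrinsic uncertainty on any dispersion predicted this way.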