The range of possibilities for future climate evolution needs to be taken into account when planning climate change mitigation and adaptation strategies. This requires ensembles of multi-decadal simulations to assess both chaotic climate variability and model response uncertainty. Statistical estimates of model response uncertainty, based on observations of recent climate change, admit climate sensitivities (defined as the equilibrium response of global mean temperature to doubling levels of atmospheric carbon dioxide) substantially greater than 5 K. But such strong responses are not used in ranges for future climate change because they have not been seen in general circulation models. Here we present results from the 'climateprediction.net' experiment, the first multi-thousand-member grand ensemble of simulations using a general circulation model and thereby explicitly resolving regional details. We find model versions as realistic as other state-of-the-art climate models but with climate sensitivities ranging from less than 2 K to more than 11 K. Models with such extreme sensitivities are critical for the study of the full range of possible responses of the climate system to rising greenhouse gas levels, and for assessing the risks associated with specific targets for stabilizing these levels.
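To make the sampled quantity concrete, the short Python sketch below computes the equilibrium climate sensitivity of each ensemble member as the difference between the global-mean temperature of a doubled-CO2 run and its control run, and reports the spread across members. The numbers and the helper function are entirely illustrative; this is a minimal sketch of the bookkeeping, not the experiment's analysis pipeline.

    import numpy as np

    def climate_sensitivity(t_control, t_doubled_co2):
        """Equilibrium warming (K) for a doubling of CO2, per ensemble member."""
        return np.asarray(t_doubled_co2) - np.asarray(t_control)

    # Illustrative global-mean equilibrium temperatures (K); not results
    # from the climateprediction.net experiment.
    t_control = np.array([287.1, 286.9, 287.3, 287.0])
    t_doubled = np.array([289.2, 292.4, 298.5, 290.1])

    sens = climate_sensitivity(t_control, t_doubled)
    print(f"sensitivity range: {sens.min():.1f} K to {sens.max():.1f} K")

In a perturbed-physics ensemble the interesting output is precisely this spread across model versions, rather than the value from any single run.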
Interest in attributing the risk of damaging weather-related events to anthropogenic climate change is increasing. Yet climate models used to study the attribution problem typically do not resolve the weather systems associated with damaging events such as the UK floods of October and November 2000. Occurring during the wettest autumn in England and Wales since records began in 1766, these floods damaged nearly 10,000 properties across that region, disrupted services severely, and caused insured losses estimated at £1.3 billion (refs 5, 6). Although the flooding was deemed a 'wake-up call' to the impacts of climate change at the time, such claims are typically supported only by general thermodynamic arguments that suggest increased extreme precipitation under global warming, but fail to account fully for the complex hydrometeorology associated with flooding. Here we present a multi-step, physically based 'probabilistic event attribution' framework showing that it is very likely that global anthropogenic greenhouse gas emissions substantially increased the risk of flood occurrence in England and Wales in autumn 2000. Using publicly volunteered distributed computing, we generate several thousand seasonal-forecast-resolution climate model simulations of autumn 2000 weather, both under realistic conditions and under conditions as they might have been had these greenhouse gas emissions and the resulting large-scale warming never occurred. Results are fed into a precipitation-runoff model that is used to simulate severe daily river runoff events in England and Wales (proxy indicators of flood events). The precise magnitude of the anthropogenic contribution remains uncertain, but in nine out of ten cases our model results indicate that twentieth-century anthropogenic greenhouse gas emissions increased the risk of floods occurring in England and Wales in autumn 2000 by more than 20%, and in two out of three cases by more than 90%.
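To make the form of the risk statement concrete, the Python sketch below estimates the occurrence probability of a threshold-exceeding runoff event in two ensembles (one with anthropogenic forcing, one counterfactual) and bootstraps the sampling uncertainty, so that statements such as "in nine out of ten cases the risk increase exceeds 20%" can be read off the resulting distribution. The runoff values, threshold and ensemble sizes are synthetic placeholders, not the study's data or its statistical method.

    import numpy as np

    rng = np.random.default_rng(0)

    def exceedance_prob(runoff, threshold):
        """Fraction of ensemble members whose peak runoff exceeds the threshold."""
        return np.mean(np.asarray(runoff) > threshold)

    def bootstrap_risk_increase(actual, counterfactual, threshold, n_boot=10_000):
        """Bootstrap distribution of the percentage increase in occurrence risk."""
        actual = np.asarray(actual)
        counterfactual = np.asarray(counterfactual)
        increases = np.empty(n_boot)
        for i in range(n_boot):
            a = rng.choice(actual, size=actual.size, replace=True)
            c = rng.choice(counterfactual, size=counterfactual.size, replace=True)
            p1 = exceedance_prob(a, threshold)
            p0 = exceedance_prob(c, threshold)
            increases[i] = 100.0 * (p1 / p0 - 1.0) if p0 > 0 else np.inf
        return increases

    # Synthetic peak-runoff values (arbitrary units) for a few thousand members.
    actual = rng.gamma(shape=2.0, scale=1.2, size=2000)
    counterfactual = rng.gamma(shape=2.0, scale=1.0, size=2000)

    inc = bootstrap_risk_increase(actual, counterfactual, threshold=5.0)
    print(f"fraction of cases with risk increase > 20%: {np.mean(inc > 20):.2f}")
    print(f"fraction of cases with risk increase > 90%: {np.mean(inc > 90):.2f}")

Reporting the full distribution of the estimated risk increase, rather than a single best estimate, is what allows the attribution statement to be phrased probabilistically.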
Demonstrating the effect that climate change is having on regional weather is a subject that occupies climate scientists, government policy makers and the media. After an extreme weather event occurs, the question is often posed, 'Was the event caused by anthropogenic climate change?' Recently, a new branch of climate science (known as attribution) has sought to quantify how much the risk of extreme events occurring has increased or decreased due to climate change. One method of attribution uses very large ensembles of climate models computed via volunteer distributed computing. A recent advancement is the ability to run both a global climate model and a higher-resolution regional climate model on a volunteer's home computer. Such a set-up allows the simulation of weather on a scale that is of most use to studies of the attribution of extreme events. This article introduces a global climate model that has been developed to simulate the climatology of all major land regions with reasonable accuracy. This then provides the boundary conditions to a regional climate model (which uses the same formulation but at higher resolution) to ensure that it can produce realistic climate and weather over any region of choice. The development process is documented and a comparison to previous coupled climate models and atmosphere-only climate models is made. The system (known as weather@home) by which the global model is coupled to a regional climate model and run on volunteers' home computers is then detailed. Finally, a validation of the whole system is performed, with a particular emphasis on how accurately the distributions of daily mean temperature and daily mean precipitation are modelled in a particular application over Europe. This builds confidence in the applicability of the weather@home system for event attribution studies.
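As an illustration of the kind of distributional validation described above, the Python sketch below compares matched quantiles and a two-sample Kolmogorov-Smirnov statistic between a large set of simulated daily values and a shorter observational record. The arrays are synthetic stand-ins for regional daily mean temperature, not weather@home output, and the comparison shown is a generic check rather than the paper's validation procedure.

    import numpy as np
    from scipy import stats

    def compare_distributions(model_daily, obs_daily,
                              quantiles=(0.05, 0.25, 0.5, 0.75, 0.95)):
        """Return matched quantiles and the two-sample KS statistic."""
        q_model = np.quantile(model_daily, quantiles)
        q_obs = np.quantile(obs_daily, quantiles)
        res = stats.ks_2samp(model_daily, obs_daily)
        return q_model, q_obs, res.statistic

    # Illustrative data: a large ensemble of simulated days vs. ~10 years of obs.
    rng = np.random.default_rng(1)
    model_daily = rng.normal(loc=10.5, scale=6.0, size=50_000)   # degrees C
    obs_daily = rng.normal(loc=10.0, scale=6.2, size=3_650)

    q_model, q_obs, ks = compare_distributions(model_daily, obs_daily)
    for q, m, o in zip((5, 25, 50, 75, 95), q_model, q_obs):
        print(f"{q:2d}th percentile: model {m:6.1f}  obs {o:6.1f}")
    print(f"KS statistic: {ks:.3f}")

Checking the tails of the daily distributions, rather than only the means, matters here because event attribution is concerned with the frequency of extremes.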
properties controlling the twenty-first century response to sustained anthropogenic greenhouse-gas forcing were not fully sampled, partially owing to a correlation between climate sensitivity and aerosol forcing (refs 7, 8), a tendency to overestimate ocean heat uptake (refs 11, 12) and compensation between short-wave and long-wave feedbacks (ref. 9). This complicates the interpretation of the ensemble spread as ... (Fig. S1).
In complex spatial models, as used to predict the climate response to greenhouse gas emissions, parameter variation within plausible bounds has major effects on model behavior of interest. Here, we present an unprecedentedly large ensemble of >57,000 climate model runs in which 10 parameters, initial conditions, hardware, and software used to run the model all have been varied. We relate information about the model runs to large-scale model behavior (equilibrium sensitivity of global mean temperature to a doubling of carbon dioxide). We demonstrate that effects of parameter, hardware, and software variation are detectable, complex, and interacting. However, we find most of the effects of parameter variation are caused by a small subset of parameters. Notably, the entrainment coefficient in clouds is associated with 30% of the variation seen in climate sensitivity, although both low and high values can give high climate sensitivity. We demonstrate that the effect of hardware and software is small relative to the effect of parameter variation and, over the wide range of systems tested, may be treated as equivalent to that caused by changes in initial conditions. We discuss the significance of these results in relation to the design and interpretation of climate modeling experiments and large-scale modeling more generally.

Keywords: classification and regression trees | climate change | distributed computing | general circulation models | sensitivity analysis

Simulation with complex mechanistic spatial models is central to science, from the level of molecules (1) via biological systems (2, 3) to global climate (4). The objective typically is a mechanistically based prediction of system-level behavior. However, both through incomplete knowledge of the system simulated and the approximations required to make such models tractable, the "true" or "optimal" values of some model parameters necessarily will be uncertain. A limiting factor in such simulations is the availability of computational resources. Thus, combinations of plausible parameter values rarely are tested, leaving the dependence of conclusions on the particular parameters chosen unknown.

Observations of the modeled system are vital for model verification and analysis, e.g., turning model output into probabilistic predictions of real-world system behavior (5-7). However, typically, few observations are available relative to the complexity of the model. There also may be little true replicate data available. For instance, there can be only one observational time series for global climate. Thus, if the same observations are used to fit parameter values, there is a severe risk of overfitting, gaining limited verisimilitude at the cost of the mechanistic insight and predictive ability for which the model originally was designed.

To avoid fitting problems, parameter estimates must be refined directly. In some biological systems, direct and simultaneous measurement of large numbers of system parameters (e.g., protein binding or catalytic constants) soon may be possible. I...
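As a rough illustration of the tree-based attribution of ensemble spread described in the abstract above, the Python sketch below fits a regression tree to synthetic data in which climate sensitivity depends non-monotonically on an "entrainment coefficient" and only weakly on other factors, then reports feature importances. The parameter names, ranges and response are placeholders invented for this sketch, not the study's ten perturbed parameters or its classification-and-regression-tree analysis.

    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(2)
    n_runs = 5000

    # Perturbed physics parameters plus hardware and initial-condition labels.
    X = pd.DataFrame({
        "entrainment_coefficient":    rng.uniform(0.6, 9.0, n_runs),
        "ice_fall_speed":             rng.uniform(0.5, 2.0, n_runs),
        "critical_relative_humidity": rng.uniform(0.6, 0.9, n_runs),
        "hardware_class":             rng.integers(0, 5, n_runs),
        "initial_condition":          rng.integers(0, 10, n_runs),
    })

    # Synthetic response: non-monotonic in entrainment (both low and high values
    # can give high sensitivity), weak dependence on the other factors, plus noise.
    sensitivity = (
        3.0
        + 2.0 * (X["entrainment_coefficient"] - 4.0) ** 2 / 10.0
        + 0.3 * X["ice_fall_speed"]
        + rng.normal(0.0, 0.2, n_runs)
    )

    tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, sensitivity)
    for name, importance in zip(X.columns, tree.feature_importances_):
        print(f"{name:28s} {importance:.2f}")

A tree-based summary is convenient here because it captures non-monotonic and interacting effects, which a simple linear sensitivity analysis would miss.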