An “offline” approach to data assimilation (DA) is used, in which static ensemble samples drawn from existing CMIP climate-model simulations serve as the prior estimate of climate variables. We use linear, univariate forward models (“proxy system models”, PSMs) that map climate variables to proxy measurements; each PSM is calibrated by fitting proxy data to 2 m air temperature from gridded instrumental temperature data, and is then used to predict proxy values from the prior estimate. Results for the LMR are compared against six gridded instrumental temperature data sets, and 25% of the proxy records are withheld from assimilation for independent verification. Results show broad agreement with previous reconstructions of Northern Hemisphere mean 2 m air temperature, including millennial-scale cooling, a multicentennial warm period around 1000 C.E., and a cold period coincident with the Little Ice Age (circa 1450–1800 C.E.). Verification against gridded instrumental data sets during 1880–2000 C.E. reveals the greatest skill in the tropics and the lowest skill over Northern Hemisphere land areas. Verification against the independent proxy records indicates substantial improvement relative to the model (prior) data without proxy assimilation. As an illustrative example, we present multivariate reconstructed fields for a single event, the 1808/1809 “mystery” volcanic eruption, which reveal global cooling that is strongly enhanced locally by the Pacific–North America wave pattern in the 500 hPa geopotential height field.
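The calibration-and-prediction workflow for a linear, univariate PSM described above can be sketched as follows. This is a minimal illustration with synthetic numbers, not the LMR code; the regression coefficients and noise levels are invented for the example.

```python
import numpy as np

# Minimal sketch of a linear, univariate proxy system model (PSM):
# 1) calibrate proxy values against instrumental 2 m air temperature,
# 2) use the fitted model to predict proxy values from a model prior.
rng = np.random.default_rng(0)

# Synthetic calibration data (stand-ins for instrumental temperature
# anomalies and a co-located proxy record over 120 years)
temp_instrumental = rng.normal(0.0, 1.0, 120)
proxy_observed = 0.8 * temp_instrumental + rng.normal(0.0, 0.3, 120)

# Ordinary least-squares fit: proxy = a * T + b
a, b = np.polyfit(temp_instrumental, proxy_observed, 1)

# Predict proxy values from a prior ensemble drawn from model output;
# these "proxy estimates" are what get compared to observed proxies in DA
temp_prior_ensemble = rng.normal(0.0, 1.0, 100)
proxy_estimates = a * temp_prior_ensemble + b
print(proxy_estimates.shape)  # one proxy estimate per ensemble member
```

The same fit also supplies a regression-residual variance, which in practice serves as the proxy error variance used by the assimilation update.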
Abstract. The Last Millennium Reanalysis (LMR) utilizes an ensemble methodology to assimilate paleoclimate data for the production of annually resolved climate field reconstructions of the Common Era. Two key elements are the focus of this work: the set of assimilated proxy records and the forward models that map climate variables to proxy measurements. Results based on an updated proxy database and seasonal regression-based forward models are compared to the LMR prototype, which was based on a smaller set of proxy records and simpler proxy models formulated as univariate linear regressions against annual temperature. Validation against various instrumental-era gridded analyses shows that the new reconstructions of surface air temperature and 500 hPa geopotential height are significantly improved (from 10 % to more than 100 %), while improvements in reconstruction of the Palmer Drought Severity Index are more modest. Additional experiments designed to isolate the sources of improvement reveal the importance of the updated proxy records, including coral records for improving tropical reconstructions, and tree-ring density records for temperature reconstructions, particularly in high northern latitudes. Proxy forward models that account for seasonal responses, and dependence on both temperature and moisture for tree-ring width, also contribute to improvements in reconstructed thermodynamic and hydroclimate variables in midlatitudes. The variability of temperature at multidecadal to centennial scales is also shown to be sensitive to the set of assimilated proxies, especially to the inclusion of primarily moisture-sensitive tree-ring-width records.
Variational methods are widely used to solve geophysical inverse problems. Although gradient-based minimization algorithms are available for high-dimensional problems (dimension > 10^6), they do not provide an estimate of the errors in the optimal solution. In this study, we assess the performance of several numerical methods for approximating the analysis-error covariance matrix, assuming reasonably linear models. The evaluation is performed for a CO2 flux estimation problem using synthetic remote-sensing observations of CO2 columns. A low-dimensional experiment is considered first, in order to compare the analysis-error approximations to a full-rank finite-difference inverse Hessian estimate, followed by a realistic high-dimensional application. Two stochastic approaches, a Monte Carlo simulation and a method based on random gradients of the cost function, produced analysis-error variances with a relative error < 10%. The long-distance error correlations due to sampling noise are significantly less pronounced for the gradient-based randomization, which is also particularly attractive when implemented in parallel. Deterministic evaluations of the inverse Hessian using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm are also tested. While existing BFGS preconditioning techniques yield poor approximations of the error variances (relative error > 120%), a new preconditioner that efficiently accumulates information on the diagonal of the inverse Hessian dramatically improves the results (relative error < 50%). Furthermore, performing several cycles of the BFGS algorithm using the same gradient and vector pairs enhances its performance (relative error < 30%) and is necessary to obtain convergence. Leveraging these findings, we propose a hybrid BFGS approach that combines the new preconditioner with several BFGS cycles using information from a few (3–5) Monte Carlo simulations.
Its performance is comparable to the stochastic approximations for the low-dimensional case, while good scalability is obtained for the high-dimensional experiment. Potential applications of these new BFGS methods range from characterizing the information content of high-dimensional inverse problems to improving the convergence rate of current minimization algorithms.
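The core idea behind the BFGS-based error estimates above is that pairs of steps and gradient differences accumulate an approximation to the inverse Hessian, whose diagonal gives analysis-error variances. The sketch below is our own illustration on a small synthetic quadratic cost, not the paper's implementation; the probe-point strategy and problem size are invented for the example.

```python
import numpy as np

# Sketch: accumulate a BFGS inverse-Hessian approximation from (s, y) pairs
# for a quadratic cost J(x) = 0.5 x^T A x, whose exact posterior covariance
# is A^{-1}. The diagonal of H approximates the analysis-error variances.
rng = np.random.default_rng(1)
n = 8

# Synthetic symmetric positive-definite Hessian
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)

def grad(x):
    return A @ x  # exact gradient of the quadratic cost

H = np.eye(n)                 # initial inverse-Hessian approximation
x = rng.normal(size=n)
for _ in range(4 * n):        # several cycles of updates
    x_new = rng.normal(size=n)       # random probe point
    s = x_new - x                    # step
    y = grad(x_new) - grad(x)        # gradient difference (= A s here)
    rho = 1.0 / (y @ s)
    V = np.eye(n) - rho * np.outer(s, y)
    H = V @ H @ V.T + rho * np.outer(s, s)  # BFGS inverse-Hessian update
    x = x_new

# Compare the diagonal of H to the true posterior variances diag(A^{-1})
rel_err = np.abs(np.diag(H) - np.diag(np.linalg.inv(A))) / np.diag(np.linalg.inv(A))
print(rel_err.max())
```

Each update enforces the secant condition H y = s exactly for the latest pair while preserving symmetry, which is why repeated cycles over stored pairs can sharpen the diagonal estimate.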
Abstract. The Last Millennium Reanalysis utilizes an ensemble methodology to assimilate paleoclimate data for the production of annually resolved climate field reconstructions of the Common Era. Two key elements are the focus of this work: the set of assimilated proxy records, and the forward models that map climate variables to proxy measurements. Results based on an extensive proxy database and seasonal regression-based forward models are compared to the prototype reanalysis of Hakim et al. (2016), which was based on a smaller set of proxy records and simpler proxy models formulated as univariate linear regressions against annual temperature. Validation against various instrumental-era gridded analyses shows that the new reconstructions of surface air temperature, 500 hPa geopotential height and the Palmer Drought Severity Index are significantly improved, with skill scores increasing from 10 % to more than 200 %, depending on the variable and verification measure. Additional experiments designed to isolate the sources of improvement reveal the importance of additional proxy records, including coral records for improving tropical reconstructions; tree-ring-width chronologies, including moisture-sensitive trees, for thermodynamic and hydroclimate variables in mid-latitudes; and tree-ring density records for temperature reconstructions, particularly in high northern latitudes. Proxy forward models that account for seasonal responses, and the dual sensitivity to temperature and moisture characterizing tree-ring-width proxies, are also found to be particularly important. Other experiments highlight the beneficial role of covariance localization on reanalysis ensemble characteristics. This improved paleoclimate data assimilation system served as the basis for the production of the first publicly released NOAA Last Millennium Reanalysis.
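The ensemble assimilation step common to both abstracts above can be illustrated with a serial ensemble square-root Kalman update for a single proxy observation. This is a schematic with synthetic numbers (the state size, PSM coefficients, and error variance are invented); the actual reanalysis system is far more elaborate.

```python
import numpy as np

# Schematic ensemble square-root Kalman update for one proxy observation,
# in the spirit of offline paleoclimate data assimilation.
rng = np.random.default_rng(2)
n_state, n_ens = 50, 100

# Prior ensemble drawn from "model output" (columns are ensemble members)
X = rng.normal(0.0, 1.0, (n_state, n_ens))

def psm(state):
    # Linear PSM: the proxy responds to state element 10 (illustrative choice)
    return 0.8 * state[10] + 0.1

y_obs, R = 0.5, 0.2 ** 2          # proxy value and its error variance

ye = psm(X)                        # ensemble of proxy estimates
xbar, ybar = X.mean(axis=1), ye.mean()
Xp, yp = X - xbar[:, None], ye - ybar

var_ye = yp @ yp / (n_ens - 1)
cov_xy = Xp @ yp / (n_ens - 1)
K = cov_xy / (var_ye + R)          # Kalman gain for a single observation

alpha = 1.0 / (1.0 + np.sqrt(R / (var_ye + R)))  # square-root factor
xbar_post = xbar + K * (y_obs - ybar)            # update ensemble mean
Xp_post = Xp - alpha * np.outer(K, yp)           # update perturbations
X_post = xbar_post[:, None] + Xp_post            # posterior ensemble
```

Looping this update over every assimilated proxy, one year at a time, and drawing the prior from existing model simulations rather than a forecast is what makes the approach "offline". Covariance localization, mentioned above, would taper `cov_xy` with distance before forming the gain.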