V. K. C. Venema et al.: Benchmarking monthly homogenization algorithms. Published by Copernicus Publications on behalf of the European Geosciences Union.

Abstract. The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real-world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline, at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered.
Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.
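The first two performance metrics listed above can be sketched in a few lines. This is a minimal illustration, not the HOME benchmark's actual evaluation code; the function names and the use of per-time-step trend slopes are my assumptions.

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """Centered root mean square error: RMSE computed after removing
    each series' own mean, so a constant offset between the homogenized
    series and the truth does not count as error."""
    h = homogenized - np.mean(homogenized)
    t = truth - np.mean(truth)
    return np.sqrt(np.mean((h - t) ** 2))

def trend_error(homogenized, truth):
    """Difference between the least-squares linear trend slopes
    (per time step) of the homogenized and true series."""
    x = np.arange(len(truth))
    slope_h = np.polyfit(x, homogenized, 1)[0]
    slope_t = np.polyfit(x, truth, 1)[0]
    return slope_h - slope_t
```

Because the RMSE is centered, a homogenization method that shifts a whole series by a constant is not penalized; only errors in the shape of the series (and, via the second metric, in its trend) count.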
Homogenization methods are developed to reduce the impact of non-climatic factors on climate series. Martínez et al. (2009) (International Journal of Climatology, doi:10.1002/joc.1884) applied a set of homogenization procedures to the available Spanish temperature series. In this comment, we address critical issues of that paper concerning a specific property of the standard normal homogeneity test and the application scheme of the homogenization tests, and we conclude with some important recommendations on the application of homogenization methodologies. After a reliable quality control procedure, the authors carried out a set of four different homogeneity tests: (1) the standard normal homogeneity test (SNHT; Alexandersson, 1986; Alexandersson and Moberg, 1997a), (2) the Buishand range test (Buishand, 1982), (3) the Pettitt test (Pettitt, 1979) and (4) the Von Neumann ratio test (Von Neumann, 1941). The authors did not correct series but decided to reject them in case of inhomogeneity detection. In the description of the homogenization procedure, there are a few misleading points that we would like to discuss in this short comment. A thorough understanding of the behaviour of homogeneity tests and their correct application to climatic time series preserve the climatic signal and eliminate or reduce the influence of non-climatic factors. The removal of falsely detected inhomogeneities and the acceptance of inhomogeneous series affect each subsequent analysis (e.g. trend assessments, extreme-value analysis) and are therefore of major importance. In the following paragraphs, a brief description of SNHT behaviour is provided, showing that SNHT performance decays for breaks located at the beginning and the end of series. Several studies have investigated the strengths and weaknesses of break detection algorithms.
For instance, Alexandersson and Moberg (1997a), because the exact distribution of the test statistic under the null hypothesis is unknown, reported critical levels of the SNHT statistic for series with 10 to 250 values. Khaliq and Ouarda (2007) extended these critical values to series lengths from 10 to 50 000. Furthermore, Alexandersson and Moberg (1997b) avoided applying SNHT to segments shorter than ten values. Ducré-Robitaille et al. (2003) analysed the behaviour of eight techniques for break detection with simulated
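The single-break SNHT statistic discussed above can be sketched as follows. This is an illustrative implementation under my own assumptions (a standardized series, one candidate break, no reference series); real applications compare the statistic against the tabulated critical values of Alexandersson and Moberg (1997a) or Khaliq and Ouarda (2007).

```python
import numpy as np

def snht_statistic(series):
    """Single-break SNHT statistic T = max_a [a*z1^2 + (n-a)*z2^2],
    where z1 and z2 are the means of the standardized series before
    and after candidate break position a."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    t_values = []
    for a in range(1, n):  # candidate break after index a-1
        z1 = z[:a].mean()
        z2 = z[a:].mean()
        t_values.append(a * z1 ** 2 + (n - a) * z2 ** 2)
    a_max = int(np.argmax(t_values)) + 1
    return max(t_values), a_max
```

The weakness near the series ends noted above is visible in the formula: when the candidate break position a is close to 1 or n, one of the two segment means is estimated from very few values, so the statistic there is noisy and its null distribution differs from that in the middle of the series.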
ABSTRACT: The main objective of this study is to estimate the probable maximum precipitation (PMP) in Barcelona for durations ranging from 5 min to 30 h. To this end, rain records from the Jardí gauge of the Fabra Observatory located in Barcelona (1927-1992) and the urban pluviometric network supported by Clavegueram de Barcelona, S.A. (CLABSA, 1994-2007) were analysed. Two different techniques were used and compared: a physical method based on the maximization of actual storms, and Hershfield's statistical method. The PMP values obtained using the two techniques are very similar. In both cases, the expected increase of the PMP with duration was found, with the increase especially notable for mesoscale durations of 2-9 h and not significant from 12 h onwards. This result seems to be related to the scale of the meteorological situations producing highly intense rainfall over our territory.
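Hershfield's statistical method mentioned above estimates PMP from the series of annual maximum rainfall depths as PMP = mean + Km * standard deviation. The sketch below uses illustrative input values and leaves the frequency factor Km as a parameter, since it is site- and duration-dependent; it is not the paper's actual data or code.

```python
import statistics

def hershfield_pmp(annual_maxima, km):
    """Hershfield statistical PMP estimate: PMP = mean + Km * std,
    where annual_maxima are the yearly maximum rainfall depths for a
    given duration and Km is the frequency factor (site- and
    duration-dependent; Hershfield proposed values up to about 15)."""
    mean = statistics.fmean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    return mean + km * std
```

For example, `hershfield_pmp([10.0, 20.0, 30.0], km=2.0)` returns 40.0: the mean (20.0) plus twice the sample standard deviation (10.0).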
Detection and reconstruction of early instrumental series is an interdisciplinary activity that allows us to extend climate data records to periods prior to the mid-19th century, extending the overlapping periods with climate proxies and characterizing extreme events. In this work, the collection of several data sources corresponding to different periods and locations, obtained with a wide range of methods and instruments by institutions or private observers, provides the following results: Barcelona has a continuous rainfall series with monthly resolution since 1786 and with daily resolution since 1850. It is worth mentioning that the records from Barcelona provide the longest continuous monthly series available on rainfall in the Iberian Peninsula. The monthly records have been homogenized by using a relative homogenization approach, HOMER. The results highlight the existence of five breaks, most of them due to relocations or instrumentation changes documented in the metadata, which have been adjusted to remove non-climatic factors. The homogenized annual and winter precipitation series in Barcelona show a statistically significant increase from 1786 to 2014, although this increase is mainly due to the concentration of negative anomalies during the first half of the 19th century, which is also clearly visible in the seasonal series. Specifically, an extreme mega-drought episode was observed from the 1810s to the 1830s, which is supported by different proxy data. For a better dissemination of the homogenized monthly series developed in this study, the data set is freely available to the research community.