2011
DOI: 10.5194/asr-6-7-2011
Newest developments of ACMANT

Abstract: The seasonal cycle of radiation intensity often causes a marked seasonal cycle in the inhomogeneities (IHs) of observed temperature time series, since a substantial portion of them have a direct or indirect connection to radiation changes in the micro-environment of the thermometer. Therefore, the magnitudes of temperature IHs tend to be larger in summer than in winter. A new homogenisation method, the Adapted Caussinus-Mestre Algorithm for Networks of Temperature series (ACMANT), has recently been deve…
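The seasonal dependence of break magnitudes described in the abstract can be illustrated with a small synthetic sketch. This is purely illustrative — the function name, the sinusoidal shape, and all numeric values are assumptions for demonstration, not part of ACMANT:

```python
import numpy as np

def seasonal_break(n_months, break_at, annual_mean=0.5, amplitude=0.3):
    """Return a bias series for a hypothetical inhomogeneity starting at
    month index `break_at`: a constant shift (annual_mean) modulated by a
    sinusoidal annual cycle peaking in July (month index 6), so the
    inhomogeneity is larger in summer than in winter."""
    months = np.arange(n_months)
    shift = annual_mean + amplitude * np.cos(2 * np.pi * (months % 12 - 6) / 12)
    shift[:break_at] = 0.0  # no inhomogeneity before the break
    return shift

# 10 years of monthly data, break after year 5
bias = seasonal_break(120, 60)
```

With these illustrative values the inserted shift is 0.8 °C in July and 0.2 °C in January, i.e. a pronounced seasonal cycle in the IH magnitude.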

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
3
2

Citation Types

0
22
0
2

Year Published

2012
2012
2020
2020

Publication Types

Select...
8

Relationship

1
7

Authors

Journals

Cited by 21 publications (24 citation statements) | References 12 publications
“…However, a climatologist may want to know to what degree decadal variability and trends in homogenized data may be due to remaining small inhomogeneities. Answering such questions requires an evaluation of the output of full homogenization methods in terms of other statistical metrics, for instance the remaining error in linear trend estimates and the mean square error between the true time series and the homogenized ones (Domonkos, 2008; Domonkos et al., 2011). For these errors to be applicable to real datasets, and to enable a benchmarking of homogenization algorithms, the structure of the artificial data and its inserted inhomogeneities should be realistic.…”
Section: Introduction
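The two error metrics this excerpt names — the remaining error in linear trend estimates and the (root) mean square error against the known true series — can be sketched in a few lines. The synthetic series, noise levels, and function names below are illustrative assumptions, not from the cited studies:

```python
import numpy as np

def trend_slope(series):
    """Least-squares linear trend of a series (units per time step)."""
    t = np.arange(len(series))
    return np.polyfit(t, series, 1)[0]

def evaluation_metrics(true_series, homogenized):
    """Remaining trend error and RMSE of a homogenized series,
    measured against the known true (synthetic) series."""
    trend_error = trend_slope(homogenized) - trend_slope(true_series)
    rmse = np.sqrt(np.mean((homogenized - true_series) ** 2))
    return trend_error, rmse

# Illustrative benchmark: a true series with a small trend, and an
# imperfectly homogenized copy with residual noise.
rng = np.random.default_rng(0)
truth = 0.01 * np.arange(100) + rng.normal(0.0, 0.5, 100)
homog = truth + rng.normal(0.0, 0.1, 100)
err, rmse = evaluation_metrics(truth, homog)
```

In a real benchmark these metrics would be averaged over many simulated networks, which is what makes realistic artificial inhomogeneities essential.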
“…To answer these questions, an evaluation of the output of full homogenization methods on artificial data with known, randomly inserted inhomogeneities is required (Domonkos, 2008; Domonkos et al., 2011). The inserted inhomogeneities range from simple one-break cases to cases with a very complete and realistic description of the inhomogeneities, including platform-like inhomogeneities, in which the first break is soon followed by a second break in the opposite direction (Venema et al., 2006a).…”
Section: Homogenization
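The platform-like inhomogeneity described above — a shift that is later cancelled by an opposite shift — can be sketched as follows. The break positions, magnitude, and function name are illustrative assumptions:

```python
import numpy as np

def insert_platform_break(series, start, end, magnitude):
    """Insert a platform-like inhomogeneity: a level shift at `start`
    that is cancelled by an opposite-sign shift at `end`, so only the
    segment [start, end) is displaced."""
    out = series.copy()
    out[start:end] += magnitude
    return out

# Illustrative homogeneous monthly anomaly series with one platform
rng = np.random.default_rng(1)
clean = rng.normal(0.0, 0.5, 120)
perturbed = insert_platform_break(clean, 40, 70, 1.0)
```

Platforms are among the hardest cases for detection algorithms because the series returns to its original level, leaving no net trend signal.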
“…HOMER (HOMogenization softwarE in R) is a software package for homogenizing essential climate variables at monthly and annual time scales. HOMER was constructed by exploiting the best characteristics of other state-of-the-art homogenization methods, i.e., PRODIGE (Caussinus and Mestre, 2004), ACMANT (Domonkos, 2011), CLIMATOL (Guijarro, 2014) and the recently developed joint-segmentation method (cghseg) (Picard et al., 2011). HOMER is based on the methodology of optimal segmentation with dynamic programming, the application of a network-wide two-factor model for both detection and correction, and some new techniques for coordinating detection processes from multi-annual to monthly scales.…”
Section: Homogenization
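The optimal segmentation with dynamic programming mentioned in this excerpt can be sketched in generic form. This is a minimal textbook version — it minimises the total within-segment sum of squared deviations for a *given* number of segments, and omits the penalised model-selection criterion that PRODIGE/HOMER use to choose that number:

```python
import numpy as np

def optimal_segmentation(x, n_segments):
    """Split x into n_segments contiguous segments minimising the total
    within-segment sum of squared deviations from each segment mean.
    Returns (break positions, minimal cost); a break position is the
    start index of a new segment. O(n_segments * n^2) dynamic program."""
    n = len(x)
    # Prefix sums let us get any segment's SSE in O(1).
    s = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x * x)))

    def sse(i, j):  # SSE of x[i..j], inclusive indices
        m = j - i + 1
        seg_sum = s[j + 1] - s[i]
        return (s2[j + 1] - s2[i]) - seg_sum * seg_sum / m

    INF = float("inf")
    dp = np.full((n_segments + 1, n), INF)   # dp[k][j]: best cost, k segs over x[0..j]
    cut = np.zeros((n_segments + 1, n), dtype=int)
    for j in range(n):
        dp[1][j] = sse(0, j)
    for k in range(2, n_segments + 1):
        for j in range(k - 1, n):
            for i in range(k - 1, j + 1):    # i: start of the last segment
                c = dp[k - 1][i - 1] + sse(i, j)
                if c < dp[k][j]:
                    dp[k][j] = c
                    cut[k][j] = i
    # Backtrack the optimal break positions.
    breaks, j, k = [], n - 1, n_segments
    while k > 1:
        i = int(cut[k][j])
        breaks.append(i)
        j, k = i - 1, k - 1
    return sorted(breaks), dp[n_segments][n - 1]

# Noiseless example: two obvious level shifts
x = np.array([0.0] * 20 + [2.0] * 20 + [0.5] * 20)
breaks, cost = optimal_segmentation(x, 3)
# breaks → [20, 40], cost → 0.0 for this noiseless series
```

The dynamic program guarantees the globally optimal break configuration, which is what distinguishes this family of methods from stepwise single-break detection.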
“…Easterling and Peterson, 1995) or lacking real-world complexity of both climate variability and inhomogeneity characteristics (e.g. Vincent, 1998;Ducré-Robitaille et al, 2003;Reeves et al, 2007;Wang, 2008a, b). A relatively comprehensive but regionally limited study is that of Begert et al (2008), who used the manually homogenised Swiss network as a test case.…”
Section: Introduction
“…Lyazrhi, 1997; Lu et al., 2010; Hannart and Naveau, 2012; Lindau and Venema, 2013) and the presence of change points within the reference series used in relative homogenisation (e.g. Caussinus and Mestre, 2004; Williams, 2005, 2009; Domonkos et al., 2011) clearly performed best in the HOME benchmark.…”
Section: Introduction