https://doi.org/10.5194/asr-8-45-2012
28 Mar 2012

The historical pathway towards more accurate homogenisation

P. Domonkos, V. Venema, I. Auer, O. Mestre, and M. Brunetti

Abstract. In recent years, increasing effort has been devoted to the objective evaluation of the efficiency of homogenisation methods for climate data; an important step was the blind benchmarking performed in the COST Action HOME (ES0601). The statistical characteristics of the examined series have a significant impact on the measured efficiencies, so it is difficult to obtain an unambiguous picture of method performance from numerical tests alone. In this study, the historical development of the methodology, with a focus on the homogenisation of surface temperature observations, is presented in order to trace this progress through the evolution of the underlying statistical tools. The main stages of this development, such as the fitting of optimal step functions when the number of change-points is known (1972), the cutting algorithm (1995) and the Caussinus–Lyazrhi criterion (1997), are recalled, and their effects on the quality of homogenisation are briefly discussed. This analysis of the theoretical properties, together with recently published numerical results, indicates that MASH, PRODIGE, ACMANT and USHCN are the best statistical tools for homogenising climatic time series, since they reconstruct and preserve the true climatic variability in observational time series with the highest reliability. On the other hand, skilled homogenisers may achieve outstanding reliability with a combination of simple statistical methods, such as the Craddock test, and visual expert decisions. A few efficiency results from the COST HOME experiments are presented to demonstrate the performance of the best homogenisation methods.
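
The fitting of optimal step functions with a known number of change-points, mentioned above, is at its core a least-squares segmentation problem that can be solved exactly by dynamic programming. The following minimal sketch illustrates the idea for a single series; the function name and interface are illustrative assumptions and are not taken from any of the homogenisation packages named in the abstract.

```python
import numpy as np

def fit_optimal_steps(y, n_changepoints):
    """Least-squares fit of a piecewise-constant (step) function to y,
    with a fixed, known number of change-points, via dynamic programming.
    Returns the change-point positions (indices where a new segment starts).
    Illustrative sketch only, not code from MASH/PRODIGE/ACMANT/USHCN."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s1 = np.concatenate(([0.0], np.cumsum(y)))       # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))   # prefix sums of squares

    def sse(i, j):
        """Sum of squared deviations from the mean on segment y[i:j]."""
        s = s1[j] - s1[i]
        return (s2[j] - s2[i]) - s * s / (j - i)

    k_seg = n_changepoints + 1                       # number of segments
    cost = np.full((k_seg + 1, n + 1), np.inf)
    back = np.zeros((k_seg + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for k in range(1, k_seg + 1):
        for j in range(k, n + 1):
            # choose the best start i of the last segment ending at j
            for i in range(k - 1, j):
                c = cost[k - 1, i] + sse(i, j)
                if c < cost[k, j]:
                    cost[k, j], back[k, j] = c, i
    # backtrack to recover the change-point positions
    cps, j = [], n
    for k in range(k_seg, 0, -1):
        j = back[k, j]
        if j > 0:
            cps.append(j)
    return sorted(cps)

# Example: a synthetic series with artificial shifts at positions 40 and 70
rng = np.random.default_rng(0)
y = rng.normal(0.0, 0.5, 100)
y[40:] += 1.5
y[70:] -= 2.0
print(fit_optimal_steps(y, 2))   # expected: change-points near [40, 70]
```

The exact solution is what distinguishes this approach from earlier single-break tests applied hierarchically: all change-points are estimated jointly, so one badly placed early break cannot corrupt the rest of the segmentation.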
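When the number of change-points is not known in advance, the Caussinus–Lyazrhi criterion recalled above supplies a penalised model-selection rule. A commonly quoted form of the criterion in the homogenisation literature, reproduced here as a reminder (the exact constants should be checked against the original 1997 paper), is to choose the number of change-points $K$ minimising

$$
C(K) \;=\; \ln\!\left(1 - \frac{\sum_{k=1}^{K+1} n_k\,(\bar{y}_k - \bar{y})^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}\right) \;+\; \frac{2K}{n-1}\,\ln n ,
$$

where $n$ is the series length and $n_k$ and $\bar{y}_k$ are the length and mean of the $k$-th segment. The first term rewards the variance explained by the fitted step function, while the second penalises each additional change-point, guarding against over-segmentation.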