Simultaneous treatment of random and systematic errors in the historical radiosonde temperature archive
Browning, Joshua M.
Date Issued
2015
Abstract
The historical radiosonde temperature archive, like any large and lengthy observational dataset, must be quality controlled before it can be used properly. Most research on quality control for such data focuses on identifying and removing either systematic errors or random errors, without considering an optimal process for treating both. Additionally, little has been done to evaluate homogenization methods, which identify and correct systematic errors, when applied to sub-daily data, and no research exists on using robust estimators in homogenization procedures. In this paper, we simulate realistic radiosonde temperature data and contaminate it with both systematic and random errors. We then evaluate (1) the performance of several homogenization algorithms, (2) the influence of removing seasonality, and (3) the sequence in which the random and systematic errors are identified and corrected. We introduce a robust Standard Normal Homogeneity Test (SNHT) and find in simulations that it outperforms both the traditional SNHT and several other modern alternatives. Moreover, we find that systematic errors in the data degrade the performance of random error removal algorithms, whereas the presence of random errors is less detrimental to the robust SNHT homogenization algorithm.
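The abstract does not restate the SNHT itself, so for orientation the following is a minimal, self-contained sketch of the classical single-changepoint statistic and one plausible robust substitution (median/MAD standardization). The function names (`snht_statistic`, `robust_snht_statistic`, `snht_scan`) and the choice of robust estimators are illustrative assumptions, not the thesis's implementation; the thesis's robust SNHT may use different estimators (e.g., Huber M-estimates).

```python
import numpy as np

def snht_statistic(x, k):
    """Classical SNHT statistic T_k for a candidate changepoint after index k.

    The series is standardized with the sample mean and standard deviation;
    T_k = k * zbar1**2 + (n - k) * zbar2**2, where zbar1 and zbar2 are the
    means of the standardized values before and after the split.
    """
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    return k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2

def robust_snht_statistic(x, k):
    """Illustrative robust variant: standardize with the median and the MAD.

    Median/MAD is one common robust choice; the thesis's robust SNHT may use
    different robust estimators.
    """
    n = len(x)
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))  # scaled to match the SD under normality
    z = (x - med) / mad
    return k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2

def snht_scan(x, statistic, min_seg=10):
    """Return (T_max, k*): the largest statistic over admissible split points."""
    x = np.asarray(x, dtype=float)
    return max((statistic(x, k), k) for k in range(min_seg, len(x) - min_seg))

# Demo: a level shift (systematic error) at t = 200 plus one gross outlier
# (random error), mimicking the two contamination types studied in the thesis.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 200)])
x[50] += 10.0

print(snht_scan(x, snht_statistic))         # classical SNHT
print(snht_scan(x, robust_snht_statistic))  # robust variant, less swayed by the outlier
```

In a homogenization procedure, T_max is compared against a critical value to flag a changepoint, and the flagged shift is then corrected; robust standardization keeps a single gross outlier from inflating the standard deviation and masking the shift.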
Rights
Copyright of the original work is retained by the author.