Image-to-image models for big, non-stationary spatial data
Sikorski, Antony; Ivanitskiy, Michael; McKenzie, Daniel; Nychka, Douglas
Date Issued
2025-04
Abstract
The ability to draw important conclusions regarding Earth and climate science problems is often limited by insufficient quantification of uncertainty in non-stationary, gridded data from climate models and satellites. The extreme computational demands of climate models make it infeasible to generate large ensembles, while satellites often struggle with measurement errors, computational artifacts, and extended revisit times. A solution to this problem is to model the data using a statistical distribution, allowing for efficient simulation of additional samples. However, fitting the model is often challenging due to key parameters varying across the domain (non-stationarity) and traditional methods becoming infeasible as the size of the data increases. Recent deep learning approaches address the computational costs by segmenting the data and performing parameter estimation locally. While this is a significant improvement, this strategy limits the ability to capture long-range correlation patterns, which are often exhibited by physical phenomena such as jet streams and along coastlines. We address this problem by adapting image-to-image models such as Vision Transformers, U-nets, and hybrids of the two to the task of non-stationary parameter estimation. Once the models have been trained to estimate the parameter grids, we rapidly simulate ensembles with a spatial autoregressive (SAR) model, allowing for pixel-wise uncertainty quantification. As an example, we apply this framework to analyze global surface temperature fields from a climate model dataset.
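The second stage of the pipeline described above — rapid ensemble simulation from an estimated parameter grid with a spatial autoregressive (SAR) model, followed by pixel-wise uncertainty quantification — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `rho` stands in for the parameter field that the trained image-to-image network would predict, and the 4-neighbour, row-normalized weight matrix is an assumed choice of SAR structure.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def sar_ensemble(rho, n_draws=30, seed=0):
    """Draw an ensemble from a SAR model (I - diag(rho) W) x = e on a grid.

    `rho` is an (nr, nc) field of spatially varying dependence weights
    (|rho| < 1 for stability); W averages each pixel's 4 neighbours.
    Returns an array of shape (n_draws, nr, nc).
    """
    nr, nc = rho.shape
    n = nr * nc
    # 4-nearest-neighbour adjacency on the grid.
    A = sp.lil_matrix((n, n))
    for i in range(nr):
        for j in range(nc):
            k = i * nc + j
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < nr and 0 <= jj < nc:
                    A[k, ii * nc + jj] = 1.0
    # Row-normalise so W averages the neighbours, then scale by rho pixel-wise.
    deg = np.asarray(A.sum(axis=1)).ravel()
    Wmat = sp.diags(1.0 / deg) @ A.tocsr()
    B = sp.diags(rho.ravel()) @ Wmat
    # x = (I - B)^{-1} e for each white-noise draw e: one sparse LU, many solves.
    lu = spla.splu((sp.eye(n) - B).tocsc())
    rng = np.random.default_rng(seed)
    E = rng.standard_normal((n, n_draws))
    X = lu.solve(E)                        # shape (n, n_draws)
    return X.T.reshape(n_draws, nr, nc)

# Toy usage: a constant rho stands in for the estimated parameter grid.
rho = np.full((16, 16), 0.4)
samples = sar_ensemble(rho, n_draws=20)    # (20, 16, 16) ensemble
pixel_sd = samples.std(axis=0)             # pixel-wise uncertainty map
```

Because the sparse LU factorization is computed once and reused across draws, generating additional ensemble members is cheap — which is the point of replacing large climate-model ensembles with statistical simulation.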
Rights
Copyright of the original work is retained by the author.
