Scholar picked up on two spatio-temporal papers this week, both published in Statistical Science, which I believe are worth mentioning. The first is by Wikle, Milliff, Herbei and Leeds, whilst the second is by Diggle, Moraga, Rowlingson and Taylor. In this post I will offer some comments on the first work, which discusses BHMs in the environmental sciences. The second is an excellent review of point processes, but I will not cover it in detail here.
Modern Statistical Methods in Oceanography: A hierarchical perspective, by CK Wikle, RF Milliff, R Herbei and WB Leeds
This article considers the application of Bayesian hierarchical models (BHMs) in the context of oceanography, a field in which the authors have extensive experience. It is meant to be a review article, so if you are already familiar with the authors’ work there will be nothing particularly new here. However, the article summarises very effectively, in one place, the statistical developments in this field over the last 15–20 years.
The article starts off by describing why oceanography is such a complex problem, and I find that many of the problems described here pertain to most environmental applications. For example:
Although it is the case that there are too few in situ observations of the ocean to characterize its evolution and its interaction with marine ecosystems, in an ironic twist the discipline also suffers from having an abundance of particular data types when one factors in the satellite observations […]
This problem is also being faced in glaciology at the moment. Satellites can provide a seemingly endless stream of data, but it is all of the same type and rarely provides the whole picture. There is a lesson to be learnt here: satellites, for all their importance in today’s environmental research arena, are no substitute for in situ data, which can provide some measurement diversity. This is the only way multi-variate processes, such as those considered by the authors here, can be resolved.
The article proceeds to outline the standard hierarchical modelling paradigm before discussing three key application areas.
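For readers new to the paradigm, it rests on a three-level factorisation into a data model, a process model and a parameter model; in the bracket notation common in this literature:

$$[\,\text{process},\,\text{parameters} \mid \text{data}\,] \;\propto\; [\,\text{data} \mid \text{process},\,\text{parameters}\,]\,[\,\text{process} \mid \text{parameters}\,]\,[\,\text{parameters}\,].$$

Each of the three applications below can be read as a particular choice of these three levels.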
The first, data assimilation and inverse modelling, is the process by which numerical model outputs are combined with data to provide a consensus estimate (I intentionally avoid the word “optimal”). The application here is the modelling of surface vector winds (SVW), and the second column on Pg 472 does a good job of highlighting an important point: today we no longer work with data but with data products. This is an ongoing concern for the statistician, one also highlighted by Peter Guttorp at the last CliMathNet Conference, and I believe it deserves more attention in spatio-temporal statistics. Remote sensors rarely measure the quantity of interest directly; rather, they provide proxy measurements to which several “corrections” are applied so that the measurement reflects the quantity of interest. With SVW the correction procedure seems relatively straightforward, but this is not always the case, for example with geodetic satellites (see the last paragraph). As statisticians we are generally provided with “adjusted data” together with some measure of uncertainty, but it is important to be mindful of where the data and the associated uncertainties actually came from. The application example highlights some important instances of science-based parameterisations, for which more information can be found in this review article by two of these authors. Also, later on (in Section 4) the authors stress that:
In general, to be effective in this context, dimension reduction should not be independent of the physical and biological environment under consideration.
The second example of data assimilation in this work focuses on oceanographic tracer data for modelling ocean circulation. Personally I wasn’t aware of this work and found it interesting. The key idea is that the tracer’s stationary pattern $C$ is a solution to the advection-diffusion equation

$$\mathbf{u} \cdot \nabla C = \nabla \cdot (\kappa \nabla C) - \lambda C,$$

with some Dirichlet boundary conditions. One solves for this on a grid, repeatedly for different parameters (in this case the velocity $\mathbf{u}$, the diffusivity $\kappa$, boundary terms and the sink constant $\lambda$), and “compares” the solution for $C$ with observations at the respective locations (using the likelihood function). MCMC is used in practice, although I imagine solving for $C$ for each sample is quite computationally intensive.
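To make that solve-then-score loop concrete, here is a minimal one-dimensional sketch: a finite-difference solve of $u\,C' = \kappa C'' - \lambda C$ with Dirichlet boundaries, scored against synthetic observations inside a random-walk Metropolis sampler. The grid size, priors, proposal scale and observations are illustrative assumptions of mine, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 101                             # grid points (illustrative)
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

def solve_tracer(u, kappa, lam, c_left=1.0, c_right=0.0):
    """Centred finite-difference solve of u C' = kappa C'' - lam C
    with Dirichlet boundary conditions C(0) = c_left, C(1) = c_right."""
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = 1.0;   b[0] = c_left
    A[-1, -1] = 1.0; b[-1] = c_right
    for i in range(1, n - 1):
        A[i, i - 1] = -u / (2 * h) - kappa / h**2
        A[i, i]     = 2 * kappa / h**2 + lam
        A[i, i + 1] =  u / (2 * h) - kappa / h**2
    return np.linalg.solve(A, b)

# synthetic "observations" of the tracer at a few grid locations
true_C = solve_tracer(u=1.0, kappa=0.1, lam=0.5)
obs_idx = rng.choice(np.arange(1, n - 1), size=15, replace=False)
sigma = 0.02
y = true_C[obs_idx] + sigma * rng.normal(size=obs_idx.size)

def log_post(theta):
    """Gaussian log-likelihood (flat prior on kappa, lam > 0):
    note that every single evaluation requires a full PDE solve."""
    u, kappa, lam = theta
    if kappa <= 0.0 or lam <= 0.0:
        return -np.inf
    C = solve_tracer(u, kappa, lam)
    return -0.5 * np.sum((y - C[obs_idx]) ** 2) / sigma**2

# random-walk Metropolis over (u, kappa, lambda)
theta = np.array([0.5, 0.2, 0.3])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + 0.05 * rng.normal(size=3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

print("posterior mean of (u, kappa, lambda):",
      np.mean(samples[1000:], axis=0))
```

Even in this toy setting the PDE solve dominates the cost of each MCMC iteration, which is precisely the computational concern raised above.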
The second application area is long-lead forecasting for sea-surface temperatures. The authors claim that
such “long-lead” forecasts have shown useful skill and correspond to one of the few situations in ocean science where a purely statistical forecast methodology is competitive with, and in many cases better than, equivalent deterministic model forecasts
Indeed, this work for me is a flagship spatio-temporal model, and the original work of Mark Berliner is definitely worth a read for those who haven’t done so yet. The basic idea is to split the spatio-temporal model into correlated large-scale components (which I believe were EOFs in this case) and small-scale, temporally independent components, but also to make extensive use of scientific insight. For example, the propagation matrix was set to vary according to the “climate regime” of the year in question, depending on whether it was cool, normal or warm. This, clearly, is an educated model choice based on underlying physical considerations, and such inter-disciplinary thinking goes a long way in improving predictive assessments.
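As a rough illustration of that structure, the sketch below projects anomaly fields onto a few leading EOFs and fits one least-squares propagator per climate regime; a forecast then applies the propagator of the current regime. The synthetic fields, the seven-month lead and the random regime labels are placeholders of my own, not the actual SST analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

T, n_grid, k = 240, 500, 5            # months, grid cells, EOFs retained
sst = rng.normal(size=(T, n_grid))    # stand-in for SST anomaly fields

# EOFs are the leading right singular vectors of the (time x space) matrix
anom = sst - sst.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt[:k]                          # (k, n_grid) spatial patterns
a = anom @ eofs.T                      # (T, k) EOF coefficients over time

lead = 7                               # forecast lead in months (placeholder)
regime = rng.integers(0, 3, size=T)    # 0=cool, 1=normal, 2=warm (placeholder)

# one least-squares propagator per climate regime: a_{t+lead} ~ M_r a_t
M = {}
for r in range(3):
    idx = np.where(regime[:T - lead] == r)[0]
    B, *_ = np.linalg.lstsq(a[idx], a[idx + lead], rcond=None)
    M[r] = B.T                          # (k, k) propagation matrix

def forecast(t):
    """Forecast the anomaly field `lead` months ahead of month t."""
    a_future = M[regime[t]] @ a[t]      # propagate the EOF coefficients
    return a_future @ eofs              # map back to the spatial grid

print(forecast(100).shape)              # -> (500,)
```

The regime-dependent propagator is the point of the exercise: the dynamics are allowed to differ across cool, normal and warm states rather than being forced through a single linear operator.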
The third application considered is a relatively new one by William Leeds, who uses BHMs for studying marine ecosystems. The reader should read this paper and this paper, as they are some of the few works which combine BHMs with emulators. Emulation is the process by which complex numerical models are approximated by stochastic models which can be evaluated (i.e. sampled from) very quickly. This allows us to find distributions over unknown parameters in the model and to perform calibrated predictions.
The approach Leeds takes in his first work on the subject is to use a non-standard observation model (in his case a truncated normal distribution) and to define a first-order emulator in the process layer. The emulator is constructed by taking the dominant singular vectors of the model outputs (over time) and fitting a model to the respective weights as a function of the model parameters. Once this model is fitted (and it can be very general and non-linear), MCMC can be used to sample the parameters and process in the usual manner. The spatial element comes in by assuming spatial correlation between the model parameters. In the second work the approach is extended to a dynamic setting: EOFs (from the model output) are now used for the large-scale features, and the emulator is used (indirectly) to provide information about the propagation matrices and other parameters in the model. This paper shows, in a very neat way, how only certain aspects (characterisations?) of the numerical model output are used in the process model. This was also the subject of a recent work of mine here.
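Since that construction is easy to miss on a first read, here is a toy sketch of a first-order emulator of this flavour: run the model over a design of parameter settings, retain the dominant singular vectors of the stacked outputs, and regress the corresponding weights on the parameters (a linear map with intercept here, though in practice this regression can be highly non-linear). The “model” and the design are toy assumptions of mine.

```python
import numpy as np

rng = np.random.default_rng(2)

def run_model(theta, n_out=200):
    """Stand-in for an expensive numerical model: parameters -> output field."""
    t = np.linspace(0.0, 1.0, n_out)
    return theta[0] * np.sin(2 * np.pi * t) + theta[1] * t**2

# 1. design: evaluate the model at m parameter settings
m = 40
design = rng.uniform(-1.0, 1.0, size=(m, 2))
outputs = np.stack([run_model(th) for th in design])    # (m, n_out)

# 2. dominant singular vectors of the centred output ensemble
mean_out = outputs.mean(axis=0)
U, s, Vt = np.linalg.svd(outputs - mean_out, full_matrices=False)
k = 2
basis = Vt[:k]                                 # (k, n_out) output "patterns"
weights = (outputs - mean_out) @ basis.T       # (m, k) weights per model run

# 3. regress the weights on the parameters
X = np.column_stack([np.ones(m), design])
coef, *_ = np.linalg.lstsq(X, weights, rcond=None)      # (3, k)

def emulate(theta):
    """Cheap surrogate: predict the model output at new parameter values."""
    w = np.concatenate(([1.0], theta)) @ coef
    return mean_out + w @ basis

theta_new = np.array([0.3, -0.7])
print(np.max(np.abs(emulate(theta_new) - run_model(theta_new))))
```

Once fitted, `emulate` can be called thousands of times inside an MCMC scheme at negligible cost, which is what makes calibration of the underlying numerical model feasible.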
On the whole, this paper is a very pleasant read and contains a ton of useful references that will come in handy for anyone interested in spatio-temporal BHMs.