Sam,
Hard to argue with that. Really hard. Unless you have access to a time machine, models are all we have to project the future.
Ken
And so it is that we use them, being ever mindful of their limitations. Even then, we will miss stuff, sometimes really important stuff. And we learn.
It is hugely tempting sometimes to substitute the model for reality. That is always dangerous. Sometimes we forget just how dangerous.
Reality is. Reality is always the gold standard. Reality is messy and hard. Sometimes we can reach an understanding of reality from the level of first principles. Even then, substituting models of those first principles for reality runs the serious risk of getting things entirely wrong.
Models emulate reality to varying degrees of fidelity under varying limitations and conditions. They are immensely useful. But they do not control reality. Reality imposes limits on them, to the degree that they are able to represent it. Often, the simplifications that models include are useful in highlighting really important, simple principles, e.g. the first-order linear response the authors noted, within the bounds of the data sets they used. And those give us great insights.
But, and it is a big caveat, they do not govern, and they do not easily extrapolate beyond the conditions under which the original data were gathered. We can, and often must, extrapolate. In doing so, it is vital that we know we are doing so, and that we constantly sanity-check the results. Simple assumptions, or more often unrecognized and unstated assumptions, may and often do rear their very ugly heads if we extrapolate too far from the data set.
Changes in state and condition generally render the models invalid. Following the state or condition change, a new model is needed. That may be identical to the previous model. Usually it isn’t. Often it is radically different in form. Examples of this occur all over the place. Some of these include:
A) The changes in physics and conditions in the atmosphere on either side of a supersonic shock boundary. This is encountered with all supersonic craft. The equations are unusual but straightforward (a sketch of the standard relations follows below this list).
B) Water flow through soil under advective conditions encountering a paleosol boundary, and entering a different stratum underneath it. In this case, at least three models are needed (more likely four or five, or more): one for the advective transport, one for the encounter with the upper boundary, one for transport along that boundary, one for entry into the boundary layer, one for transport through the paleosol, one for the encounter with the bottom boundary, one for lateral movement within the paleosol, one for movement across the paleosol's contact with the underlying sediment, one for entry into the underlying sediment and/or transport along that contact, and one for transport through the underlying soil. That's around a dozen that may be important (there may be several more that are essential, and all of that assumes a homogeneous, unfractured soil).
There are scale changes between these that render gridded analysis invalid. It may be possible to emulate this with simpler models under certain conditions. But using simpler models hand-waves over key factors, which may then make the simplified model completely inaccurate when conditions change (e.g., flooded flow versus dry conditions versus variable conditions ...). And all of that excludes other factors such as barometric oscillation, temperature oscillations affecting vapor production and transport, or condensation ... let alone vegetation, microbial growth, chemical deposition and redissolution ...
C) Cloud formation modeling ... now there is a tough one.
At first blush it might appear that these are ordered from hardest to simplest. In reality, it is the other way around. The fine-scale variations in B) and C) make them much more challenging.
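Back to A) for a moment, since I called the equations straightforward: here is a minimal sketch of the textbook normal-shock relations for a calorically perfect gas (steady, one-dimensional flow; the function name and defaults are just illustrative):

    import math

    def normal_shock(M1, gamma=1.4):
        """Textbook normal-shock jump relations for a calorically perfect gas.

        Valid only for M1 > 1; this is the idealized, steady, one-dimensional
        case -- exactly the kind of simplification being discussed.
        """
        if M1 <= 1.0:
            raise ValueError("normal shock requires supersonic upstream flow (M1 > 1)")
        g = gamma
        # downstream Mach number
        M2 = math.sqrt((1.0 + 0.5 * (g - 1.0) * M1**2) / (g * M1**2 - 0.5 * (g - 1.0)))
        # static pressure, density, and temperature ratios across the shock
        p_ratio = 1.0 + 2.0 * g / (g + 1.0) * (M1**2 - 1.0)
        rho_ratio = (g + 1.0) * M1**2 / ((g - 1.0) * M1**2 + 2.0)
        T_ratio = p_ratio / rho_ratio
        return M2, p_ratio, rho_ratio, T_ratio

    print(normal_shock(2.0))  # for air: M2 ~= 0.577, p2/p1 = 4.5

Simple, closed form, and it works beautifully, right up until the assumptions behind it (perfect gas, steady flow, one dimension) stop holding.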
Worse, these days we lean heavily on computational models. That wasn't always the case. Computational models are most often built on gridded multidimensional arrays. The number of nodes, their spacing, and the property calculations at each node add up quickly, and the total size of the computational space becomes huge. And so the number of nodes is reduced and the properties are averaged across the volume.
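Just as back-of-the-envelope arithmetic, with a domain and resolution I am making up purely for illustration:

    # Hypothetical vadose-zone domain, purely for illustration:
    # 1 km x 1 km in plan, 100 m deep, resolved at 1 cm to capture
    # fine-scale features, with ten double-precision properties per node.
    nx = ny = 100_000          # 1 km / 1 cm
    nz = 10_000                # 100 m / 1 cm
    nodes = nx * ny * nz
    bytes_per_node = 10 * 8    # ten 8-byte properties
    print(f"nodes: {nodes:.2e}")                               # 1.00e+14
    print(f"memory: {nodes * bytes_per_node / 1e12:.0f} TB")   # ~8000 TB, per time step

Hence the averaging.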
If done correctly, this is useful. If done incorrectly, all manner of insanity can be hidden. Often, as in the case of soils, fine-scale changes must be modeled at their own scale. That limits how far the number of nodes can be reduced before the model stops being valid. But keeping the nodes at the required spacing is also unworkable: there are more of them than all of the computers in the world could calculate in a lifetime. So compromises are made. But those compromises can do silly things, like averaging logarithmic or nonlinear parameters. Nope. That doesn't work. But it happens all of the time. Etc...
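To make the averaging trap concrete, with made-up conductivities of the kind soils routinely hand us:

    import math

    # Hypothetical hydraulic conductivities (m/s) for layers lumped into one
    # coarse cell, spanning several orders of magnitude, as soils routinely do.
    K = [1e-4, 1e-6, 1e-8, 1e-9]

    arithmetic_mean = sum(K) / len(K)
    geometric_mean = math.exp(sum(math.log(k) for k in K) / len(K))
    # flow perpendicular to the layers behaves more like a harmonic mean
    harmonic_mean = len(K) / sum(1.0 / k for k in K)

    print(f"arithmetic: {arithmetic_mean:.2e}")  # ~2.5e-05, dominated by the one fast layer
    print(f"geometric:  {geometric_mean:.2e}")   # ~1.8e-07
    print(f"harmonic:   {harmonic_mean:.2e}")    # ~3.6e-09, dominated by the tightest layer

Four orders of magnitude between the "averages," and only one of them (if any) is the right one for the question being asked.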
Worse, the time step used must be correlated with the spatial step in certain ways to prevent impossibilities and insanities from occurring in the math. These are easy to miss.
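The classic example is the Courant-type stability limit for simple explicit schemes. A rough sketch (this is only the textbook bound, and I am assuming an explicit discretization, which is not what every code uses):

    def max_stable_dt(dx, velocity, diffusivity=0.0, courant=1.0):
        """Crude stable-time-step estimate for a simple explicit scheme.

        dt <= courant * dx / |v|      (advection limit)
        dt <= dx**2 / (2 * D)         (diffusion limit, 1-D)

        Real codes have their own, often stricter, constraints.
        """
        limits = []
        if velocity != 0.0:
            limits.append(courant * dx / abs(velocity))
        if diffusivity > 0.0:
            limits.append(dx**2 / (2.0 * diffusivity))
        if not limits:
            raise ValueError("need a nonzero velocity or diffusivity")
        return min(limits)

    # Halve the grid spacing and the advective limit halves; the diffusive
    # limit drops by a factor of four. Easy to miss when refining a mesh.
    print(max_stable_dt(dx=0.01, velocity=1e-5, diffusivity=1e-9))

Refine the grid without refining the time step accordingly and the math happily produces nonsense.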
Bounds checking is often turned off to speed the code execution. This hides all manner of other errors, such as indexing beyond the bounds of the computational matrix, and divide by zero errors. That’s not good. Efforts are made to avoid those blunders. And sometimes they still slip through.
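The guards themselves are not glamorous. In sketch form (Python here only for readability; the production codes are usually Fortran or C, and these are exactly the checks that get compiled out or skipped):

    import numpy as np

    def safe_update(head, i, j, eps=1e-30):
        """Illustrative guards around a single node access and division.

        Explicit range checks and a floor on the denominator; in optimized
        builds with bounds checking disabled, neither failure makes a sound.
        """
        ni, nj = head.shape
        if not (0 <= i < ni and 0 <= j < nj):
            raise IndexError(f"node ({i}, {j}) is outside the {ni} x {nj} grid")
        denom = head[i, j]
        if abs(denom) < eps:
            raise ZeroDivisionError(f"near-zero value at node ({i}, {j})")
        return 1.0 / denom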
And through it all, uncertainties accumulate. Some models propagate these. Most do not. Instead they rely on a variation-of-parameters approach, using Monte Carlo sampling with a Latin hypercube design to estimate the uncertainty. This greatly understates the uncertainty, though, as it hand-waves away much of the very real uncertainty. Some approaches do better, such as USGS's Jupiter Suite, which uses multi-model comparison to improve the uncertainty estimate. These are seldom used.
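For reference, the Latin hypercube part is just a stratified way of drawing parameter samples. A bare-bones version, written out by hand to show the mechanics (it covers the stated parameter ranges efficiently, and says nothing at all about structural error in the model, which is the part that gets hand-waved away):

    import numpy as np

    def latin_hypercube(n_samples, n_params, rng=None):
        """Basic Latin hypercube sample on the unit cube.

        Each parameter's [0, 1) range is split into n_samples equal strata,
        one value is drawn per stratum, and the stratum order is shuffled
        independently for each parameter.
        """
        rng = np.random.default_rng() if rng is None else rng
        u = np.empty((n_samples, n_params))
        for j in range(n_params):
            strata = rng.permutation(n_samples)  # which stratum each sample lands in
            u[:, j] = (strata + rng.random(n_samples)) / n_samples
        return u

    # Scale the unit-cube sample onto (made-up) parameter ranges, one model run per row.
    sample = latin_hypercube(100, 2, np.random.default_rng(0))
    lo = np.array([1e-9, 0.05])   # e.g. hydraulic conductivity (m/s), porosity
    hi = np.array([1e-4, 0.45])
    params = lo + sample * (hi - lo)

The spread of the resulting model outputs is then reported as "the uncertainty," as if the parameter ranges were the only thing we were unsure about.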
Instead, the all-too-common approach is to further simplify the model structure, rely on sensitivity calculations as a substitute for uncertainty estimation, and shift to Bayesian methods that further obscure or discard the uncertainty. Those models are particularly vulnerable to errors in their design.
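To be concrete about what I mean by sensitivity standing in for uncertainty, here is the generic first-order version (a sketch, not any particular package's method): finite-difference sensitivities at a single point, combined as if the model were linear and the parameters independent.

    import numpy as np

    def linearized_uncertainty(model, x0, sigmas, h=1e-6):
        """First-order 'sensitivity as uncertainty' estimate.

        sigma_y^2 ~= sum_i (dy/dx_i)^2 * sigma_i^2, with dy/dx_i from
        forward finite differences at x0. Cheap and common, and it quietly
        assumes near-linearity, independent parameters, and that the model
        structure itself is right.
        """
        x0 = np.asarray(x0, dtype=float)
        y0 = model(x0)
        var = 0.0
        for i, s in enumerate(sigmas):
            x = x0.copy()
            x[i] += h * max(abs(x0[i]), 1.0)
            dy_dxi = (model(x) - y0) / (x[i] - x0[i])
            var += (dy_dxi * s) ** 2
        return np.sqrt(var)

Everything the model structure gets wrong simply never shows up in that number.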
Sam