While wili has already referenced the linked article by Kevin Cowtan in another thread, I thought I would provide the following quote and two associated images (Figures 1 & 2, respectively), which emphasize the importance of correcting (reconstructing) instrumental measurements of mean global temperature change (whether by NASA, NOAA, Hadley or others) for factors such as Arctic coverage and thermal inertia when using these data to validate the climate change projections of GCMs. Cowtan's Figure 2 illustrates how correcting for such factors changes the calculated transient climate response, TCR (a measure of climate sensitivity), from 1.3 (per Otto et al.) to 1.6 (per Cowtan), which is a dramatic difference.
http://www.skepticalscience.com/kevin_cowtan_agu_fall_2014.html

Quote: "Let's start by looking at the current version of our temperature reconstruction, created by separate infilling of the Hadley/CRU land and ocean data. The notable differences are that our reconstruction is warmer in the 2000's (due to rapid arctic coverage), and around 1940, and cooler in the 19th century due to poor coverage in HadCRUT4 (figure 1).
What impact do these differences have on our understanding of climate? The most important factor in determining the rate of climate change over our lifetimes is climate sensitivity, and in particular the Transient Climate Response (TCR). TCR measures how much global temperatures will change over a few decades due to a change in forcing, for example due to a change in greenhouse gas concentrations. It is therefore important from a policy perspective. We can look at the effect of our work on TCR estimates.
One widely reported estimate of TCR comes from a 2013 paper on climate sensitivity by Otto et al., from which figure 2(a) below is derived. The origin represents a reference period in the 19th century (specifically 1860-1879), while the data points represent the change in temperature (y-axis), against the forcing, or driver of climate change (x-axis), for the 1970s, 1980s, 1990s and 2000s.
The slope of a line through these points gives an estimate of how much temperature will change due to future changes in forcing. This is expressed in terms of the transient climate sensitivity (TCR), shown in figure 2(b). The Otto paper attracted some comment due to the TCR estimate being a little lower than is typically reported for climate models.
Note in particular the last datapoint, which lies almost on the line. The surface warming slowdown of the 2000s, commonly known as the 'hiatus', does not affect the estimate of climate sensitivity in the Otto et al. calculation.
How does our temperature reconstruction (Cowtan & Way 2014) affect this study? The answer is shown by the green points in figure 2(c). All the data points move upwards; this is actually due to the reference period in the 19th century being cooler in our data. The last data point moves further, reflecting the warmer temperatures in the 2000s. The transient climate sensitivity (TCR) increases accordingly.
One other feature of the Otto et al. calculation is that it ignores the thermal inertia of the system. In reality it takes a while for surface temperature to respond to a change in atmospheric composition: temperature change lags forcing. We can approximate this response by delaying the forcing a little (specifically by convolution with an exponential lag function with an e-folding time of 4 years, normalised to unit TCR). This gives the blue points in figure 2(d). The fit is a little better, and the TCR is now not far off from the models."
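To make the arithmetic behind the quote concrete, here is a minimal Python sketch of the Otto-style calculation: TCR is estimated from the zero-intercept slope of temperature change against forcing change, scaled by the forcing for doubled CO2, and the thermal-inertia variant lags the forcing by convolving it with an exponential response (e-folding time of 4 years, kernel normalised to unit gain). Note the decadal anomaly and forcing values below are illustrative placeholders, not the actual Otto et al. or Cowtan & Way data, and F_2x = 3.44 W/m^2 is an assumed doubled-CO2 forcing value.

```python
import numpy as np

# Illustrative decadal-mean values relative to an 1860-1879 baseline
# (placeholders only, NOT the published Otto et al. numbers):
dT = np.array([0.25, 0.40, 0.55, 0.75])  # temperature change, K (1970s..2000s)
dF = np.array([0.9, 1.3, 1.8, 2.1])      # forcing change, W/m^2

F_2x = 3.44  # assumed forcing for doubled CO2, W/m^2

# TCR from the slope of a zero-intercept least-squares line through
# the (dF, dT) points, as in figure 2(b):
slope = np.sum(dT * dF) / np.sum(dF * dF)  # K per W/m^2
tcr = F_2x * slope
print(f"TCR estimate: {tcr:.2f} K")

# Thermal inertia: temperature lags forcing, so delay the forcing by
# convolving an annual forcing series with an exponential response
# (e-folding time 4 years), normalised so the total gain is one:
years = np.arange(60)
forcing = 0.04 * years            # toy linear forcing ramp, W/m^2
tau = 4.0
kernel = np.exp(-years / tau)
kernel /= kernel.sum()            # unit gain, so TCR scaling is preserved
lagged = np.convolve(forcing, kernel)[:len(forcing)]
```

Because the kernel has unit gain, the lagged forcing trails the instantaneous forcing by a few years without changing its long-run magnitude, which is why the blue points in figure 2(d) shift slightly along the x-axis and tighten the fit.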