
Author Topic: Validation of GCM Models  (Read 49161 times)

Sciguy

  • Nilas ice
  • Posts: 1976
    • View Profile
  • Liked: 239
  • Likes Given: 188
Re: Validation of GCM Models
« Reply #100 on: May 25, 2021, 12:17:39 AM »
Quote

Sciguy,

you are incorrect in assuming that the 'wolfpack' grouping of high-sensitivity GCMs in CMIP6 is being inferred to be incorrect in this paper.

This paper references earlier work from Peter Caldwell that performed a verifiable check on parameters that could be validated.

The models they mention are specifically those earlier models, and of these, only the ones that include the trade-wind cumulus cloud effect.

It is therefore incorrect to assert that this specific study invalidates the higher sensitivities, since they may not correlate at all, and the portion that does correlate may be only a small component of the total higher sensitivity.

See:  https://journals.ametsoc.org/view/journals/clim/31/10/jcli-d-17-0631.1.xml?tab_body=pdf

Evaluating Emergent Constraints on Equilibrium Climate Sensitivity
Peter M. Caldwell, Mark D. Zelinka, and Stephen A. Klein
Print Publication: 15 May 2018

and: https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2019GL085782

which includes the caveat: "Because the number of models analyzed in this study (27 CMIP6 models from 19 distinct modeling centers) is much less than the 102 models from 35 centers expected to perform abrupt-4xCO2 and piControl experiments, we caution that some conclusions may change as more data become available."

Jai,

The Caldwell (2018) paper you refer to doesn't appear in the references of the Myers (2021) paper in my post. Perhaps you were thinking of this recent paper on emergent constraints that referenced Caldwell (2018). It found that ECS is likely to be between 1.9 and 3.4 C, which again appears to show that the "wolfpack" models have too high a climate sensitivity.

https://esd.copernicus.org/articles/11/737/2020/

Quote

Nijsse, F. J. M. M., Cox, P. M., and Williamson, M. S.: Emergent constraints on transient climate response (TCR) and equilibrium climate sensitivity (ECS) from historical warming in CMIP5 and CMIP6 models, Earth Syst. Dynam., 11, 737–750, https://doi.org/10.5194/esd-11-737-2020, 2020.

Abstract

Climate sensitivity to CO2 remains the key uncertainty in projections of future climate change. Transient climate response (TCR) is the metric of temperature sensitivity that is most relevant to warming in the next few decades and contributes the biggest uncertainty to estimates of the carbon budgets consistent with the Paris targets. Equilibrium climate sensitivity (ECS) is vital for understanding longer-term climate change and stabilisation targets. In the IPCC 5th Assessment Report (AR5), the stated “likely” ranges (16 %–84 % confidence) of TCR (1.0–2.5 K) and ECS (1.5–4.5 K) were broadly consistent with the ensemble of CMIP5 Earth system models (ESMs) available at the time. However, many of the latest CMIP6 ESMs have larger climate sensitivities, with 5 of 34 models having TCR values above 2.5 K and an ensemble mean TCR of 2.0±0.4 K. Even starker, 12 of 34 models have an ECS value above 4.5 K. On the face of it, these latest ESM results suggest that the IPCC likely ranges may need revising upwards, which would cast further doubt on the feasibility of the Paris targets.

Here we show that rather than increasing the uncertainty in climate sensitivity, the CMIP6 models help to constrain the likely range of TCR to 1.3–2.1 K, with a central estimate of 1.68 K. We reach this conclusion through an emergent constraint approach which relates the value of TCR linearly to the global warming from 1975 onwards. This is a period when the signal-to-noise ratio of the net radiative forcing increases strongly, so that uncertainties in aerosol forcing become progressively less problematic. We find a consistent emergent constraint on TCR when we apply the same method to CMIP5 models. Our constraints on TCR are in good agreement with other recent studies which analysed CMIP ensembles. The relationship between ECS and the post-1975 warming trend is less direct and also non-linear. However, we are able to derive a likely range of ECS of 1.9–3.4 K from the CMIP6 models by assuming an underlying emergent relationship based on a two-box energy balance model. Despite some methodological differences, this is consistent with a previously published ECS constraint derived from warming trends in CMIP5 models to 2005. Our results seem to be part of a growing consensus amongst studies that have applied the emergent constraint approach to different model ensembles and to different aspects of the record of global warming.




jai mitchell

  • Nilas ice
  • Posts: 2390
    • View Profile
  • Liked: 210
  • Likes Given: 62
Re: Validation of GCM Models
« Reply #101 on: May 25, 2021, 07:17:58 PM »
Sciguy,

The Caldwell paper from 2018 specifically looked at tropical low cloud feedbacks as a constraint.

Any attempt to place constraints on TCR using historical values denies the existence of future tipping points.

If you do enjoy reading the climate science, I would heartily recommend you do a full read of the Caldwell paper and, if you enjoy it, I always recommend: 

https://acp.copernicus.org/articles/11/13421/2011/acp-11-13421-2011.pdf

Earth’s energy imbalance and implications
J. Hansen, M. Sato, P. Kharecha, and K. von Schuckmann
2011

« Last Edit: May 25, 2021, 07:25:25 PM by jai mitchell »
Haiku of Futures Passed
My "burning embers"
are not tri-color bar graphs
+3C today

Sciguy

  • Nilas ice
  • Posts: 1976
    • View Profile
  • Liked: 239
  • Likes Given: 188
Re: Validation of GCM Models
« Reply #102 on: May 25, 2021, 10:15:57 PM »
Quote

Sciguy,

The Caldwell paper from 2018 specifically looked at tropical low cloud feedbacks as a constraint.

Any attempt to place constraints on TCR using historical values denies the existence of future tipping points.

If you do enjoy reading the climate science, I would heartily recommend you do a full read of the Caldwell paper and, if you enjoy it, I always recommend: 

https://acp.copernicus.org/articles/11/13421/2011/acp-11-13421-2011.pdf

Earth’s energy imbalance and implications
J. Hansen, M. Sato, P. Kharecha, and K. von Schuckmann
2011

Jai,

Caldwell 2018 looked at emergent constraints in CMIP3 and CMIP5 models and found that shortwave cloud feedback was "a dominant contributor to correlations with ECS because it is the largest source of intermodel spread in ECS."

The same team, with a different lead author, found that "positive cloud feedbacks from decreasing extratropical low cloud coverage and albedo" were the largest contributor to the differences in climate sensitivity in the CMIP6 models.

https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2019GL085782

Quote
Causes of Higher Climate Sensitivity in CMIP6 Models
Mark D. Zelinka, Timothy A. Myers, Daniel T. McCoy, Stephen Po-Chedley, Peter M. Caldwell, Paulo Ceppi, Stephen A. Klein, Karl E. Taylor
First published: 03 January 2020
https://doi.org/10.1029/2019GL085782

 Abstract

Equilibrium climate sensitivity, the global surface temperature response to CO2 doubling, has been persistently uncertain. Recent consensus places it likely within 1.5–4.5 K. Global climate models (GCMs), which attempt to represent all relevant physical processes, provide the most direct means of estimating climate sensitivity via CO2 quadrupling experiments. Here we show that the closely related effective climate sensitivity has increased substantially in Coupled Model Intercomparison Project phase 6 (CMIP6), with values spanning 1.8–5.6 K across 27 GCMs and exceeding 4.5 K in 10 of them. This (statistically insignificant) increase is primarily due to stronger positive cloud feedbacks from decreasing extratropical low cloud coverage and albedo. Both of these are tied to the physical representation of clouds which in CMIP6 models lead to weaker responses of extratropical low cloud cover and water content to unforced variations in surface temperature. Establishing the plausibility of these higher sensitivity models is imperative given their implied societal ramifications.

Now Caldwell, Myers and Zelinka, with a few others, have compared observations of cloud behavior, including extratropical low clouds, and found that the high sensitivity models don't reproduce the behavior properly.  Here's the link again:

https://www.nature.com/articles/s41558-021-01039-0

Quote
    Published: 13 May 2021

Observational constraints on low cloud feedback reduce uncertainty of climate sensitivity
Timothy A. Myers, Ryan C. Scott, Mark D. Zelinka, Stephen A. Klein, Joel R. Norris & Peter M. Caldwell
Nature Climate Change (2021)

Abstract

Marine low clouds strongly cool the planet. How this cooling effect will respond to climate change is a leading source of uncertainty in climate sensitivity, the planetary warming resulting from CO2 doubling. Here, we observationally constrain this low cloud feedback at a near-global scale. Satellite observations are used to estimate the sensitivity of low clouds to interannual meteorological perturbations. Combined with model predictions of meteorological changes under greenhouse warming, this permits quantification of spatially resolved cloud feedbacks. We predict positive feedbacks from midlatitude low clouds and eastern ocean stratocumulus, nearly unchanged trade cumulus and a near-global marine low cloud feedback of 0.19 ± 0.12 W m−2 K−1 (90% confidence). These constraints imply a moderate climate sensitivity (~3 K). Despite improved midlatitude cloud feedback simulation by several current-generation climate models, their erroneously positive trade cumulus feedbacks produce unrealistically high climate sensitivities. Conversely, models simulating erroneously weak low cloud feedbacks produce unrealistically low climate sensitivities.

To summarize:

In 2018 they (Caldwell, Myers, Zelinka and others) found that emergent constraints, including cloud behavior, can be used to reduce the estimated climate sensitivity of CMIP3 and CMIP5 models.

In 2020 they found that the key difference between the high-sensitivity CMIP6 models (higher than the long-established 4.5C upper bound) and the models within the long-established range of climate sensitivity (2C to 4.5C) was low cloud feedbacks (and the associated albedo changes).

In 2021, Caldwell, Myers and Zelinka (and others) found that emergent constraints, including cloud behavior, could be used to reduce the uncertainty range for climate sensitivity in CMIP6 models.  They found that observed cloud behavior in response to the warming seen to date implies a climate sensitivity around 3C.  The CMIP6 models with high sensitivity produced too much cloud feedback compared to observations.  (The climate models with low sensitivity produced too little.)

James Hansen is a great climate scientist.  I've read his work and highly recommend it.  Here is the link to the paper whose title you posted:

https://acp.copernicus.org/articles/11/13421/2011/acp-11-13421-2011.pdf

Here is how he and his co-authors described climate sensitivity in that paper published in 2011:

Quote
Fast-feedback climate sensitivity has been estimated in innumerable climate model studies, most famously in the Charney et al. (1979) report that estimated equilibrium global warming of 3°C ± 1.5°C for doubled CO2 (a forcing of 4 W m−2), equivalent to 0.75°C ± 0.375°C per W m−2. Subsequent model studies have not much altered this estimate or greatly reduced the error estimate, because of uncertainty as to whether all significant physical processes are included in the models and accurately represented. The range of model results in the IPCC (Randall et al., 2007) report was 2.1–4.4°C for doubled CO2. A recent analysis (Schmittner et al., 2011) reported a median sensitivity 2.3°C for doubled CO2, but their result becomes 3.0°C (see their Supporting Material) when the glacial-interglacial aerosol change is categorized as a fast feedback for consistency with other studies, as discussed by Hansen and Sato (2012).

Empirical assessment of the fast-feedback climate sensitivity can be extracted from glacial-interglacial climate oscillations, during which Earth was in quasi-equilibrium with slowly changing boundary forcings (Hansen and Sato, 2012). This assessment depends on knowledge of global temperature change and the GHG and surface albedo forcings, the latter depending mainly upon ice sheet size and thus upon sea level. Hansen and Sato (2012) use data for the past 800 000 yr to conclude that the fast-feedback sensitivity is 0.75°C ± 0.125°C per W m−2, which is equivalent to 3°C ± 0.5°C for doubled CO2. This 1-σ error estimate is necessarily partly subjective. We employ fast-feedback climate sensitivity 0.75°C per W m−2 in our present study.

So although Hansen used higher climate sensitivities in his earlier models, by 2011 he was using 3C.  His successors at NASA GISS (Hansen is now retired) have two CMIP6 models, one with an ECS of 2.4 (GISS-E2-2-G) and one with an ECS of 3.1 (GISS-E2-1-H).

And Myers, Caldwell and Zelinka have shown that using cloud feedbacks as an emergent constraint, the climate sensitivity is around 3C.

jai mitchell

  • Nilas ice
  • Posts: 2390
    • View Profile
  • Liked: 210
  • Likes Given: 62
Re: Validation of GCM Models
« Reply #103 on: May 27, 2021, 01:09:27 AM »
Quote
In 2018 they (Caldwell, Myers, Zelinka and others) found that emergent constraints, including cloud behavior, can be used to reduce the estimated climate sensitivity of CMIP3 and CMIP5 models.

The 2018 paper actually found four constraints that were valid and verifiable, but the models associated with them, which included those feedbacks, produced a sensitivity greater than 4.0C. So this statement above is not correct.

The problem with the emergent-constraints work on model sensitivity projections is that the processes become chaotic and poorly defined as we approach higher forcing and surface temperatures.  For instance, I live in Northern California near the coast; we just took tree rings from a 50-year-old oak tree, so we now have a wet-season record for the last 50 years.

It turns out that, for this microclimate region, we have had a severe drought (the worst in the 50-year record) for the last 10+ years.  This goes far beyond what the simple precipitation record shows, since it also implies much more soil drying and less marine-layer intrusion, as warmer surface temperatures destroy the fog belt, a major source of soil moisture as well as albedo in the Pacific Northwest.

In other words, the historical climate narrative for the last 50 years is not as important as the last 5 when it comes to future implications of fast feedback responses not included in the current GCMs.
Haiku of Futures Passed
My "burning embers"
are not tri-color bar graphs
+3C today

jai mitchell

  • Nilas ice
  • Posts: 2390
    • View Profile
  • Liked: 210
  • Likes Given: 62
Re: Validation of GCM Models
« Reply #104 on: October 29, 2021, 08:08:21 PM »
This is a great presentation by Mark Zelinka at Lawrence Livermore National Lab on climate models and high ECS and whether or not they are accurate (we think).

Haiku of Futures Passed
My "burning embers"
are not tri-color bar graphs
+3C today

jai mitchell

  • Nilas ice
  • Posts: 2390
    • View Profile
  • Liked: 210
  • Likes Given: 62
Re: Validation of GCM Models
« Reply #105 on: December 29, 2021, 08:45:57 PM »
This graphic is intended to show what our current trajectory of emissions will produce over geological time scales.

However, it only shows about 3C of peak warming, even though the CO2-equivalent forcing is about 2.5 times a CO2 doubling!

Talking with the author, the ECS of the model used was 3.5C.

Something isn't right in the models.
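
As a rough back-of-the-envelope check of the mismatch being described (my arithmetic, not the author's): if equilibrium warming simply scaled linearly with forcing, an ECS of 3.5C combined with a sustained forcing of about 2.5 CO2 doublings would imply roughly 3.5 x 2.5 ≈ 8.75C of eventual warming, far above the ~3C peak shown. Part of that gap is expected, since peak warming from a transient emissions pulse is governed more by the transient response and by ocean and land carbon uptake than by ECS alone, but the size of the discrepancy is the point being made here.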
Haiku of Futures Passed
My "burning embers"
are not tri-color bar graphs
+3C today

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #106 on: December 30, 2022, 03:59:48 AM »
Scientists Use Powerful New Climate Model to Recreate Iconic ‘Blue Marble’ Photograph
https://www.extremetech.com/extreme/341798-scientists-use-powerful-new-climate-model-to-recreate-iconic-blue-marble-photograph



Apollo 17 in 1972 was the most recent crewed mission to leave Earth behind in its voyage to the Moon. The mission is notable for another reason: the famous “Blue Marble” photograph of Earth fully illuminated by the sun. This stunning image supercharged the nascent environmental movement in the early 1970s and has been reproduced frequently over the decades.

Now, it’s being reproduced in a new way — scientists from the Max Planck Institute for Meteorology attempted to recreate the Blue Marble shot from scratch using a new climate model. The results, as you can see above, are impressive. The simulation on the right is a dead ringer for the authentic photo on the left.

... In late 2022, the institute completed work on a model known as ICON that can simulate fully coupled climate systems at kilometer scale. Climate scientists believe that being able to resolve atmospheric processes at this level of detail is essential to understanding how global warming is reshaping the globe. With the 50-year anniversary of the iconic Blue Marble photo, the MPI-M team decided to test the model by recreating the image using weather data from 1972.

https://mpimet.mpg.de/en/communication/news/single-news/neuer-blick-auf-das-blue-marble-foto-icon-simuliert-das-gekoppelte-klimasystem-mit-1-km-aufloesung



MPI-M partnered with Nvidia and the German Climate Computing Center to run the simulation, beginning with a spun-up ocean two days before the famous image was captured. The institute describes it as a two-day forecast arriving fifty years too late. By simulating the physics of surface winds, water currents, and cloud fields, the team ended up with something that very closely matches the real photograph.

The image above was generated with the Nvidia Omniverse platform with internal RTX ray tracing. And it’s not just a photograph. The simulation is a fully formed world, featuring details not recorded by the Apollo crew’s camera. The simulation allowed the team to go beyond the superficial, studying the waves of warm water emanating from the African coast and subsurface eddies that rise upward as the sun heats the ocean. The team expressed pride that ICON was able to so closely mirror the Blue Marble, particularly given the rudimentary weather data from 50 years ago in the southern hemisphere. This work shows that ICON could become a key tool in understanding Earth’s changing climate.
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #107 on: January 04, 2023, 07:24:03 PM »
Tipping Points Complicate the Evaluation of Complex Climate Models
https://phys.org/news/2023-01-complicate-complex-climate.html

An analysis by Robbin Bastiaansen and Anna von der Heydt of Utrecht University, the Netherlands, and Peter Ashwin of the University of Exeter, UK, indicates that it might remain difficult to accurately find the equilibrium climate sensitivity in complex climate models. The equilibrium climate sensitivity is used to compare and evaluate models and is calculated using a limited set of data from a relatively short simulation. But such results could be heavily underestimating long-term warming, as late climate tipping cannot be excluded by the commonly used methods to estimate equilibrium climate sensitivity, the authors conclude. The work is part of the European TiPES project on tipping points in the Earth system.

The equilibrium climate sensitivity is an important number in climate science because it is well suited for the comparison and evaluation of climate models. The number is defined as the total rise in global mean temperature after a doubling of CO2 in the atmosphere. Because the Earth system is large and complex, reaching a final equilibrium temperature takes thousands of years.

State-of-the-art climate models, however, require months of calculations on supercomputers to simulate even 150 years of climate change. Therefore, it is not feasible to have the models run for years on end to simulate thousands of years of climate change in order to find the model's equilibrium climate sensitivity.

Instead, a simpler method is used: after a model has simulated a couple of hundred years of climate evolution, the data are collected and then used to estimate how much the mean global temperature would go up if the model were allowed to run until the equilibrium temperature was reached.
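
For readers unfamiliar with it, the shortcut the article alludes to is a Gregory-style regression: in an abrupt-CO2-increase run, the net top-of-atmosphere imbalance N is regressed against the warming dT, and the fitted line is extrapolated to N = 0 to estimate the equilibrium warming. A minimal sketch in Python with synthetic numbers (not output from any real model):

Code:
import numpy as np

# Synthetic abrupt-4xCO2 experiment: warming relaxes toward a true equilibrium
# of 6.0 K while the top-of-atmosphere (TOA) imbalance N decays to zero.
years = np.arange(1, 151)
true_eq_4x = 6.0                               # K, equilibrium warming for 4xCO2 in this toy
forcing_4x = 7.4                               # W m-2
lam = forcing_4x / true_eq_4x                  # feedback parameter, W m-2 K-1
dT = true_eq_4x * (1 - np.exp(-years / 30.0))  # single-timescale response
N = forcing_4x - lam * dT                      # energy balance: N = F - lambda * dT

# Gregory regression: fit N against dT and extrapolate to N = 0.
slope, intercept = np.polyfit(dT, N, 1)
dT_eq_estimate = -intercept / slope            # x-intercept of the fitted line
print(f"estimated 4xCO2 warming: {dT_eq_estimate:.2f} K, ECS ~ {dT_eq_estimate / 2:.2f} K")

With a single timescale and strictly linear feedbacks, as in this toy, the extrapolation is exact; the paper's point is that slow modes and late tipping points break that linearity, so a 150-year fit can look perfectly good while the true equilibrium lies much higher.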

However, this commonly used method might be underestimating temperature rise. As Bastiaansen and the team illustrate in the study "Climate Response and Sensitivity: Timescales and Late Tipping Points," published today in Proceedings of the Royal Society A, these methods can fail in simple climate models, and therefore also might be inadequate for larger, state-of-the-art climate models.

One problem the group identifies is that climate models, as well as the real climate system, might show a sudden fast temperature increase even after years of a seemingly stable climate. In other words, an abrupt transition in a part of the climate system later than 150 years out, such as the partial collapse of an ice sheet or sudden desertification of a large part of a continent, can greatly influence the global mean temperature, and current methods are ill-suited to estimate the resulting warming.

Climate Response and Sensitivity: Timescales and Late Tipping Points, Proceedings of the Royal Society A Mathematical Physical and Engineering Sciences (2023)
https://royalsocietypublishing.org/doi/10.1098/rspa.2022.0483
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

gerontocrat

  • Multi-year ice
  • Posts: 21062
    • View Profile
  • Liked: 5322
  • Likes Given: 69
Re: Validation of GCM Models
« Reply #108 on: January 06, 2023, 02:00:34 PM »
Tipping points
Some extracts from the science paper below, on which the phys.org article posted by vox_mundi above is based. It is open access.

Note: The paper uses maths well above my pay grade (even all those many years ago when my ancient maths degree was freshly minted). It is the suggestions in the introduction (italicised by me) that are not only alarming but strike a chord (especially since my little projections algorithm on Antarctic sea ice was recently demolished by extreme sea ice losses in December).

https://royalsocietypublishing.org/doi/10.1098/rspa.2022.0483
Climate response and sensitivity: time scales and late tipping points
Quote
Abstract
Climate response metrics are used to quantify the Earth’s climate response to anthropogenic changes of atmospheric CO2. Equilibrium climate sensitivity (ECS) is one such metric that measures the equilibrium response to CO2 doubling. However, both in their estimation and their usage, such metrics make assumptions on the linearity of climate response, although it is known that, especially for larger forcing levels, response can be nonlinear. Such nonlinear responses may become visible immediately in response to a larger perturbation, or may only become apparent after a long transient period. In this paper, we illustrate some potential problems and caveats when estimating ECS from transient simulations. We highlight ways that very slow time scales may lead to poor estimation of ECS even if there is seemingly good fit to linear response over moderate time scales. Moreover, such slow processes might lead to late abrupt responses (late tipping points) associated with a system’s nonlinearities. We illustrate these ideas using simulations on a global energy balance model with dynamic albedo. We also discuss the implications for estimating ECS for global climate models, highlighting that it is likely to remain difficult to make definitive statements about the simulation times needed to reach an equilibrium.

1. Introduction
The central question as to how the climate is likely to change as a function of anthropogenic CO2 emissions can be posed as ‘How does an observation of the climate system respond to changes in its radiative forcing induced by changes in atmospheric CO2?’. This question has been studied in various ways for over a century [1,2], although efforts to answer it became more intense and in-depth over the last decades. Among early efforts was the pioneering work by Charney et al. in 1979, who made the first estimates of expected equilibrium warming after doubling of atmospheric CO2 (while keeping vegetation and land ice fixed at present-day values) using a numerical global climate model (GCM) [3]. This metric has later been named the equilibrium climate sensitivity (ECS) and is still widely used. Since then, researchers have developed a number of different metrics that measure climate response to different scenarios of anthropogenic change in CO2 and have incorporated information from other sources besides computer models, including historical observations and data from palaeoclimate records. Recently, these efforts were summarized in an assessment of the World Climate Research Programme [4] that synthesized different quantifications of climate response using these different lines of evidence and led to the headline that the Earth’s ECS is likely between 2.6 K and 3.9 K.

One of the hurdles for this assessment was the variety of definitions of (the quantification of) climate sensitivity—and ECS especially—in the literature. The root of this problem can be attributed to the lack of data on equilibrium climate states or detailed long-term transient data. This can be due to low time resolutions in proxy data, lack of observational data or insufficient computing power to equilibrate modern GCMs. Consequently, equilibrium properties need to be estimated from incomplete datasets, leading to many slightly different ways to quantify climate sensitivity. Common to them all, however, is the need to extrapolate long-term dynamics from data on shorter time scales. In this paper, we discuss this extrapolation process, focusing on estimates of ECS using (idealized) experiments in climate models for the sake of mathematical simplicity. Of particular interest here is the exploration of linear and nonlinear, dynamics that can emerge in multi-scale dynamical systems, and their problematic effects on extrapolation.

The common way to obtain estimates of ECS in climate models involves the use of extrapolation and regression methods on non-equilibrated transient simulations—typically of 150-year long runs. Values for ECS obtained in this way are now often referred to as the effective climate sensitivity [5] signalling that it might not encompass all long-term climate change. Although there are many different ways to perform such extrapolation, they are usually based on linear concepts and frameworks. A recent review [6] of climate sensitivity highlighted that it is a key challenge to study the limits of such linear frameworks. Here, we will investigate these limits and in the process highlight the trade-offs that need to be made when designing experiments to quantify ECS: in order to measure a clear signal of warming in relation to the noise of natural variations, large perturbations are desirable but precisely in the case of larger perturbations the nonlinear behaviour becomes important and linear frameworks break down.

One of the most important tools to study past and future climate change are the GCMs used in the Coupled Model Intercomparison Projects (CMIP, e.g. [7]), because they provide a globally complete and detailed representation of the climate state while (approximately) satisfying the physical laws. However, specifically for these large models there is no way to determine whether a model really has arrived in the linear regime near an equilibrium, or even if such an equilibrium exists. In this paper we explore some simple conceptual examples of the potential nonlinear dynamics of the climate. We also make a number of observations that we hope illuminate some of the limitations of linear frameworks:

(i)   
We highlight cases where there may be strong dependence on the climate background state and the forcing levels.

(ii)   
We highlight examples where there may be a good fit to transient data but poor extrapolation preventing an accurate estimation of the ECS.

(iii)   
We show that nonlinear systems can have slow tipping points. When these are crossed the tipping dynamics play out on slow time scales, and it can take arbitrarily long times before nonlinear and/or asymptotic behaviour is observed.

(iv)   
We demonstrate how in the presence of multiple time scales with nonlinear feedbacks a late tipping can occur in which fast processes suddenly dominate after arbitrarily long slow transient behaviour. This highlights the potential for slow and/or late tipping points to be particular obstructions to estimating ECS.
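
For intuition, here is a minimal Python sketch (my own toy with made-up parameters, not the model used in the paper) of the kind of global energy balance model with dynamic albedo the authors refer to. Sweeping the forcing slowly up and back down shows a jump to a much warmer state once a threshold is crossed, and hysteresis on the way back, exactly the sort of nonlinearity that a short, seemingly linear transient cannot reveal.

Code:
import numpy as np

# Toy zero-dimensional energy balance model with temperature-dependent albedo.
S, sigma, eps, C = 1361.0, 5.67e-8, 0.5145, 4.0e8   # W m-2, SB constant, emissivity, J m-2 K-1

def albedo(T):
    # Smooth transition from a colder, brighter state (0.45) to a warmer,
    # darker state (0.25), centred at 295 K.
    return 0.25 + 0.20 / (1.0 + np.exp((T - 295.0) / 5.0))

def equilibrate(T, F, years=3000, dt=86400.0 * 30):
    # Time-step C dT/dt = S(1 - albedo)/4 - eps*sigma*T^4 + F to a steady state.
    for _ in range(int(years * 12)):
        T += dt * (S * (1.0 - albedo(T)) / 4.0 - eps * sigma * T**4 + F) / C
    return T

T = 288.0
print(" F (W m-2)   up-sweep T (K)")
for F in np.arange(0.0, 2.01, 0.5):
    T = equilibrate(T, F)
    print(f"{F:9.1f}   {T:13.2f}")
print(" F (W m-2)   down-sweep T (K)")
for F in np.arange(1.5, -2.01, -0.5):
    T = equilibrate(T, F)
    print(f"{F:9.1f}   {T:13.2f}")

A regression over the cold branch alone would suggest a modest, smooth sensitivity to forcing; it says nothing about the fold where the cold state disappears.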
"Para a Causa do Povo a Luta Continua!"
"And that's all I'm going to say about that". Forrest Gump
"Damn, I wanted to see what happened next" (Epitaph)

Richard Rathbone

  • Nilas ice
  • Posts: 1765
    • View Profile
  • Liked: 390
  • Likes Given: 24
Re: Validation of GCM Models
« Reply #109 on: January 06, 2023, 02:43:58 PM »
These sorts of arguments are one reason I'd like to see everyone, and not just Hansen, publish the full response curve as well as the end value. I've read papers on cloud modelling that find tipping points in the 1000-2000 ppm region and models can agree they are there, disagree to a trivial extent about when they activate, and one will include the tipping point in its ECS number and the other won't.


kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #110 on: January 06, 2023, 05:08:47 PM »
The problem is that we don't really have data to compare the full curve to. The most recent data is of course from ice-age bounds, which can only give us the lower values of the response. Of course, we have also produced a very strong greenhouse-gas push, which makes things more complicated.

This is also because warming by greenhouse gases is different from the ice-age forcings.

There was one study in which a certain cloud type disappeared at 1200 ppm. But this is the boundary they could work out from the rainfall data plus models. Of course the models are not 100% correct, so that error is in there. And the data only give us the off switch, but there is no such switch for clouds; they will just react to the temperature push.

For models we need more accurate calculations of clouds at finer scales, which is very computationally intensive.

I still think we should have built a very powerful computer center to run some high-resolution climate modelling to do such things.
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

The Walrus

  • Young ice
  • Posts: 2966
    • View Profile
  • Liked: 154
  • Likes Given: 498
Re: Validation of GCM Models
« Reply #111 on: January 06, 2023, 10:43:48 PM »
Quote

These sorts of arguments are one reason I'd like to see everyone, and not just Hansen, publish the full response curve as well as the end value. I've read papers on cloud modelling that find tipping points in the 1000-2000 ppm region and models can agree they are there, disagree to a trivial extent about when they activate, and one will include the tipping point in its ECS number and the other won't.

Tipping points are notoriously difficult to pinpoint.  Once reached, the model fails, as the tipping point leads to a new regime outside the modelling parameters.  Including a tipping point in an ECS determination can result in some rather extreme values, well outside the normal distribution.  But, then again, that is why they are tipping points, as the entire regime changes.

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #112 on: January 06, 2023, 11:14:55 PM »
Amazon Rainforest Deforestation Is Influencing Weather In Tibet
https://phys.org/news/2023-01-amazon-rainforest-deforestation-weather-tibet.html



An international team of climate scientists has found evidence suggesting that deforestation in the Amazon rainforest is influencing weather in Tibet, more than 15,000 kilometers away. In their paper published in the journal Nature Climate Change, the researchers describe possible long-range impacts of deforestation of the Amazon rainforest. Valerie Livina, with the U.K.'s National Physical Laboratory, has published a News & Views piece in the same journal issue outlining the Hopf bifurcation theory and how it relates to climate tipping points and the work done by the team on this new effort.

... In this new effort, the researchers note that cutting down the forest has been going on for decades, and climate data has been gathered during the same time period. They wondered what impact the slowly diminishing rainforest might have on distant regions around the globe. To that end, they obtained and analyzed global climate data covering the years 1979 to 2019, looking for associations.

They were surprised to find that due to tree loss, warmer temperatures in the Amazon correlated with rising temperatures in Tibet and the West Antarctic ice sheet. They also found that when it rained more in the Amazon, there tended to be less precipitation in both of the other two regions.

The researchers were able to trace the route of climate change as the size of the rain forest grew smaller. Its approximate path, they saw, could be charted first to southern Africa, and then on up to the Arabian Peninsula and finally over to Tibet. The trip was found to take just a little over two weeks.

This finding, the researchers note, suggests that if a tipping point is reached in the Amazon, it could create a tipping point in Tibet, where temperatures and rainfall would be permanently impacted. They note that prior research has already shown that warming is proceeding faster in Tibet and the Arctic than the global average.



Teng Liu et al, Teleconnections among tipping elements in the Earth system, Nature Climate Change (2023).
https://www.nature.com/articles/s41558-022-01558-4

Valerie N. Livina, Connected climate tipping elements, Nature Climate Change (2023).
https://www.nature.com/articles/s41558-022-01573-5

------------------------------------------------------

Abstract

... Here, we propose a climate network approach to analyse the global impacts of a prominent tipping element, the Amazon Rainforest Area (ARA). We find that the ARA exhibits strong correlations with regions such as the Tibetan Plateau (TP) and West Antarctic ice sheet. Models show that the identified teleconnection propagation path between the ARA and the TP is robust under climate change. In addition, we detect that TP snow cover extent has been losing stability since 2008. We further uncover that various climate extremes between the ARA and the TP are synchronized under climate change. Our framework highlights that tipping elements can be linked and also the potential predictability of cascading tipping dynamics.

... The TP has attracted much attention due to its unique geological structure, irreplaceable role in global water storage and impact on the global climate system.

... the TP has been losing stability and approaching a tipping point since 2008.


----------------------------------------------------

Brazilian Amazon Deforestation Up 150% In Bolsonaro's Last Month
https://phys.org/news/2023-01-brazilian-amazon-deforestation-bolsonaro-month.html

Deforestation in the Brazilian Amazon rose 150 percent in December from the previous year, according to government figures released Friday, a final bleak report for far-right ex-president Jair Bolsonaro in his last month in office.

Satellite monitoring detected 218.4 square kilometers (84.3 square miles) of forest cover destroyed in Brazil's share of the world's biggest rainforest last month, according to the national space agency's DETER surveillance program.

The area—nearly four times the size of Manhattan—was up more than 150 percent from the 87.2 square kilometers destroyed in December 2021, according to the agency, INPE.

It was the third-worst December on record for the eight-year-old DETER program, after 2017 and 2015.

Deforestation in 2022 was also at or near record highs during the crucial dry-season months of August, September and October, when clear-cutting and fires often surge because of drier weather.

Experts say the destruction is mainly driven by farms and land grabbers clearing the forest for cattle and crops.
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #113 on: January 06, 2023, 11:24:03 PM »
Quote from: kassy
I still think we should have built a very powerful computer center to run some high-resolution climate modelling to do such things.
What do you think we've been running on the world's supercomputers for the past 20 years?
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #114 on: January 06, 2023, 11:58:45 PM »
The scale needed to resolve clouds is much smaller than the scale we have used for climate models over the last 20 years.

So we have been running climate models that do not accurately model clouds for the last 20 years. They also do not include accurate modelling of permafrost, or of how the whole body behaves under current climate pressures.




Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

Richard Rathbone

  • Nilas ice
  • Posts: 1765
    • View Profile
  • Liked: 390
  • Likes Given: 24
Re: Validation of GCM Models
« Reply #115 on: January 11, 2023, 08:06:44 PM »
Sanderson, B. M. and Rugenstein, M.: Potential for bias in effective climate sensitivity from state-dependent energetic imbalance, Earth Syst. Dynam., 13, 1715–1736, https://doi.org/10.5194/esd-13-1715-2022, 2022.

Quote
To estimate equilibrium climate sensitivity from a simulation where a step change in carbon dioxide concentrations is imposed, a common approach is to linearly extrapolate temperatures as a function of top-of-atmosphere energetic imbalance to estimate the equilibrium state (“effective climate sensitivity”). In this study, we find that this estimate may be biased in some models due to state-dependent energetic leaks. Using an ensemble of multi-millennial simulations of climate model response to a constant forcing, we estimate equilibrium climate sensitivity through Bayesian calibration of simple climate models which allow for responses from subdecadal to multi-millennial timescales. Results suggest potential biases in effective climate sensitivity in the case of particular models where radiative tendencies imply energetic imbalances which differ between pre-industrial and quadrupled CO2 states, whereas for other models even multi-thousand-year experiments are insufficient to predict the equilibrium state. These biases draw into question the utility of effective climate sensitivity as a metric of warming response to greenhouse gases and underline the requirement for operational climate sensitivity experiments on millennial timescales to better understand committed warming following a stabilization of greenhouse gases.

A couple of things looked at in this paper.

How long do you have to run a model to be sure you have the right asymptote to extrapolate to its equilibrium state? - For some GCMs it's longer than they are actually run, and longer than it's practicable to run them, with consequent bias in the published ECS values.

Quote
Our results highlight the potential for error in the estimation of effective climate sensitivity through the assumptions on the asymptotic radiative balance of climate models. In the case of LongRunMIP, there is a significant difference between the distribution of fitted asymptotic values of energetic imbalance in ABRUPT4X compared with the mean energetic balance in PICTRL in 11 of 15 models (see Table 4). In 5 out of 15 cases, this results in a bias in effective climate sensitivity of 0.3 K or more, but this bias is not universally in the same direction. Quantifying the presence of such biases in the wider CMIP6 ensemble is not possible without multi-thousand-year control and ABRUPT4X simulations. However, their relatively common occurrence in LongRunMIP suggests that more models could be impacted.

Can you trust that the energy balance is being maintained accurately enough that the equilibrium state is meaningful? - For some GCMs you can't.

Quote
our results here highlight another issue, namely, that EffCS can only be used if we can be confident in the asymptotic energetic balance of the model. Such confidence can arise either from a ground-up demonstration of structural energy conservation in the model (Hobbs et al., 2016) or by running sufficiently long simulations to be empirically confident both in the pre-industrial energetic balance and in the asymptotic multi-millennial tendencies of the model following a change in climate forcing. Such experiments are currently difficult to achieve for CMIP class models; the multi-millennial-year simulations conducted in Rugenstein et al. (2020) were significantly longer than any experiments conducted previously, and we find in the present study that even a 1300-year simulation is too short to have confidence in the asymptotic state for some models.

Perhaps I shouldn't be, but I'm amazed they found stuff that means this needs to be said (and also that the models that fail this check are taken seriously for anything, not just EffCS).

Quote
Given this, our study has multiple recommendations. Firstly, a greater emphasis in climate model design and quality checking needs to be placed on structural closure of the energy budget in the climate system. Models which can demonstrate that energy is conserved in the model equations can allow confidence that the system as a whole will converge to a state of true radiative equilibrium following a perturbation, which would allow a robust calculation of EffCS.
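
A minimal synthetic illustration (my own toy numbers, not anything from the paper) of the first issue: if a model's energy budget does not close, so that its true equilibrium sits at a nonzero TOA imbalance, then extrapolating the usual N-versus-dT regression to N = 0 lands at a state the model never actually reaches.

Code:
import numpy as np

# Toy abrupt-4xCO2 response with a state-dependent energy leak.
F4x = 7.4      # forcing, W m-2
lam = 1.2      # radiative feedback seen at the TOA, W m-2 K-1
leak = 0.25    # non-radiative energy leak that grows with warming, W m-2 K-1
C = 8.0        # effective heat capacity, W yr m-2 K-1

# The model warms according to C dT/dt = F - (lam + leak) * dT,
# but the diagnosed TOA imbalance only "sees" N = F - lam * dT.
years = np.arange(1, 151)
tau = C / (lam + leak)
dT = F4x / (lam + leak) * (1.0 - np.exp(-years / tau))
N = F4x - lam * dT

# Standard effective-climate-sensitivity estimate: extrapolate N vs dT to N = 0.
slope, intercept = np.polyfit(dT, N, 1)
effcs = (-intercept / slope) / 2.0        # halve the 4xCO2 value for 2xCO2
true_ecs = F4x / (lam + leak) / 2.0       # where this toy model actually stops warming
print(f"EffCS from the 150-yr regression: {effcs:.2f} K")
print(f"toy model's actual equilibrium ECS: {true_ecs:.2f} K")

Here the leak makes the regression overestimate the toy model's own equilibrium warming by about half a degree; as the paper notes, in real models the bias can go either way, and diagnosing it at all needs far longer runs than the standard 150 years.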


kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #116 on: January 12, 2023, 03:46:08 PM »
Well, that should help, but overall it is a clunky metric. In a way the end result does not matter, because even if you have the actual value, how does that relate to the shorter-term consequences?
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

RoxTheGeologist

  • Grease ice
  • Posts: 629
    • View Profile
  • Liked: 188
  • Likes Given: 149
Re: Validation of GCM Models
« Reply #117 on: January 12, 2023, 04:57:05 PM »

Hmm - shouldn't showing convergence to energy balance be the first tick box?

The Walrus

  • Young ice
  • Posts: 2966
    • View Profile
  • Liked: 154
  • Likes Given: 498
Re: Validation of GCM Models
« Reply #118 on: January 12, 2023, 06:47:27 PM »

Quote

Hmm - shouldn't showing convergence to energy balance be the first tick box?

Yes, that would be a major confirmation of validity.  Short-term conditions can vary significantly.

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #119 on: January 25, 2023, 04:05:17 PM »
Microscopic dust from desert storms has been hiding the true extent of global warming, study finds

Desert storms that have sent massive plumes of dust across the oceans may have a small but significant effect on global temperatures, scientists say. New research found the microscopic particles circulating through the atmosphere had a "slight overall cooling effect on the planet" that masked just how much the planet has truly warmed over recent decades.

The UCLA research, published in Nature Reviews Earth & Environment on Tuesday, found that the amount of atmospheric dust has increased by about 55% since pre-industrial times, with many ups and downs along the way. According to lead study author Jasper Kok, that increase is likely due to changes in global climate, such as wind speeds in some deserts, as well as land-use changes, such as transforming land into agriculture and diverting water for irrigation.

But the researchers say the impact of that dust has not been adequately factored into studies of global temperature trends. The overall increase in dust, according to Kok, "could have masked up to 8% of the greenhouse warming" that's taken place since the Industrial Revolution.

...

But what we do know is that the planet has already warmed by about 2.2 degrees Fahrenheit (1.2 degrees Celsius) since the mid-1800s, with the past eight years, from 2013 to 2022, the hottest in recorded history. And Kok says if the dust had not increased, global temperatures would likely be another 0.1 degrees Fahrenheit higher.

https://www.cbsnews.com/news/climate-change-microscopic-dust-from-desert-storms-global-warming-study/

Mineral dust aerosol impacts on global climate and climate change, Nature Reviews Earth & Environment (2023)
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #121 on: January 26, 2023, 06:05:01 PM »
Which also has better graphics. So I put it here to show that this is another factor which throws off the models. Did have that déjà vu vibe.  ;)

Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #122 on: January 31, 2023, 01:39:32 AM »
AI: World Likely to Hit Critical Warming Threshold In 10-12 Years
https://phys.org/news/2023-01-ai-world-key-threshold-.html

The world will likely breach the internationally agreed-upon climate change threshold in about a decade, and keep heating to break through a next warming limit around mid-century even with big pollution cuts, artificial intelligence predicts in a new study that's more pessimistic than previous modeling.

The study estimates that the planet could reach 1.5 degrees Celsius of warming above pre-industrial levels in a decade, and found a “substantial possibility” of global temperature rises crossing the 2-degree threshold by mid-century, even with significant global efforts to bring down planet-warming pollution.

Data shows average global temperature has already risen around 1.1 to 1.2 degrees since industrialization.

“Our results provide further evidence for high-impact climate change, over the next three decades,” noted the report, published on Monday in the journal Proceedings of the National Academy of Sciences.

Under the 2015 Paris Climate Agreement, countries have pledged to limit global warming to well below 2 degrees – and preferably to 1.5 degrees – compared to pre-industrial levels.

Scientists have identified 1.5 degrees of warming as a key tipping point beyond which the chances of extreme flooding, drought, wildfires and food shortages will increase dramatically.

Temperature rises over 2 degrees could bring catastrophic and potentially irreversible impacts, including pushing three billion people into “chronic water scarcity.”


"There will come a time when we call the 1.5C target for maximum warming dead, beyond the shadow of a doubt," Brown University environment institute director Kim Cobb, who wasn't part of the study, said in an email interview. "And this paper may be the beginning of the end of the 1.5C target."

The study used artificial neural networks – a type of machine learning or artificial intelligence – which scientists trained on climate models and then used historical observations of temperature around the world “as independent input from which the AI makes a prediction,” said Noah Diffenbaugh, a professor at Stanford University and a co-author on the study.

Diffenbaugh and his co-author Elizabeth Barnes, a professor at Colorado State University, assessed three different scenarios: Low, medium and high “forcing” climate pathways, which refer to the intensity of the heating caused by greenhouse gases in the atmosphere.
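
To make the approach a bit more concrete, here is a toy Python sketch of the general idea (synthetic trajectories and a plain least-squares fit standing in for the paper's neural networks and climate-model training data): learn "years remaining until 1.5C" from a short window of warming across an ensemble of model-like runs, then apply the fitted predictor to a pretend "observed" window.

Code:
import numpy as np

rng = np.random.default_rng(0)

def trajectory(rate, n_years=250, noise=0.05):
    # Linear warming at `rate` K per year plus interannual noise.
    return rate * np.arange(n_years) + noise * rng.standard_normal(n_years)

# Build a synthetic training ensemble of "climate model" runs.
X, y = [], []
for rate in np.linspace(0.008, 0.02, 50):
    t = trajectory(rate)
    X.append(t[20:40])                     # a 20-year window of warming
    y.append(np.argmax(t >= 1.5) - 40)     # years remaining after the window
X, y = np.array(X), np.array(y)

# Ordinary least squares in place of the paper's neural network.
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

obs = trajectory(0.015)[20:40]             # pretend "observations"
print(f"predicted years until 1.5 C: {np.r_[obs, 1.0] @ coef:.0f}")

The actual study trains neural networks on spatial temperature fields from climate-model simulations under several forcing scenarios; the sketch only shows the train-on-models, predict-from-observations structure.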

In all three scenarios, the scientists estimated that the world would hit 1.5°C of warming between 2033 and 2035, even if planet-warming pollution is substantially reduced.



Diffenbaugh said there's been so much warming already that it really doesn't matter how pollution is cut in the next several years; the world will hit 1.5, the AI figures. While “individual years are likely to reach 1.5 degrees sooner,” their predictions “are focused on how long until the global mean temperature was warmed 1.5 degrees.”

The study’s prediction is in line with previous models. In a major report published in 2022, the Intergovernmental Panel on Climate Change (IPCC) estimated that the world could cross the 1.5-degree threshold “in the early 2030s.”

Where the study departs from many current projections is in its estimates of when the world will cross the 2-degree threshold.

While the IPCC projects that in a low emissions scenario, global temperature rises are unlikely to hit 2 degrees by the end of the century, the study returned more concerning results.

The artificial intelligence-based study found it unlikely that temperature increase could be held below 2 degrees Celsius, even with tough emissions cuts.

The AI predicted a probability of around 80% that 2°C warming will be reached before 2065, even if, over the next half century, the world reaches net-zero – where it removes at least as much planet-warming pollution from the atmosphere as it emits.

If emissions stay high, Diffenbaugh said, the AI predicted a 50% probability that 2 degrees will be reached before 2050 (2043).


While many net zero decarbonization pledges and targets have been framed around holding global warming to 1.5 degrees, he added: “The AI predictions in our study suggest that those may be necessary to avoid 2 degrees.”



Half a degree of heating may not seem like a lot, but the increased impacts are exponential, intensifying a broad scale of consequences for ecosystems around the world, and the people, plants and animals that depend on them. Just a fraction of a degree of warming would increase tenfold the number of summers in which the Arctic would be ice-free, according to the IPCC. The difference between 1.5°C and 2°C also results in twice the amount of lost habitat for plants and three times the amount for insects.

Scorching heatwaves will become more severe and more common, occurring 5.6 times more often at the 2°C benchmark, according to the IPCC, with roughly 1bn people facing a greater potential of fatal combinations of humidity and heat. Communities around the world will have to come to grips with more weather whiplash that flips furiously between extremes.

Diffenbaugh, N. et.al. Data-driven predictions of the time remaining until critical global warming thresholds are reached, PNAS, (2023)
https://www.pnas.org/doi/10.1073/pnas.2207183120

---------------------------------------------



« Last Edit: January 31, 2023, 03:05:27 PM by vox_mundi »
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

Reginald

  • New ice
  • Posts: 43
    • View Profile
  • Liked: 24
  • Likes Given: 1
Re: Validation of GCM Models
« Reply #123 on: February 01, 2023, 03:38:33 PM »
...but we already knew that:

Why Global Warming Will Cross a Dangerous Threshold in 2036. Scientific American, By Michael Mann, April 1, 2014

https://www.scientificamerican.com/article/mann-why-global-warming-will-cross-a-dangerous-threshold-in-2036/

If the world continues to burn fossil fuels at the current rate, global warming will rise 2 degrees Celsius by 2036, crossing a threshold that many scientists think will hurt all aspects of human civilization: food, water, health, energy, economy and national security. In my article "False Hope" in the April 2014 Scientific American, I reveal dramatic curves that show why the world will reach this temperature limit so quickly, and also why the recent slowdown in the rate of temperature increase, if it continues, will only buy us another 10 years.


vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #124 on: February 01, 2023, 04:19:30 PM »
Hence, option B ...

There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

vox_mundi

  • Multi-year ice
  • Posts: 10466
    • View Profile
  • Liked: 3536
  • Likes Given: 762
Re: Validation of GCM Models
« Reply #125 on: February 14, 2023, 05:41:57 PM »
Acceleration of Global Sea Level Rise Imminent Past 1.8°C Planetary Warming
https://phys.org/news/2023-02-global-sea-imminent-18c-planetary.html

A study published in Nature Communications by an international team of scientists shows that an irreversible loss of the West Antarctic and Greenland ice sheets, and a corresponding rapid acceleration of sea level rise, may be imminent if global temperature change cannot be stabilized below 1.8°C, relative to the preindustrial levels.

Using a new computer model, which captures for the first time the coupling between ice sheets, icebergs, ocean and atmosphere, the team of climate researchers found that an ice sheet/sea level run-away effect can be prevented only if the world reaches net zero carbon emissions before 2060. (... unfortunately, we'll hit 2.0°C before 2050)



"If we miss this emission goal, the ice sheets will disintegrate and melt at an accelerated pace, according to our calculations. If we don't take any action, retreating ice sheets would continue to increase sea level by at least 100 cm within the next 130 years. This would be on top of other contributions, such as the thermal expansion of ocean water," says Prof. Axel Timmermann, co-author of the study and Director of the IBS Center for Climate Physics.

Ice sheets respond to atmospheric and oceanic warming in delayed and often unpredictable ways. Previously, scientists have highlighted the importance of subsurface ocean melting as a key process, which can trigger runaway effects in the major marine based ice sheets in Antarctica.

"However, according to our supercomputer simulations, the effectiveness of these processes may have been overestimated in recent studies," says Prof. June Yi Lee from the IBS Center for Climate Physics and Pusan National University and co-author of the study. "We see that sea ice and atmospheric circulation changes around Antarctica also play a crucial role in controlling the amount of ice sheet melting with repercussions for global sea level projections," she adds.



Jun-Young Park et al., Future sea-level projections with a coupled atmosphere-ocean-ice-sheet model, Nature Communications (2023)
https://www.nature.com/articles/s41467-023-36051-9
There are 3 classes of people: those who see. Those who see when they are shown. Those who do not see

Insensible before the wave so soon released by callous fate. Affected most, they understand the least, and understanding, when it comes, invariably arrives too late

Fiat iustitia, et pereat mundus

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #126 on: April 26, 2023, 07:19:29 PM »
Massive iceberg discharges during the last ice age had no impact on nearby Greenland, raising new questions about climate dynamics


During the last ice age, massive icebergs periodically broke off from an ice sheet covering a large swath of North America and discharged rapidly melting ice into the North Atlantic Ocean around Greenland, triggering abrupt climate change impacts across the globe.

These sudden episodes, called Heinrich Events, occurred between 16,000 and 60,000 years ago. They altered the circulation of the world's oceans, spurring cooling in the North Atlantic and impacting monsoon rainfall around the world.

But little was known about the events' effect on nearby Greenland, which is thought to be very sensitive to events in the North Atlantic. A new study from Oregon State University researchers, just published in the journal Nature, provides a definitive answer.

"It turns out, nothing happened in Greenland. The temperature just stayed the same," said the study's lead author, Kaden Martin, a fourth-year doctoral candidate in OSU's College of Earth, Ocean, and Atmospheric Sciences. "They had front-row seats to this action but didn't see the show."

Instead, the researchers found that these Heinrich events caused rapid warming in Antarctica, at the other end of the globe.

The researchers anticipated Greenland, in close proximity to the ice sheet, would have experienced some kind of cooling. To find that these Heinrich Events had no discernible impact on temperatures in Greenland is surprising and could have repercussions for scientists' understanding of past climate dynamics, said study co-author Christo Buizert, an assistant professor in the College of Earth, Ocean, and Atmospheric Sciences.

"If anything, our findings raise more questions than answers," said Buizert, a climate change specialist who uses ice cores from Greenland and Antarctica to reconstruct and understand the Earth's climate history. "This really changes how we look at these massive events in the North Atlantic. It's puzzling that far-flung Antarctica responds more strongly than nearby Greenland."

Scientists drill and preserve ice cores to study past climate history through analysis of the dust and tiny air bubbles that have been trapped in the ice over time. Ice cores from Greenland and Antarctica provide important records of Earth's atmospheric changes over hundreds of thousands of years.

Records from ice cores from those regions have served as pillars for scientists' understanding of past climate events, with ice collected from both locations often telling similar stories, Martin said.

The impact of Heinrich Events on Greenland and Antarctica was not well understood, spurring Martin and Buizert to try to find out more about what was happening in those parts of the world.

The core used for the latest study was collected in 1992 from the highest point of Greenland, where the ice sheet is around 2 miles thick. Since then, the core has been in storage in the National Science Foundation Ice Core Facility in Denver.

Advancement in scientific tools and measurements over the last few decades gave Martin, Buizert and their colleagues the opportunity to re-examine the core using new methods.

The analysis shows that no changes in temperatures occurred in Greenland during Heinrich Events. But it also provides a very clear connection between Heinrich Events and the Antarctic response.

"When these big iceberg discharges happen in the Arctic, we now know that Antarctica responds right away," Buizert said. "What happens in one part of the world has an effect on the rest of the world. This inter-hemispheric connection is likely caused by change in global wind patterns."

The finding challenges the current understanding of global climate dynamics during these massive events and raises new questions for researchers, Buizert said. The researchers' next step is to take the new information and run it through climate models to see if the models can replicate what occurred.

"There has to be a story that fits all of the evidence, something that connects all the dots," he said. "Our discovery adds two new dots; it's not the full story, and it may not be the main story. It is possible that the Pacific Ocean plays an important role that we haven't figured out yet."

The ultimate goal is to better understand how the climate system is connected and how the components all interact, the researchers said.

"While Heinrich Events are not going to happen in the future, abrupt changes in the globally interconnected climate system will happen again," Martin said. "Understanding the global dynamics of the climate system can help us better project future impacts and inform how we respond and adapt."

...

https://www.sciencedaily.com/releases/2023/04/230424133551.htm
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #127 on: June 17, 2023, 05:22:18 PM »
Study examines Southern Ocean warming and its climatic impacts


The mid-to-high latitude Southern Ocean (30°S southwards) features prevailing westerly winds, the strongest mean sea-surface winds on Earth, which draw up ocean water from below 2–3 km in a wide circumpolar ring. This circulation system exerts a huge influence on climate under greenhouse warming, because the upwelled water was last in contact with the atmosphere hundreds of years earlier and once brought to the surface, absorbs a vast amount of anthropogenic heat and carbon from the atmosphere.

However, based on the latest climate models, even under the same emission scenario, inter-model differences in the simulated amount of heat absorbed by the Southern Ocean are large. "The large spread is a concern, and has huge implications, including for melt of Antarctic sea ice, ice sheets and ice shelves, the radiative budget of the climate system, hemispheric rainfall distribution, and global sea level rise," says Dr. Cai, first author of the study published in Science Bulletin.

Dr. Cai and his team sought to determine what causes the large inter-model differences by synthesizing recent advances and by examining available outputs from the latest models participating in Phase 6 of the Coupled Model Intercomparison Project (CMIP6). The team surveyed a large body of literature and performed extensive analysis of data from some 30 participating models.

The team found that the large inter-model spread is not simply due to differences in climate sensitivity, which measures the amount of radiative heating required to raise Earth's surface temperature by 1.0°C. Previous studies suggest sequestration of heat by the Southern Ocean is mainly through the mean circulation that facilitates uptake of additional heating, and that circulation changes caused by greenhouse warming play a relatively small role.

The new analysis instead shows that circulation changes contribute much of the inter-model difference beyond what is attributable to climate sensitivity. For example, under greenhouse warming the prevailing Southern Ocean westerlies intensify toward the Antarctic, but the wind changes are vastly different across models.

The wind intensification induces changes in the intensity and distribution of upwelling, with serious consequences. An increase in upwelling accelerates melt of Antarctic ice sheets and ice shelves, and the associated meltwater flux into the ocean leads to a more stratified upper ocean, which in turn slows heat and carbon uptake by the Southern Ocean. "There are many such complex interactions at work, some of which are poorly understood and not represented in models, contributing to the large uncertainty," Cai says.
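
As a rough sketch of the kind of attribution being described here, one can regress an ensemble's Southern Ocean heat uptake on each model's climate sensitivity and look at how much spread is left over; every number and variable name below is invented for illustration, only the bookkeeping is real.

Code:
import numpy as np

rng = np.random.default_rng(0)
n_models = 30
ecs = rng.uniform(2.0, 5.5, n_models)                 # hypothetical climate sensitivities (K)
circulation = rng.normal(0.0, 1.0, n_models)          # hypothetical circulation-change index
sohu = 0.3 * ecs + 0.5 * circulation + rng.normal(0, 0.1, n_models)   # fake Southern Ocean heat uptake

slope, intercept = np.polyfit(ecs, sohu, 1)           # linear fit of heat uptake on sensitivity alone
residual = sohu - (slope * ecs + intercept)
explained = 1.0 - residual.var() / sohu.var()
print(f"spread explained by climate sensitivity alone: {explained:.2f}")
print(f"spread left over for circulation changes etc.: {1 - explained:.2f}")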

...

https://phys.org/news/2023-06-southern-ocean-climatic-impacts.html
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #128 on: June 24, 2023, 09:34:55 AM »
Is climate change outpacing our ability to predict extreme heatwaves?

...

But every so often, an event is so extreme it causes scientists to question our understanding of just how fast climate change is progressing. One such event was the heatwave across the Pacific Northwest region of the United States and Canada in the northern summer of 2021, when temperatures at some locations hit 49℃ (121℉) – hotter than the all-time record for Texas.

It broke heat records by such a wide margin that scientists were quoted in the media saying they hadn’t expected to see temperatures so high in the Pacific Northwest until much later this century.

The basic concern for these scientists was that our computer climate models are best at simulating things that span large areas and long time periods, such as the annual average global temperature (what we broadly mean when we say “the climate”). They aren’t as good at simulating smaller-scale things such as an individual storm or hot wind (that is, “the weather”).

It’s not that our models can’t simulate small-scale weather – they’re basically the same models we use for weather forecasting – it’s just very computationally expensive to have them zoom in and run in “weather mode” to get a highly detailed simulation. It’s feasible for a seven-day weather forecast, but not for a century-long climate simulation.

Given this limitation, the scientists quoted in the media were concerned extreme weather events might be more sensitive to climate change than our models suggest.

Quantity matters too
While these concerns around the quality of our model simulations at weather-relevant scales are valid, what’s often overlooked is the quantity of model simulations involved. Given the natural variability in the climate system, scientists prefer not to rely on just one model simulation when making climate projections. Instead, they run a range of century-long simulations – from just a handful up to 50 or more for the most well-resourced modelling groups – and look at the range of possible outcomes.

For climate metrics such as the annual average global temperature, that’s enough simulations to capture the full range of possibilities. It’s a value that doesn’t vary much from year to year because it’s an average over the entire globe, so the climate change signal dominates over natural variability. To use a slightly more technical term, we say it has a high “signal-to-noise” ratio.

In contrast, the weather can vary greatly over relatively short time frames, and therefore has a very low climate signal-to-noise ratio. Something like the hottest day of the year at a given location is especially noisy, because small variations in the alignment of weather patterns can make all the difference between a regular hot day and a record-shattering one.

In this situation, many more simulations would be required to reliably estimate the upper limit on what extreme temperatures are possible.
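
As a toy illustration of that signal-to-noise point (entirely synthetic numbers, not observations or model output), the same warming trend stands out clearly in a smooth global-mean series but is swamped by noise in a single station's hottest day of the year:

Code:
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(50)
trend = 0.02 * years                                        # same 0.02 K/yr warming signal in both series

global_mean = trend + rng.normal(0, 0.1, years.size)        # global mean: small internal variability
local_hottest_day = trend + rng.normal(0, 2.0, years.size)  # one station's annual maximum: very noisy

for name, series in [("global mean", global_mean), ("local hottest day", local_hottest_day)]:
    fit = np.polyval(np.polyfit(years, series, 1), years)   # fitted linear trend
    signal = fit[-1] - fit[0]                               # trend change over the whole period
    noise = (series - fit).std()                            # residual year-to-year variability
    print(f"{name}: signal {signal:.2f} K, noise {noise:.2f} K, S/N ~ {signal / noise:.1f}")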

How many simulations are enough?
To try and understand how many model simulations would be needed, our recently published research used a climate model to simulate 45,000 years’ worth of daily weather at Seattle-Tacoma airport in the Pacific Northwest.

We then went through a process of picking out 1,000 random samples of 100 years of data from this population of 45,000 years, then 1,000 samples of 500 years, 1,000 years, 5,000 years, and so on. For each sample, we wrote down the maximum daily temperature we found (that is, the record temperature produced in each of these sample simulations).

To our surprise, as the samples got bigger, the record temperatures we found showed little evidence of stabilising. They just continued to grow, indicating even samples spanning several thousand years are insufficient to capture the full range of possible extreme temperatures.

The reason we kept finding hotter days as the sample size grew is that the larger samples included more weather patterns. This meant there was a greater chance of producing a unique pattern with the near-perfect alignment of weather systems to generate even more heat at our fixed location. It turns out the weather patterns that produce the most extreme heat are very unique – and indeed far rarer than we’d expected.
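
A toy version of this resampling exercise, using made-up daily temperatures rather than the study's actual 45,000 years of model output at Seattle-Tacoma, shows the same behaviour: the record found in a sample keeps climbing as the sample gets longer.

Code:
import numpy as np

rng = np.random.default_rng(2)
n_years = 45_000
# Fake summer daily maxima (deg C): Gaussian bulk plus a heavy-ish upper tail
daily_tmax = 28 + 4 * rng.standard_normal((n_years, 90)) + rng.gumbel(0, 1.5, (n_years, 90))
annual_max = daily_tmax.max(axis=1)                   # hottest day of each synthetic year

for sample_years in (100, 500, 1_000, 5_000, 20_000):
    # average record temperature over 200 random samples of this length
    records = [annual_max[rng.choice(n_years, sample_years, replace=False)].max()
               for _ in range(200)]
    print(f"{sample_years:6d}-year samples: mean record {np.mean(records):.1f} C")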

Luck of the draw
From this perspective, the record-shattering heat experienced in the Pacific Northwest in 2021 was due not just to the overall trend of global heating, but also to the random shuffling of the weather. And our research suggests the latter factor plays an even larger role in this type of event than many climatologists had suspected.

This means that even though the Pacific Northwest heatwave broke records by such a wide margin, that is not necessarily a sign climate change is happening faster than expected, or that our models are doing a bad job of simulating how climate change increases the likelihood of extreme heatwaves.

It could simply be that our sample sizes are too small. If we had run more model simulations we could have simulated the right chance alignment of weather to generate a record-shattering day, meaning this real-life heatwave wouldn’t then have outstripped climatologists’ predictions to such an extent.

Advances in supercomputers have traditionally been used to run climate models at higher resolution (that is, to zoom in and get closer to “weather mode”). But when it comes to predicting just how extreme the weather can get in a warming world, we might get more bang for our buck by using those advances to run many more simulations as well. That will show us what kind of extreme heat is possible as a rare event now, and what will be more commonplace in the coming decades.

https://theconversation.com/is-climate-change-outpacing-our-ability-to-predict-extreme-heatwaves-207925
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #129 on: August 13, 2023, 09:19:56 AM »
@  kassy  .... so the models are underestimating reality ?

They are incomplete, so they give a result based on the processes we do model, but how accurate that is depends on how complete the model is. For glaciers, a melt prediction based only on atmospheric and ocean temperatures will underestimate what actually happens because:
It does not include the effect of algae growth, nor the effects of dust or smoke particles, and some of the newer internal ice-sheet dynamics are also not factored in.
 
For sea level rise, all kinds of effects in Antarctica are not factored in, etc.

On the large scale we can't model clouds at the usual resolution.

In the ASLR thread this was a recurring theme.
https://forum.arctic-sea-ice.net/index.php/topic,2205.0.html

The poster ASLR produced some threads from there:
https://forum.arctic-sea-ice.net/index.php/topic,3563.0.html

https://forum.arctic-sea-ice.net/index.php/topic,3559.0.html

https://forum.arctic-sea-ice.net/index.php/topic,3558.0.html

The first two are most important. The second post in those threads has an index.

One thing that is also missing from the GCM discussion is the fact that we more or less assume things will go on as they did, while that is obviously not true. We should really work out what happens if you run the world at 1.5°C for a decade, but so far we have not.

And even if we did, we would still be missing the correct cloud input, and that is rather important because it can make a huge difference (see the discussion in the Ocean temps thread).

Models are also used in paleoclimate work, and in that context the extreme situations always come with very high CO2 numbers. Since there is now a method to get accurate data from fossil stomata, we should be able to check some of those. In the one example we have, the actual CO2 value was about a third below the modelled one, which implies that the Earth system reacts more strongly to CO2 than our incomplete models suggest. This probably also holds for the paleo work on the disappearance of certain cloud types: apply a similar correction and the roughly 1200 ppm CO2 at which they completely disappeared becomes about 800 ppm. It is impossible to work out from the historic data when the changes began, and we can't model it at that resolution.
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

kiwichick16

  • Nilas ice
  • Posts: 1039
    • View Profile
  • Liked: 99
  • Likes Given: 41
Re: Validation of GCM Models
« Reply #130 on: August 14, 2023, 08:34:15 AM »
@  kassy   thanks for that info........ it doesn't make me any happier .... but does confirm what i had been thinking

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #131 on: September 19, 2023, 01:48:27 PM »
How Trees Influence Cloud Formation

As part of the international CLOUD project at the nuclear research centre CERN, researchers at PSI have identified so-called sesquiterpenes – gaseous hydrocarbons that are released by plants – as being a major factor in cloud formation. This finding could reduce uncertainties in climate models and help make more accurate predictions. The study has now been published in the journal Science Advances.

...

To form the droplets that make up clouds, water vapour needs condensation nuclei, solid or liquid particles on which to condense. These are provided by a wide variety of aerosols, tiny solid or liquid particles between 0.1 and 10 micrometres in diameter, which are produced and released into the air both by nature and by human activity. These particles can include salt from the sea, sand from the desert, pollutants from industry and traffic, or soot particles from fires, for example. However, about half the condensation nuclei are actually formed in the air when different gaseous molecules combine and turn into solids, a phenomenon that experts call “nucleation” or “new particle formation” (NPF). To begin with, such particles are tiny, barely larger than a few nanometres, but over time they can grow through the condensation of gaseous molecules and then serve as condensation nuclei.

...

The main anthropogenic gas that contributes to the formation of particles is sulphur dioxide in the form of sulphuric acid, mainly from burning coal and oil. The most important natural gases involved are the so-called isoprenes, monoterpenes and sesquiterpenes. These are hydrocarbons that are mainly released by vegetation. They are key components of the essential oils that we smell when, for example, grass is cut or we go for a walk in the woods. When these substances oxidise in the air, i.e. react with ozone, they form aerosols.

“It should be noted that the concentration of sulphur dioxide in the air has decreased significantly in recent years due to stricter environmental legislation and it will continue to decrease,” says Lubna Dada, an atmospheric scientist at PSI. “The concentration of terpenes, on the other hand, is increasing because plants release more of them when they experience stress – for example when there is an increase in temperatures and extreme weather conditions and vegetation is more frequently exposed to droughts.” The big question for improving climate predictions is therefore which of the factors will predominate, leading to an increase or a decrease in cloud formation. To answer this, one would need to know how each of these substances contributes to the formation of new particles. A great deal is already known about sulphuric acid, and the role of monoterpenes and isoprene is now also understood better thanks to measurements in the field and chamber experiments like CLOUD, in which PSI has been involved.

Sesquiterpenes are rare but effective
Until now, sesquiterpenes have not been a focus of research. “This is because they are quite difficult to measure,” explains Dada. “Firstly because they react very quickly with ozone, and secondly because they occur much less frequently than the other substances.” Around 465 million tonnes of isoprene and 91 million tonnes of monoterpenes are released every year, whereas sesquiterpenes account for just 24 million tonnes. Nevertheless, the new study, of which Dada is the lead author, has shown that these compounds play an important role in cloud formation. According to the measurements, they form ten times more particles than the other two organic substances at the same concentration.

To determine this, Dada and her coauthors used the unique CLOUD chamber at the European Organisation for Nuclear Research, CERN. The chamber is a sealed room in which different atmospheric conditions can be simulated. “At almost 30 cubic metres, this climate chamber is the purest of its kind worldwide,” says Dada. “So pure that it allows us to study sesquiterpenes even at the low concentrations recorded in the atmosphere.”

This was precisely what the study set out to do. It was designed to simulate biogenic particle formation in the atmosphere. More specifically, researchers were interested in studying pre-industrial times, when there were no anthropogenic sulphur dioxide emissions. This allows the effect of human activities to be determined more clearly and projected into the future. However, anthropogenic sulphur dioxide has long since become ubiquitous in nature. This is another reason why only the CLOUD chamber was viable. It also allows a pre-industrial mixture to be produced under controlled conditions.

The experiments revealed that the oxidation of a natural mixture of isoprene, monoterpenes and sesquiterpenes in pure air produces a large variety of organic compounds – so-called ULVOCs (Ultra-Low-Volatility Organic Compounds). As the name suggests, these are not very volatile and therefore form particles very efficiently, which can grow over time to become condensation nuclei. The enormous effect of sesquiterpenes was revealed when the researchers added sesquiterpenes to a chamber mixture containing only isoprene and monoterpenes. Even adding just two percent doubled the rate of new particle formation. "This can be explained by the fact that a sesquiterpene molecule consists of 15 carbon atoms, while monoterpenes consist of only ten and isoprenes only five," says Dada.

On the one hand, the study reveals another means by which vegetation can influence weather and climate. Above all, however, the research results suggest that sesquiterpenes should be included as a separate factor in future climate models, alongside isoprene and monoterpenes, to make their predictions more accurate. This is particularly true in light of the decrease in atmospheric sulphur dioxide concentrations and the simultaneous increase in biogenic emissions as a result of climate stress, meaning that the latter are likely to become increasingly important for our future climate. However, other studies are also needed to further improve cloud formation predictions. These are already being planned at the Laboratory for Atmospheric Chemistry. "Next," says Imad El Haddad, Group Leader for Atmospheric Molecular Processes, "we and our CLOUD partners want to investigate what exactly happened during industrialisation, when the natural atmosphere became increasingly mixed with anthropogenic gases such as sulphur dioxide, ammonia and other anthropogenic organic compounds."

https://www.eurasiareview.com/10092023-how-trees-influence-cloud-formation/
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

Bruce Steele

  • Young ice
  • Posts: 2556
    • View Profile
  • Liked: 773
  • Likes Given: 42
Re: Validation of GCM Models
« Reply #132 on: September 19, 2023, 06:32:30 PM »
Re. Sesquiterpenes in cannabis.
https://www.researchgate.net/figure/Monoterpenes-and-Sesquiterpenes-Contents-of-Cannabis-Volatile-Oils_tbl1_341504561

I have 46 acres of Cannabis in bloom next door and the amount of terpenes and sesquiterpenes in the air is substantial. If ever there was a plant to selectively breed for sesquiterpene production, cannabis is the plant to manipulate. There is also a requirement for cannabis producers to pump more terpenes into the air to mask the strong odors of weed in bloom. It doesn't work.
 Is there interest in sesquiterpenes as a geoengineering tool?

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #133 on: September 19, 2023, 10:19:29 PM »
Not yet, but I can sort of see the white paper. In general you get more when growing plants, and that was the plan anyway. We can also wonder about the other part of the equation: some level will be missing in Canada compared to the baseline because so many trees burned there.
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

Bruce Steele

  • Young ice
  • Posts: 2556
    • View Profile
  • Liked: 773
  • Likes Given: 42
Re: Validation of GCM Models
« Reply #134 on: September 22, 2023, 01:30:08 AM »
Kassy, the 46 acres of cannabis next door yields 14,000 lbs of oil. If that oil contained 50%
sesquiterpenes, what would the effect on cloud formation be if the volatile oils were released at very high altitudes? From my very meager understanding, the sesquiterpenes would bind with ozone and the resulting condensates would form nuclei of water droplets and clouds. High-altitude clouds cool the planet, so potentially sesquiterpenes might both consume ozone and also contribute to cloud formation and, potentially, albedo effects.
 Outside the box

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #135 on: September 22, 2023, 06:16:37 PM »
The problem with that is that it would be a rather long term program.

They waft into the air from plants and then do their work, so I wonder if at some point the effect could be detected from all the areas now suddenly growing cannabis. Some ends up in the oil, but not all of it?

My interest in the CLOUD project is more in their future steps:

“Next,” says Imad El Haddad, Group Leader for Atmospheric Molecular Processes, “we and our CLOUD partners want to investigate what exactly happened during industrialisation, when the natural atmosphere became increasingly mixed with anthropogenic gases such as sulphur dioxide, ammonia and other anthropogenic organic compounds.”

All our recent historical data have industrialisation data in them. When we have a much better idea of the interactions between these gases, we can much better calculate what actually happened. That will take time too, and we do have hints that there was less margin in the system than we banked on.
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

morganism

  • Nilas ice
  • Posts: 2031
    • View Profile
  • Liked: 235
  • Likes Given: 143
Re: Validation of GCM Models
« Reply #136 on: September 22, 2023, 08:21:12 PM »
There were a couple of studies done on kelp formations wafting aerosols for rain droplet formation. I think I posted one in the aerosol thread a couple of years ago. Could give you a baseline for uptake in atmo....

Bruce Steele

  • Young ice
  • Posts: 2556
    • View Profile
  • Liked: 773
  • Likes Given: 42
Re: Validation of GCM Models
« Reply #137 on: September 22, 2023, 10:35:55 PM »
“Sesquiterpenes are rare but effective
Until now, sesquiterpenes have not been a focus of research. “This is because they are quite difficult to measure,” explains Dada. “Firstly because they react very quickly with ozone, and secondly because they occur much less frequently than the other substances.” Around 465 million tonnes of isoprene and 91 million tonnes of monoterpenes are released every year, whereas sesquiterpenes account for just 24 million tonnes. Nevertheless, the new study, of which Dada is the lead author, has shown that these compounds play an important role in cloud formation. According to the measurements, they form ten times more particles than the other two organic substances at the same concentration.”

So an important part of how plants contribute to cloud formation has been discovered, because they had an instrument capable of reproducing the low levels of sesquiterpenes measured in the atmosphere and could measure the resulting particle production. It also says the sesquiterpenes are highly reactive with ozone, so transporting them to high altitudes artificially might result in much higher concentrations than the sesquiterpenes in aerosols transported there by natural processes, because the latter would have been reacting with ozone at low altitudes first.
 It reminds me of when people came to grips with acidification and how much difference it made to have an instrument that could measure the saturation state of seawater in real time so aquaculture could respond.
 If we are going to grow trees and plants to modify the carbon cycle, it is paramount that we understand the existing system first. It may be possible to modify just one part of the natural carbon cycle to great effect. Twenty-four million tonnes (error: should be 24 billion tonnes; fixed Sept 23) seems like a modest number to conjure up compared to the scale required for other geoengineering ideas like alkalinity modification of seawater, an idea I like, though the scale of material necessary boggles the mind. In comparison, a few million tonnes of sesquiterpenes seems like nothing.
 Laughing at myself, as though anything on the scale of geoengineering is trivial.

 
« Last Edit: September 23, 2023, 10:16:32 PM by Bruce Steele »

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #138 on: September 23, 2023, 05:03:37 PM »
Good point. Hadn't considered it that way.


There were a couple studies done on kelp formations wafting aerosols for rain droplet formation. I think i posted one in the aerosol thread a couple years ago. Could give you a baseline for uptake in atmo....

Thanks for the heads up. I will dig around for them later!
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

Bruce Steele

  • Young ice
  • Posts: 2556
    • View Profile
  • Liked: 773
  • Likes Given: 42
Re: Validation of GCM Models
« Reply #139 on: September 23, 2023, 05:51:44 PM »
https://www.science.org/doi/10.1126/sciadv.adi5297
 “Here, we also observe that, for the same OOM concentration, the addition of β-caryophyllene in the mixture of α-pinene + isoprene, even at parts per trillion by volume (pptv) levels, enhances J1.7 to the same level as of pure α-pinene, overcoming the isoprene suppression effect (25). “
Open access: “Role of sesquiterpenes in biogenic new particle formation”, Dada et al., 8 September 2023
« Last Edit: September 23, 2023, 06:08:34 PM by Bruce Steele »

Bruce Steele

  • Young ice
  • Posts: 2556
    • View Profile
  • Liked: 773
  • Likes Given: 42
Re: Validation of GCM Models
« Reply #140 on: September 24, 2023, 05:45:49 PM »
“On a global scale, emissions of sesquiterpenes are estimated to be 24 Tg C year−1 compared with 91 Tg C year−1 for monoterpenes and 465 Tg C year−1 for isoprene”
Quote from the Dada paper.

“ Around 465 million tonnes of isoprene and 91 million tonnes of monoterpenes are released every year, whereas sesquiterpenes account for just 24 million tonnes. “
Quote from the original Eurasiareview post by Kassy above.
 So I got tripped up by the numbers before I read the paper. Anyhow, some science writer should review millions of tonnes, gigatonnes, and teragrams (Tg).
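
For what it is worth, a quick unit check (assuming metric tonnes, and noting that the Tg figures in the paper count carbon mass only):

Code:
# 1 Tg = 1e12 g = 1e6 metric tonnes, so 24 Tg C/yr is roughly 24 million tonnes of carbon per year
tg_c_per_year = {"isoprene": 465, "monoterpenes": 91, "sesquiterpenes": 24}   # Tg C/yr, from the Dada et al. quote
for gas, tg in tg_c_per_year.items():
    print(f"{gas}: {tg} Tg C/yr = {tg * 1e6:,.0f} tonnes of carbon/yr ({tg} million tonnes)")
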
 Reading the Dada paper, it says that tropical forests, together with areas that advect surface air high into the atmosphere, are the largest contributors.
 Again, I am operating without enough knowledge, but I think there is a big gap between how many terpenes and biogenic volatiles are produced and what percentage reaches the high altitudes where clouds that can cool the planet are formed.
 If sesquiterpenes can be artificially released where they can best aid the high-altitude nucleation of water droplets, then the fact that they are effective at concentrations in the trillionths makes their use potentially viable. Maybe I am sounding like some wing nut and I should wait till someone with a better grip on geoengineering pipes in.
 Geoengineering is the final screwup if it isn't continued in perpetuity (or on very long timescales), but let's just imagine a crop that produces enough biogenic volatiles, light enough that something as simple as hot air balloons could be used to transport the volatiles (sesquiterpenes) to altitudes where they could help cool our planet.
« Last Edit: September 24, 2023, 06:25:05 PM by Bruce Steele »

kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #141 on: November 04, 2023, 01:46:06 PM »
Patterns of Surface Warming Matter for Climate Sensitivity


One of the grand challenges in climate science is to reduce uncertainty in estimates of climate sensitivity, which quantifies how much Earth’s surface warms in response to a doubling of carbon dioxide relative to preindustrial levels. This uncertainty is large because climate sensitivity aggregates myriad processes, from microscale aerosol-cloud interactions to planetary-scale atmospheric and ocean circulations, into one number. Clouds, which are notoriously difficult to measure and simulate, are the main driver of the uncertainty.

Various lines of evidence are used to estimate climate sensitivity, including climate model simulations of varying complexity, observations over the past century, proxies that measure climate change in the distant past, and theory. The likely range of estimates of climate sensitivity was stubbornly constant at a distressingly imprecise 1.5–4.5 K for several decades, but the research community’s efforts have recently chipped away at this range (Figure 1).

Early in the 2010s, a substantial discrepancy was noted between estimates of climate sensitivity derived from climate models and estimates based on the observed warming record and radiative balance, the balance between incoming and reflected solar radiation and outgoing terrestrial radiation. Estimates based on observed warming pointed to much lower values than those derived from models. A key breakthrough toward solving this conundrum has been the recognition of the pattern effect, the process whereby climate sensitivity depends on the geographic pattern of surface warming. This advance was rated as one of the most promising avenues for further constraining climate sensitivity in the future [Forster et al., 2021].

Forcing, Feedbacks, and Climate Sweet Spots
Adding greenhouse gases to Earth’s atmosphere leads to a global energy surplus (less terrestrial radiation escapes to space), referred to as forcing. To restore the energy balance, the planet must warm. But warming causes changes in the climate system: The concentration of water vapor—a greenhouse gas—in the atmosphere increases, the spatial coverage of highly reflective snow and sea ice decreases, and cloud properties change. These and other radiative feedbacks amplify or dampen how much the planet warms in response to the forcing. Hence, for a given forcing, the feedbacks determine the climate sensitivity.
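
To make that bookkeeping concrete: at equilibrium the forcing F is balanced by the radiative response, so the warming is roughly F divided by the net feedback parameter, and ECS is that warming for a CO2 doubling. The numbers below are rough, commonly quoted illustrative values, not taken from this article.

Code:
F_2x = 3.7                        # W/m2, approximate radiative forcing from doubling CO2 (illustrative)
for lam in (0.8, 1.2, 1.8):       # W/m2 per K, net feedback parameter (illustrative values)
    print(f"feedback {lam:.1f} W/m2/K -> equilibrium warming for 2xCO2 ~ {F_2x / lam:.1f} K")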

For decades, researchers assumed that global mean radiative feedbacks mostly depend on global mean temperature [Gregory et al., 2004]. However, they also depend on the spatial pattern of surface warming: Much like applying a force uniformly over someone’s entire body will elicit a very different reaction than tickling the soles of that person’s feet, a degree of global warming spread out evenly will cause a different radiative response than if that same warming were concentrated in a climate sweet spot (a location where surface warming produces efficient radiative damping).
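
A minimal sketch of the pattern effect itself: two warming patterns with the same global mean but different regional weighting produce different global radiative responses. The regions, area fractions and local feedback values below are invented purely for illustration.

Code:
import numpy as np

# Area fractions and local feedback strengths (W/m2 per K of local warming); all invented
area = np.array([0.1, 0.2, 0.7])         # west Pacific "sweet spot", east Pacific/Southern Ocean, rest of globe
lam_local = np.array([3.0, 1.0, 1.2])    # strong radiative damping in the sweet spot

patterns = {
    "uniform warming":          np.array([1.0, 1.0, 1.0]),
    "sweet-spot-heavy warming": np.array([3.0, 0.5, 0.857]),   # area-weighted mean is still ~1 K
}
for name, dT in patterns.items():
    response = np.sum(area * lam_local * dT)        # global-mean radiative response (W/m2)
    print(f"{name}: global-mean dT = {np.sum(area * dT):.2f} K, response = {response:.2f} W/m2")

The same global-mean warming damps the imbalance more strongly when it is concentrated in the sweet spot, which is equivalent to a lower effective sensitivity during that period.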

A wide variety of processes affect the evolution of surface temperature, from greenhouse gas forcing and regional aerosol forcing to natural oscillations involving the ocean and atmosphere to the continental boundary conditions and the extent of ice sheets and sea ice. The pattern of surface temperature change over the past 40 or so years featured a pronounced spatial structure, with some locations even cooling in spite of the global mean warming on the order of 1 K (Figure 2, bottom left).

Feedbacks involving clouds and the atmospheric temperature structure are most sensitive to spatial differences in warming. Deep convection in the warmest tropical regions readily communicates surface conditions upward throughout the troposphere (up to about 10–15 kilometers) and then horizontally across much of the globe, making the western Pacific a climate sweet spot. This warmer air sitting atop the relatively cool waters in the eastern Pacific or Southern Ocean acts to stabilize the lowermost troposphere, allowing more extensive low-lying stratus and stratocumulus clouds to develop. Because of their location and structure, these low clouds efficiently cool the planet and offset some of the initial warming (Figure 2, top left).

Historically, three strands of research have highlighted the dependence of radiative feedbacks on the surface warming pattern. The first strand came from analyses of climate feedbacks and sensitivity in model simulations of unequilibrated, transient climate change. If feedbacks were constant, the expected equilibrium temperature change (climate sensitivity) could be estimated using a very simple energy balance model that linearly extrapolates the relationship between global temperature change and radiative imbalance [e.g., Gregory et al., 2004]. When longer, fully equilibrated simulations became available, it became evident that the simple estimation methods assuming constant feedbacks systematically underestimate the actual equilibrium climate sensitivity. The reason for this underestimation is indeed the evolution of the surface warming pattern, which initially emphasizes more stabilizing radiative feedbacks but later, during equilibration, emphasizes less stabilizing radiative feedbacks [e.g., Senior and Mitchell, 2000; Rugenstein et al., 2020].
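
The "very simple energy balance model" referred to here is usually applied as a Gregory regression: fit a straight line to top-of-atmosphere imbalance versus global warming in an abrupt-CO2 run and read the equilibrium warming off the x-intercept. A toy version with synthetic data (not actual model output) looks like this:

Code:
import numpy as np

rng = np.random.default_rng(3)
F, lam = 7.4, 1.2                                    # abrupt-4xCO2 forcing (W/m2) and feedback (W/m2/K), illustrative
years = np.arange(1, 151)
dT = (F / lam) * (1 - np.exp(-years / 30))           # synthetic warming approaching equilibrium
N = F - lam * dT + rng.normal(0, 0.3, years.size)    # top-of-atmosphere imbalance with interannual noise

slope, intercept = np.polyfit(dT, N, 1)              # Gregory regression: N ~ intercept + slope*dT
print(f"inferred forcing ~ {intercept:.1f} W/m2, feedback ~ {-slope:.2f} W/m2/K")
print(f"extrapolated equilibrium warming (x-intercept) ~ {-intercept / slope:.1f} K")

In real simulations the points bend away from a single straight line as the warming pattern evolves, which is why this constant-feedback extrapolation underestimates the true equilibrium sensitivity.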

The second strand related the idea of constant feedbacks to the efforts of estimating equilibrium climate sensitivity from the historical record, as mentioned above. Feedbacks calculated from observations or from atmosphere-only model simulations forced with the observed surface warming pattern over the past couple of decades imply less warming than those from model simulations with a fully interactive ocean, which have the freedom to create their own surface warming patterns [e.g., Gregory et al., 2020].

The third strand of research came from oceanography, showing that the atmospheric cooling effect of ocean heat uptake differs depending on where it occurs: One unit of ocean heat uptake in high latitudes cools Earth more effectively than the same unit taken up by the low-latitude oceans. This difference is relevant because the largest heat uptake by the ocean occurs at higher latitudes. The effect, termed ocean heat uptake efficacy, turns out to be another manifestation of the dependence of radiative feedbacks on surface temperature patterns [Winton et al., 2010; Lin et al., 2021].

The three strands of research have converged over the past few years, highlighting that understanding the pattern effect benefits from—and perhaps requires—contributions from virtually all climate research communities studying large-scale ocean-atmosphere coupling and the dynamics that set regional to global responses to external forcing.

...

In addition to our growing knowledge of the pattern effect, we also have learned that radiative feedbacks depend on global mean temperature itself: Warming Earth by 1 K from the LGM emphasizes different feedbacks (e.g., the sea ice albedo feedback) than warming by 1 K from a Miocene hothouse world or warming from 4 to 5 K in a high-emission scenario in a century or two from today (e.g., the water vapor feedback [Bloch-Johnson et al., 2021]). The pattern effect and the feedback temperature dependence add uncertainties to estimates of climate sensitivity based on the paleorecord, but quantifying their effects would make these records more relevant to constraining climate sensitivity and expected future warming.

...

And a whole lot more.

https://eos.org/features/patterns-of-surface-warming-matter-for-climate-sensitivity

Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

SeanAU

  • Young ice
  • Posts: 2880
    • View Profile
    • Meta-Crisis
  • Liked: 118
  • Likes Given: 27
Re: Validation of GCM Models
« Reply #142 on: January 16, 2024, 09:45:52 AM »
Ripping the can open to see what's really inside there?

Workshop on Confronting Earth System Model Trends with Observations:
 The Good, the Bad, and the Ugly

For over 40 years, and through several rounds of IPCC reports, the climate science community has made projections of climate change under specific emissions scenarios. While assessments of the fidelity of Earth System Model simulations over the historical period have been performed for basic variables such as near-surface air temperature, internal variability and a relatively small signal in a short observational record have made a comprehensive assessment challenging.

Our observational record is increasing in length, the climate change signal is increasing in size, and with increased computing capacity, large ensembles of Earth System Model simulations are becoming more standard. While challenges still exist, we are now in a unique position to confront Earth System Models with the emerging signals of climate change over a much greater range of variables and Earth system components than has been previously done.

It is time for the research community to perform a broader stock-take of the modeled representation of historical trends than has been done previously and answer the following questions:

- Are the trends in observations and Earth System Models consistent?

- If inconsistencies exist, can we relate this to deficiencies in the representation of forced trends or of internal variability and understand the origins of the problem?

Confronting Earth System Model trends with observations across different variables is key to focusing the community on where inconsistencies occur, where surprises might lie, and where more work needs to be done.
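
One common way to make the first question operational is to ask where the observed trend sits within the ensemble's distribution of simulated trends; the sketch below uses invented trend values, not CMIP output.

Code:
import numpy as np

rng = np.random.default_rng(4)
ensemble_trends = rng.normal(0.20, 0.05, 100)     # simulated trends, K per decade (invented)
observed_trend = 0.18                             # observed trend, K per decade (invented)

rank = np.mean(ensemble_trends < observed_trend)  # percentile of the observation within the ensemble
print(f"observation sits at the {100 * rank:.0f}th percentile of the ensemble")
if rank < 0.05 or rank > 0.95:
    print("-> looks inconsistent: check the forced response and the simulated internal variability")
else:
    print("-> consistent at this crude level")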

https://usclivar.org/meetings/confronting-earth-system-model-trends

From: The Climate Variability and Predictability Program
It's wealth, constantly seeking more wealth, to better seek still more wealth. Building wealth off of destruction. That's what's consuming the world. And is driving humans crazy at the same time.

morganism

  • Nilas ice
  • Posts: 2031
    • View Profile
  • Liked: 235
  • Likes Given: 143
Re: Validation of GCM Models
« Reply #143 on: January 27, 2024, 11:54:45 PM »
 I used to not worry about climate change. Now I do.   ( i am a fan of Sabine H. Fab Phys)

In this video I explain what climate sensitivity is and why it is important. Climate sensitivity is a number that roughly speaking tells us how fast climate change will get worse. A few years ago, after various software improvements, a bunch of climate models began having a much higher climate sensitivity than previously. Climate scientists have come up with reasons for why to ignore this.
I think it's a bad idea to ignore this.

https://backreaction.blogspot.com/2024/01/i-used-to-not-worry-about-climate.html






(maybe tipping points Kassy? Thot it was modeling tho)


kassy

  • Moderator
  • First-year ice
  • Posts: 8588
    • View Profile
  • Liked: 2064
  • Likes Given: 2002
Re: Validation of GCM Models
« Reply #144 on: January 28, 2024, 09:36:04 PM »
Interesting and short discussion of the CMIP hot models. So it's fine here.
Þetta minnismerki er til vitnis um að við vitum hvað er að gerast og hvað þarf að gera. Aðeins þú veist hvort við gerðum eitthvað.

SeanAU

  • Young ice
  • Posts: 2880
    • View Profile
    • Meta-Crisis
  • Liked: 118
  • Likes Given: 27
Re: Validation of GCM Models
« Reply #145 on: January 30, 2024, 01:30:38 AM »
I post this with reservations. Not for sensitive souls.
But unfortunately this is the 'psychology' of what we are all dealing with when it comes to GCM models, CMIP, etc.

 

It's wealth, constantly seeking more wealth, to better seek still more wealth. Building wealth off of destruction. That's what's consuming the world. And is driving humans crazy at the same time.

kiwichick16

  • Nilas ice
  • Posts: 1039
    • View Profile
  • Liked: 99
  • Likes Given: 41
Re: Validation of GCM Models
« Reply #146 on: January 30, 2024, 07:00:27 AM »
10 out of 55 models    ......18 % 

Why are 18 % of the models being ignored ?

Easy to think of the strings being pulled in the background

Richard Rathbone

  • Nilas ice
  • Posts: 1765
    • View Profile
  • Liked: 390
  • Likes Given: 24
Re: Validation of GCM Models
« Reply #147 on: January 30, 2024, 12:48:26 PM »
10 out of 55 models    ......18 % 

Why are 18 % of the models being ignored ?

Easy to think of the strings being pulled in the background

Some of them don't match observations as well as others.

Z. Hausfather, K. Marvel, G.A. Schmidt, J.W. Nielsen-Gammon, and M. Zelinka, "Climate simulations: recognize the ‘hot model’ problem", Nature, vol. 605, pp. 26-29, 2022. http://dx.doi.org/10.1038/d41586-022-01192-2

There's a little bit of discussion on this in Gavin's latest Realclimate post, but if you want more detail on why and how the CMIP6 ensemble is constrained, this is the reference.

Sciguy

  • Nilas ice
  • Posts: 1976
    • View Profile
  • Liked: 239
  • Likes Given: 188
Re: Validation of GCM Models
« Reply #148 on: January 31, 2024, 04:11:18 AM »
It’s important to note that the 2023 temperatures are not outside of past climate model projections. A recent post on RealClimate shows that the model runs from CMIP3 through CMIP6, including the 37 “not hot” CMIP models with sensitivities within the accepted range of 2 K to 4 K, fit the 2023 global record temperature.

https://www.realclimate.org/index.php/archives/2024/01/not-just-another-dot-on-the-graph-part-ii/

Quote
Not just another dot on the graph? Part II
16 JAN 2024 BY GAVIN 129 COMMENTS

Annual updates to the model-observation comparisons for 2023 are now complete. The comparisons encompass surface air temperatures, mid-troposphere temperatures (global and tropical, and ‘corrected’), sea surface temperatures, and stratospheric temperatures. In almost every case, the addition of the 2023 numbers was in line with the long term expectation from the models.



Climate science relies on multiple lines of evidence.  This study from 2020 explains why the “hot models” are discounted in the IPCC reports.

  https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2019RG000678

Quote
An Assessment of Earth's Climate Sensitivity Using Multiple Lines of Evidence
S. C. Sherwood, M. J. Webb, J. D. Annan, K. C. Armour, P. M. Forster, J. C. Hargreaves, G. Hegerl, S. A. Klein, K. D. Marvel, E. J. Rohling, M. Watanabe, T. Andrews … See all authors
First published: 22 July 2020 https://doi.org/10.1029/2019RG000678

Quote
Abstract
We assess evidence relevant to Earth's equilibrium climate sensitivity per doubling of atmospheric CO2, characterized by an effective sensitivity S. This evidence includes feedback process understanding, the historical climate record, and the paleoclimate record. An S value lower than 2 K is difficult to reconcile with any of the three lines of evidence. The amount of cooling during the Last Glacial Maximum provides strong evidence against values of S greater than 4.5 K. Other lines of evidence in combination also show that this is relatively unlikely. We use a Bayesian approach to produce a probability density function (PDF) for S given all the evidence, including tests of robustness to difficult-to-quantify uncertainties and different priors. The 66% range is 2.6–3.9 K for our Baseline calculation and remains within 2.3–4.5 K under the robustness tests; corresponding 5–95% ranges are 2.3–4.7 K, bounded by 2.0–5.7 K (although such high-confidence ranges should be regarded more cautiously). This indicates a stronger constraint on S than reported in past assessments, by lifting the low end of the range. This narrowing occurs because the three lines of evidence agree and are judged to be largely independent and because of greater confidence in understanding feedback processes and in combining evidence. We identify promising avenues for further narrowing the range in S, in particular using comprehensive models and process understanding to address limitations in the traditional forcing-feedback paradigm for interpreting past changes.

While the one recent study about “hot models” being able to forecast recent weather better than the other CMIP6 models has gotten a lot of attention, other recent studies highlighting problems with the “hot models” are ignored on this site. Here’s one example:

https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2023GL105488

Quote
Connecting the SST Pattern Problem and the Hot Model Problem
Maria Rugenstein, Shreya Dhame, Dirk Olonscheck, Robert Jnglin Wills, Masahiro Watanabe, Richard Seager
First published: 22 November 2023 https://doi.org/10.1029/2023GL105488

Abstract
In the equatorial and subtropical east Pacific Ocean, strong ocean-atmosphere coupling results in large-amplitude interannual variability. Recent literature debates whether climate models reproduce observed short and long-term surface temperature trends in this region. We reconcile the debate by reevaluating a large range of trends in initial condition ensembles of 15 climate models. We confirm that models fail to reproduce long-term trends, but also find that many models do not reproduce the observed decadal-scale swings in the East to West gradient of the equatorial Pacific. Models with high climate sensitivity are less likely to reproduce observed decadal-scale swings than models with a modest climate sensitivity, possibly due to an incorrect balance of cloud feedbacks driven by changing inversion strength versus surface warming. Our findings suggest that two not well understood problems of the current generation of climate models are connected and we highlight the need to increase understanding of decadal-scale variability.





Sciguy

  • Nilas ice
  • Posts: 1976
    • View Profile
  • Liked: 239
  • Likes Given: 188
Re: Validation of GCM Models
« Reply #149 on: January 31, 2024, 04:20:12 AM »
One method of determining climate sensitivity is to use emergent constraints based on observations of past temperature variations. This article from October 2023 evaluates temperatures over the past 1,000 years to arrive at a likely sensitivity of 2.5 to 2.7 K, well within the accepted range of 2 to 4.5 K.

https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2023GL104126

Quote
Revisiting a Constraint on Equilibrium Climate Sensitivity From a Last Millennium Perspective
S. Cropper, C. W. Thackeray, J. Emile-Geay
First published: 26 October 2023 https://doi.org/10.1029/2023GL104126

Abstract
Despite decades of effort to constrain equilibrium climate sensitivity (ECS), current best estimates still exhibit a large spread. Past studies have sought to reduce ECS uncertainty through a variety of methods including emergent constraints. One example uses global temperature variability over the past century to constrain ECS. While this method shows promise, it has been criticized for its susceptibility to the influence of anthropogenic forcing and the limited length of the instrumental record used to compute temperature variability. Here, we investigate the emergent relationship between ECS and two metrics of global temperature variability using model simulations and paleoclimate reconstructions over the last millennium (850–1999). We find empirical evidence in support of these emergent relationships. Observational constraints suggest a central ECS estimate of 2.5–2.7 K, consistent with the Intergovernmental Panel on Climate Change's consensus estimate of 3K. Moreover, they suggest ECS “likely” ranges of 1.7–3.3 K and 1.9–3.5 K.
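
For anyone unfamiliar with the mechanics, a bare-bones emergent-constraint calculation looks like the sketch below: regress ECS on an observable metric across models, then feed in the observed value of that metric with its uncertainty. All numbers are invented for illustration and have nothing to do with the Cropper et al. data.

Code:
import numpy as np

rng = np.random.default_rng(5)
n_models = 25
ecs = rng.uniform(2.0, 5.0, n_models)                          # each model's ECS (K), invented
x_metric = 0.05 * ecs + rng.normal(0, 0.02, n_models)          # emergent relationship: metric ~ ECS

b1, b0 = np.polyfit(x_metric, ecs, 1)                          # fit ECS = b0 + b1 * metric across models
x_obs, x_obs_sigma = 0.14, 0.02                                # "observed" metric and its uncertainty, invented
samples = b0 + b1 * rng.normal(x_obs, x_obs_sigma, 10_000)     # propagate only the observational uncertainty
lo, mid, hi = np.percentile(samples, [17, 50, 83])
print(f"constrained ECS ~ {mid:.1f} K (likely range {lo:.1f}-{hi:.1f} K)")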