"Correct to say" is a fine term, and probabilities based on mathematics are accurate. If you flip a coin 10 times, there is better than a 99% probability that it will come up heads at least once. However, there is a finite possibility that it will not. The outcome in no way changes the odds, nor does it falsify them.

When talking about the weather or climate, the probabilities are a bit more nebulous. When a forecaster gives a percentage chance of rain, it is not based on strict mathematics but on past data. The less data available (or the more factors omitted), the less reliable the projections. Additionally, there are unknowns that may influence the system before the time arrives, changing the potential outcome.

With a truly random system and truly even odds, you need only flip the coin 7 times to have a >99% chance that at least one of those flips comes up heads. On average, in a long series of such trials, about 1 time in 128 the coin will come up tails all 7 times, with no heads at all.

With 10 coin flips the odds are ~99.9% (1 − 1/1024).
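The arithmetic above can be checked exactly. A minimal sketch (the function name is my own): the chance of at least one head is one minus the chance of all tails.

```python
from fractions import Fraction

def p_at_least_one_head(flips):
    """Exact probability of at least one head in `flips` fair-coin tosses:
    1 minus the probability that every toss is tails."""
    return 1 - Fraction(1, 2) ** flips

print(p_at_least_one_head(7))   # 127/128, about 99.22%
print(p_at_least_one_head(10))  # 1023/1024, about 99.90%
```

Using `Fraction` keeps the result exact, so the "1 in 128" figure for 7 flips falls straight out of the complement.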

However - and this is a huge however - it is unfortunately common for the odds not to apply at all in studies, or for the situation not to be balanced, fully known, and free of external influences.

Quite often systems may behave similarly to random processes, yet not be random at all. In those cases, statistical tests and bounds may be useful, but they may also be wrong. They can mislead us greatly.

Chaotic systems may for a time exhibit quasi-random, statistics-like behavior within certain bounds. Deviate slightly and the statistics may suddenly not work at all.

Some systems exhibit what appears to be statistical noise and random behavior, only to later exhibit wildly non-random behavior that far exceeds the statistical bounds. So-called "stiff" equations in chemical engineering behave this way. They appear stable for long periods before some small input accumulates and drives wild changes in the results.
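A small numerical sketch of stiffness (a standard textbook test problem, not any particular chemical system): the equation below has a solution that hugs cos(t), yet a perfectly reasonable-looking step size makes a simple solver explode, while a much smaller step behaves fine.

```python
import math

def euler(f, y0, t0, t1, steps):
    """Explicit (forward) Euler integration. Fine for gentle ODEs, but
    unstable on stiff ones unless the step size is very small."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# Classic stiff test problem: y' = -1000*(y - cos(t)), y(0) = 0.
# After a very fast initial transient, the true solution tracks cos(t).
f = lambda t, y: -1000.0 * (y - math.cos(t))

fine   = euler(f, 0.0, 0.0, 1.0, 100_000)  # h = 1e-5: stable, y(1) near cos(1)
coarse = euler(f, 0.0, 0.0, 1.0, 400)      # h = 2.5e-3: error grows 1.5x per step
```

With the coarse step, each step multiplies the error by |1 − 1000h| = 1.5, so the answer diverges to astronomical values even though the underlying system is entirely tame. Nothing "random" happened; the small term simply accumulated.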

Systems with unrecognized, hidden, or ignored state changes may also exhibit true or apparent random behavior about a mean. But changes in the underlying conditions, or slight shifts in the mean, can walk the system across a boundary and suddenly change its behavior completely, outside any statistical analysis of its previous behavior. These are actually quite common. Traffic jams are one such example. With minor changes in traffic volume, a sudden state change can occur: traffic that previously flowed in ways that could be modeled like a gas now behaves like a liquid, or even a highly viscous liquid, or a true solid.
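The traffic state change can be demonstrated with the classic Rule-184 traffic cellular automaton (my choice of toy model; real traffic is far richer). Below a critical density every car moves every step; above it, jams form and the flow is capped by the number of gaps, not the number of cars.

```python
def step(road):
    """One synchronous Rule-184 update on a circular road:
    a car (1) advances into the cell ahead only if that cell is empty (0)."""
    n = len(road)
    new = road[:]
    for i in range(n):
        if road[i] == 1 and road[(i + 1) % n] == 0:
            new[i] = 0
            new[(i + 1) % n] = 1
    return new

def flow(road):
    """Number of cars able to move this step (the traffic 'current')."""
    n = len(road)
    return sum(1 for i in range(n) if road[i] == 1 and road[(i + 1) % n] == 0)

def settle(road, steps=100):
    """Run the automaton long enough to pass its initial transient."""
    for _ in range(steps):
        road = step(road)
    return road

light = settle([1] * 6 + [0] * 14)   # density 0.3: free flow, all 6 cars move
heavy = settle([1] * 14 + [0] * 6)   # density 0.7: jammed, only 6 can move
```

In the light case the flow equals the car count (gas-like free flow); in the heavy case 14 cars produce exactly the same flow of 6, because movement is limited by the empty cells. A small change in volume across the critical density flips the system from one regime to the other.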

Many of the parameters that we routinely use also behave strangely in conditions away from the usual. It is common in modeling to treat collections of discrete objects as a uniform flow field of a quasi-fluid. Tensor analysis often falls victim to this flaw. The usual examples are far from the everyday world, but they are informative.

For example, gases behave like a quasi-fluid flow field and are easily modeled as such, and they even exhibit bulk properties like temperature (a measure of the kinetic energy of the particles in the fluid). But when the flow becomes rarefied, the energy transfer between the discrete particles breaks down. When that happens, simple ideas like temperature are revealed not to be static values, nor vectors, nor even tensor gradients. The temperature of a gas under near-vacuum conditions is one such example: it becomes highly directional, often with little or no correlation at large angles to the main flow direction.

The assumptions and presumptions that go into building an assessment or model are critically important for understanding the limitations of the results. Far too often, in my experience, researchers leave these unstated - and often unknown.

These ideas are often forgotten, neglected, ignored, or dismissed in examining real-world systems. And that error can come at our great peril.

To quote Monty Python, "Nobody expects the Spanish Inquisition!".

With the ice loss in the Arctic, there are a large number of unknown factors and variables. The changes from year to year exhibit quasi-random behavior around a moving mean. But that is a fiction. The reality is that it is a mostly deterministic system with random variation in many parameters, and chaotic inputs of many types. These all interact to exhibit the behavior we see and that we interpret as randomness. And that is useful, even though it is wrong.

We can, over short timespans, extrapolate from past behavior to anticipate likely future behavior. However, it is not a truly random variation; these variations only act and appear random. We must always bear that in mind.

Additionally, our choice of definitions plays a large role in what we see. We have arbitrarily decided what portion of ice cover in a given area constitutes being "ice covered" - essentially treated as 100% ice. The usual rule is that if the area in question is at least 15% ice covered, then we treat it as 100% ice covered.

This then leads to strange results. If we take the same large segment of ice and shatter it, moving the shattered parts away from each other, the ice area remains the same, while the extent can increase dramatically.

Here near the end of the ice, as the ice thins, we see this playing out over large areas. Extent remains artificially much higher, relative to ice area, than it would have been in decades gone by.

While it made sense to have such a rule when the only place it applied was around the edges of the ice sheet, the rule now serves to mislead us when the whole of the ice sheet is disintegrating. That misleading statistic can then lead us to erroneously project that the ice will last longer than other metrics, like ice volume and thickness, imply.

Sam