…
I could pull up images from many prior posts to show that effective ECS, and its impact on GMSTA, may already be increasing, but that consensus climate scientists fail to make such attributions.
…
In my opinion, one major reason that consensus climate science exposes modern society to major unacknowledged climate change risks is that it emphasizes deductive and inductive reasoning at the expense of abductive reasoning. Thus, I believe that climate science could learn a lot from the use of abductive learning in AI programs, as discussed in the linked video and its associated reference, in order to improve the attribution of feedback mechanisms, such as ice-climate feedbacks, within ESM runs.
Title: "Wang-Zhou Dai: Bridging Machine Learning and Logical Reasoning by Abductive Learning"
Extract: "Perception and reasoning are two representative abilities of intelligence that are integrated seamlessly during problem-solving processes. In the area of artificial intelligence (AI), perception is usually realised by machine learning and reasoning is often formalised by logic programming. However, the two categories of techniques were developed separately throughout most of the history of AI. This talk will introduce the abductive learning framework targeted at unifying the two AI paradigms in a mutually beneficial way. In this framework, machine learning models learn to perceive primitive logical facts from the raw data, while logical reasoning is able to correct the wrongly perceived facts for improving the machine learning models. We demonstrate that by using the abductive learning framework, computers can learn to recognise numbers and resolve equations with unknown arithmetic operations simultaneously from images of simple hand-written equations. Moreover, the learned models can be generalized to complex equations and adapted to different tasks, which is beyond the capability of state-of-the-art deep learning models."
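As a concrete illustration of the loop the abstract describes, the following Python sketch is my own toy reconstruction (the names, the stand-in "perception" scores, and the arithmetic constraint are all hypothetical, not taken from the talk or paper). It shows how a logical consistency check can correct a wrongly perceived fact instead of trusting the perception model's top-1 guesses:

```python
from itertools import product

# Toy "perception": for each input image, ranked candidate digits with
# confidence scores (stand-ins for a neural net's softmax outputs --
# hypothetical values chosen for illustration).
perceived = [
    {3: 0.6, 5: 0.3},   # first operand: the net favours 3, but the truth is 5
    {2: 0.9, 7: 0.05},  # second operand
    {7: 0.8, 9: 0.1},   # result digit
]

def consistent(a, b, c):
    """Background knowledge: the perceived symbols must satisfy a + b = c."""
    return a + b == c

def abduce(perceived):
    """Return the highest-scoring joint labelling consistent with the logic.

    This is the abductive step: logical reasoning corrects facts the
    perception model got wrong, rather than accepting its top-1 guesses.
    """
    best, best_score = None, -1.0
    for combo in product(*(d.items() for d in perceived)):
        labels = [digit for digit, _ in combo]
        score = 1.0
        for _, p in combo:
            score *= p
        if consistent(*labels) and score > best_score:
            best, best_score = labels, score
    return best

# The top-1 guesses (3, 2, 7) violate a + b = c, so abduction
# revises the first digit to 5, yielding the consistent 5 + 2 = 7.
print(abduce(perceived))  # -> [5, 2, 7]
```

In the full ABL framework, the corrected labels would then be fed back as training targets to improve the perception model; this sketch only shows the correction step.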
See also:
Title: "Bridging Machine Learning and Logical Reasoning by Abductive Learning", 2019
https://papers.nips.cc/paper/8548-bridging-machine-learning-and-logical-reasoning-by-abductive-learning

As background on this matter, there are, generally speaking, three types of reasoning: deductive, inductive, and abductive. Unlike deduction or induction, where science starts with cases to draw conclusions about a rule, or vice versa, with abduction scientists generate a hypothesis to explain the relationship between a case and a rule. Abduction also assumes there is insufficient evidence to deduce any single explanation or cause. More concisely, in abductive reasoning scientists make an educated guess, as illustrated by the following quote from Josephson & Josephson:
"Abduction, or inference to the best explanation, is a form of inference that goes from data describing something to a hypothesis that best explains or accounts for the data.
D is a collection of data (facts, observations, givens).
H explains D (would, if true, explain D).
No other hypothesis can explain D as well as H does.
... Therefore, H is probably true."
– Josephson & Josephson, Abductive Inference
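The schema above can be phrased as a small selection procedure: given data D and candidate hypotheses, accept the hypothesis that explains D clearly better than every rival, and withhold judgement otherwise. A minimal sketch, in which the scoring function, the margin, and the example hypotheses are all hypothetical illustrations of my own, not from Josephson & Josephson:

```python
def best_explanation(data, hypotheses, explains, margin=0.2):
    """Josephson-style abduction: accept H only if it explains D clearly
    better than every rival; otherwise return None (withhold judgement).

    explains(h, data) -> score in [0, 1]. The margin encodes the condition
    that "no other hypothesis can explain D as well as H does".
    """
    scored = sorted(((explains(h, data), h) for h in hypotheses), reverse=True)
    (s1, h1), (s2, _) = scored[0], scored[1]
    return h1 if s1 - s2 >= margin else None

# Hypothetical example: which mechanism best explains an observed warming
# trend? The fit scores are invented purely for illustration.
data = {"trend_K_per_decade": 0.3}
hypotheses = ["forcing", "internal_variability"]
fit = {"forcing": 0.9, "internal_variability": 0.4}

print(best_explanation(data, hypotheses, lambda h, d: fit[h]))  # -> forcing
```

Note that the conclusion is only "probably true": a better-scoring hypothesis arriving later can overturn it, which is what makes abduction defeasible.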
Part of what makes the application of abductive reasoning challenging is that scientists have to infer a few likely hypotheses from a truly infinite set of possible explanations. This is significant because, when faced with complex problems, part of the way humans solve them is by tinkering. We typically play, trying several approaches and keeping our value systems fluid as we search for potential solutions. Specifically, we generate hypotheses. Where a computer might be stuck in an endless loop, iterating over infinite explanations, we use our value systems to quickly infer which explanations are both valid and likely.
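A computational analogue of the "value system" described above is a prior that prunes the hypothesis space before any expensive checking, so the search never has to enumerate the infinite tail of implausible explanations. A hypothetical sketch (candidate names and prior weights are invented for illustration):

```python
import heapq

def plausible_first(candidates, prior, k=3):
    """Rank candidate explanations by a prior 'value system' and keep only
    the top k, instead of iterating over the whole (potentially endless)
    set of logically possible explanations."""
    return heapq.nlargest(k, candidates, key=prior)

# Hypothetical candidate explanations for a data anomaly, with a made-up
# prior standing in for a scientist's judgement of plausibility.
candidates = ["sensor_drift", "cloud_feedback", "aliens", "volcanic_aerosols"]
prior = {"sensor_drift": 0.5, "cloud_feedback": 0.3,
         "aliens": 0.0001, "volcanic_aerosols": 0.15}.get

print(plausible_first(candidates, prior, k=2))
# -> ['sensor_drift', 'cloud_feedback']
```

Only the surviving candidates would then be tested in detail, which is how abductive search stays tractable.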
Finally, conventional AI computers/machines face considerable difficulty in deciding which data constitute real information, a problem that consensus climate scientists also face when trying to attribute cause to the various mechanisms modeled within their ESM runs.