Judith, it is such a pleasure to read a truly objective thought process on this topic. Thank you for bringing Matt's article to us; it is clarifying and lets us see where the flaws in the current process lie. The logic of the IPCC is a mathematical logic executed in a vacuum. What I mean by that is that significant amounts of information remain highly uncertain that could have a tremendous effect. If the true purpose of the IPCC and climate scientists were to characterize the amount of heat attributable to CO2, the first step (assuming a model could not be constructed and shown to be demonstrably robust and powerful) would be to put bounds on all the other possibilities. The IPCC itself admits that clouds, oceans, and a few other things are poorly understood and could have considerable input. However, I don't see a focus in the IPCC documents, or among climate scientists, on estimating these variables. Possibly this is because the data about these phenomena are not available prior to 2000 in some cases, prior to 1980 in others, or not available at all even now.
The IPCC then asserts that its models are robust and that this therefore proves the CO2 hypothesis. However, an objective look at the models suggests that robustness is implausible on its face, considering their computational complexity, the errors in their numerical processes, and the number of assumptions in the underlying formulas. Each of those assumptions represents a testable experiment that, to my knowledge, has not been carried out. Since one cannot establish that the models are robust by examining their assumptions and approach, one can instead try to establish it from their results, by checking whether model output matches reality. This is a much weaker position, because we must then look for close matching of the models to new data (data not already incorporated into the models' parameters) to be sure the models are tracking it. I have not seen an objective study in which a model is calibrated on some input data and then asked to predict withheld data, to see whether it can handle "new" data; and even under that approach, the very existence of the withheld data clouds the surety of the conclusions, since the researchers were clearly aware of it when they constructed the models. The best test is with genuinely new data. Unfortunately, the data postdating the construction and parameter-fitting of most models is primarily from 2000 onward, and this data does not show the models in a good light. In fact it lends credence to the idea that the models are missing fundamental behaviors, have miscalculated the parameters of underlying processes, or may be wrong entirely in the equations underlying the atmospheric response.
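To make the hold-out test I am describing concrete, here is a minimal sketch of the check, using entirely synthetic data. The series, the 2000 split year, and the simple linear-trend "model" are all invented for illustration; nothing here is real climate data or GCM output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual "temperature anomaly" series, 1950-2020 (invented).
years = np.arange(1950, 2021)
truth = 0.01 * (years - 1950) + 0.1 * np.sin((years - 1950) / 8.0)
obs = truth + rng.normal(0.0, 0.05, size=years.size)

# Calibrate a simple linear-trend model on data up to 1999 ONLY.
train = years < 2000
coeffs = np.polyfit(years[train], obs[train], deg=1)
pred = np.polyval(coeffs, years)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# The honest test is the error on the years the model never saw.
in_sample = rmse(pred[train], obs[train])
out_sample = rmse(pred[~train], obs[~train])

print(f"in-sample RMSE:     {in_sample:.3f}")
print(f"out-of-sample RMSE: {out_sample:.3f}")
```

The point of the sketch is the split, not the model: any claim of skill should rest on the out-of-sample number, and a model that merely fits its calibration period will tend to degrade on the held-out years.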
Matt has clearly shown that an objective analysis cannot robustly conclude that CO2 is the root cause. Far too much is unknown to conclude that we really have any idea why this excursion has happened. The models are not robust enough to provide corroboration; any objective analysis of them would have to conclude that they are deeply flawed and do not contribute to a solution at this time. Saying "we just don't know" may be politically unacceptable, but it is the only objectively defensible answer. If people want to know the answer, they must invest more time and money in study and data collection, and it may take decades to gather enough data to accurately state the relative contribution, or even identify the majority cause.
Gavin has said in posts to you that he is 110% sure of the attribution of the temperature variance to CO2. That is a bizarre statement to me. He seems to base it on the models' predictions, which is clearly a false way to calculate such probabilities. It has become clear that the models were missing ocean capacitance and flows; if that is true, which seems indisputable, then the attribution of prior causes in the models starts to come apart. For instance, if ocean flows do account for some of the heat during the modern warming, it implies that some of the cooling in the prior period was related to this effect, and some of the warming before that as well. That means the attributions to CO2, aerosols, methane, and the other things used to explain those variations were wrong. Therefore, as you have pointed out, even using Gavin's model-based approach to certainty, the probability that CO2 is the "110%" cause of the modern warming is drastically reduced.
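The re-attribution argument here is essentially omitted-variable bias, and it can be illustrated with a toy regression. Every number and series below is invented for illustration (a made-up "CO2" ramp and a made-up "ocean cycle" that partly tracks it over the fitting window); nothing is drawn from any actual model or dataset:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented series: a rising "CO2 forcing" and an "ocean cycle" that
# partly tracks it over the fitting window (both hypothetical).
t = np.linspace(0.0, 1.0, 200)
co2 = t
ocean = 0.3 * co2 + 0.1 * np.sin(2.0 * np.pi * 3.0 * t)

# Toy "temperature": equal true contributions from both, plus noise.
temp = 1.0 * co2 + 1.0 * ocean + rng.normal(0.0, 0.02, size=t.size)

# Attribution ignoring the ocean term: the CO2 coefficient soaks up
# the ocean's share of the warming.
X_alone = np.column_stack([np.ones_like(t), co2])
coef_alone = np.linalg.lstsq(X_alone, temp, rcond=None)[0][1]

# Attribution with the ocean term included: the CO2 coefficient
# falls back toward its true value of 1.0.
X_joint = np.column_stack([np.ones_like(t), co2, ocean])
coef_joint = np.linalg.lstsq(X_joint, temp, rcond=None)[0][1]

print(f"CO2 coefficient, ocean omitted:  {coef_alone:.2f}")
print(f"CO2 coefficient, ocean included: {coef_joint:.2f}")
```

In this toy, leaving out the correlated ocean term inflates the apparent CO2 contribution; adding the term shrinks it. That is the mechanism by which a missing ocean process would undermine a model-derived certainty about attribution.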
The modelers have assumed the "hockey stick" and therefore conclude that the models are robust except possibly for the last 15 years. However, as Matt points out, it is now clear that the LIA and MWP were real events. The hockey stick has been disproven, and the PDO and AMO disturbances clearly show that significant cyclic effects exist in the system in hidden, un-modeled variables. This is a serious blow to the models: they must be shown to explain these prior variations before they can claim to robustly model what is happening today. This alone is enough to discount the models entirely. It is impossible to know how much each of these capacitance-type or other inputs has contributed to the results, and we are left again at the point of saying that it is impossible to ascribe any number with certainty.
I believe this is the only possible objective analysis; Matt has proven it to my satisfaction. It is perhaps regrettable and politically inconvenient that the result is unsatisfactory. It seems to me the only legitimate argument is that CO2 has some contribution, and that if we wish to avoid any human contribution then we should take measures to limit human impact. That is a reasonable argument, backed by the fact that at least some effect is coming from CO2.