Looking at a system with distinct investigative tools (e.g., different kinds of models) makes it possible to compare their results when they are applied to a common problem. This is precisely what happens in climate attribution through the use of ensembles of Global Climate Models (GCMs).
GCMs are the standard tools for investigating the behavior of the Earth's climate. They are built on our theoretical knowledge (expressed as equations) of how the individual subsystems that make up the climate system work. Many GCMs have been developed by research centers around the world, and they all give the same answer about the causes of recent global warming: if the anthropogenic forcings had been held at their pre-industrial levels, the recent warming would not have happened.
The fact that all GCMs agree might be taken as a clear sign that this result is robust, provided that certain conditions hold, in particular the independence of the models involved. However, dynamical models are not really independent: they are historically linked by a common origin and by relations of mutual derivation, they follow the same modelling and methodological scheme (e.g., decomposition of the climate system into subsystems and their recomposition), and so on.
In a recent opinion paper published in WIREs Climate Change, Mazzocchi and Pasini discuss the difficulties of applying the idea of robustness to climate attribution studies, i.e., the search for the fundamental causes of recent global warming, when ensembles of GCMs are used.
They propose a different scheme of investigation, a "multi-approach" strategy, which could better satisfy the condition of independence and thus lead to more genuinely robust results. Such a strategy combines GCMs, of course, with data-driven models that have recently been applied to the attribution problem: the authors refer to neural network models and Granger causality analyses. These tools are borrowed from artificial intelligence and econometrics, respectively, and they rely on data rather than on our prior dynamical knowledge of the climatic subsystems.
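As a rough illustration of the second kind of tool, the sketch below runs a standard Granger causality test on synthetic, purely hypothetical forcing and temperature series. It is not the authors' analysis; a real study would use observed records and check the stationarity of the series first.

```python
# Minimal sketch (not the authors' code): testing whether a "forcing" series
# Granger-causes a "temperature" series, using synthetic placeholder data.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n_years = 150

# Hypothetical annual series: a slowly rising forcing and a temperature that
# lags behind it, plus noise. In practice the series should be made
# stationary (e.g., detrended or differenced) before running the test.
forcing = np.linspace(0.0, 2.5, n_years) + rng.normal(0, 0.1, n_years)
temperature = 0.4 * np.roll(forcing, 5) + rng.normal(0, 0.1, n_years)
temperature[:5] = rng.normal(0, 0.1, 5)  # overwrite the wrapped-around values

# Column order matters: the test asks whether the 2nd column helps predict
# the 1st column beyond what the 1st column's own past values already explain.
data = np.column_stack([temperature, forcing])
results = grangercausalitytests(data, maxlag=6, verbose=False)

for lag, (tests, _) in results.items():
    f_stat, p_value = tests["ssr_ftest"][0], tests["ssr_ftest"][1]
    print(f"lag {lag}: F = {f_stat:.2f}, p = {p_value:.4f}")
```

A small p-value at some lag would indicate that past values of the forcing improve the prediction of temperature, which is the data-driven notion of causality used in such analyses.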
What is striking is that an ensemble of dynamical models and another ensemble made up of the two data-driven models yield the same result: if the anthropogenic forcings are set to their preindustrial values, the recent increase in temperature disappears and the temperature behavior remains roughly constant. The reliability of this result is strengthened by the fact that it is obtained through two different, independent ensembles of models, each of which is a plausible means of performing attribution studies. On this account, the authors argue that the multi-approach strategy should be considered in future studies on climate attribution.
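To make that counterfactual concrete, here is a minimal sketch of the kind of experiment described above, using a small neural-network regressor and synthetic placeholder data (again, not the authors' models or data): the regressor is trained on forcings, then re-run with the anthropogenic forcing frozen at its initial, "preindustrial" value.

```python
# Minimal sketch of a data-driven counterfactual attribution experiment,
# with synthetic placeholder forcings (not the authors' models or data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_years = 150

# Hypothetical predictors: a rising anthropogenic forcing, an oscillating
# natural forcing, and a noisy "observed" temperature response.
anthropogenic = np.linspace(0.0, 2.5, n_years)
natural = 0.2 * np.sin(np.arange(n_years) / 8.0)
temperature = 0.4 * anthropogenic + natural + rng.normal(0, 0.05, n_years)

X = np.column_stack([anthropogenic, natural])
model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                     max_iter=5000, random_state=0)
model.fit(X, temperature)

# Counterfactual: hold the anthropogenic forcing at its initial
# ("preindustrial") value while keeping the natural forcing unchanged.
X_counterfactual = np.column_stack([np.full(n_years, anthropogenic[0]), natural])

warming_actual = model.predict(X)[-10:].mean() - model.predict(X)[:10].mean()
warming_counterfactual = (model.predict(X_counterfactual)[-10:].mean()
                          - model.predict(X_counterfactual)[:10].mean())
print(f"simulated warming with observed forcings:   {warming_actual:.2f} K")
print(f"simulated warming with fixed anthropogenic: {warming_counterfactual:.2f} K")
```

In this toy setup the simulated warming largely vanishes once the anthropogenic forcing is held fixed, which mirrors, in miniature, the common outcome of the dynamical and data-driven ensembles discussed above.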
Incidentally, this approach to robustness also contributes to the climate debate in the media, where the anthropogenic influence on global warming is often denied on the basis of criticisms of GCMs. Here Mazzocchi and Pasini show that, even when very different types of models are used, the final result is essentially the same: human activity has played, and continues to play, a fundamental role in causing the recent global warming.
Kindly contributed by Fulvio Mazzocchi & Antonello Pasini.
Featured image courtesy of artist Germana Della Rocca of germart.wixsite.com/artworkofjermana