Climate models are used, in part, to determine future climate change scenarios related to anthropogenic global warming (AGW) and are described by the Intergovernmental Panel on Climate Change (IPCC) as “mathematical representations of the climate system, expressed as computer codes and run on powerful computers”.
Furthermore, the IPCC states that climate models:
… are derived from fundamental physical laws (such as Newton’s laws of motion), which are then subjected to physical approximations appropriate for the large-scale climate system, and then further approximated through mathematical discretization. Computational constraints restrict the resolution that is possible in the discretized equations, and some representation of the large-scale impacts of unresolved processes is required (the parametrization problem).
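To make the idea of "mathematical discretization" concrete, the toy sketch below advances a one-dimensional diffusion (heat) equation with a finite-difference scheme: a continuous physical law is replaced by a stepped numerical approximation whose accuracy depends on the chosen grid spacing and time step. Everything here is a purely illustrative toy, not code from any actual climate model.

```python
def diffuse(temps, kappa, dx, dt, steps):
    """Advance dT/dt = kappa * d2T/dx2 with an explicit finite-difference scheme.

    The continuous equation is approximated on a grid of spacing dx and
    time step dt; the boundary values are held fixed. The scheme is only
    stable when kappa * dt / dx**2 <= 0.5 -- a computational constraint
    of exactly the kind the IPCC passage alludes to.
    """
    t = list(temps)
    for _ in range(steps):
        new = t[:]
        for i in range(1, len(t) - 1):
            # Central-difference approximation of the second spatial derivative:
            new[i] = t[i] + kappa * dt / dx**2 * (t[i + 1] - 2 * t[i] + t[i - 1])
        t = new
    return t

# A coarse five-point grid: a sharp temperature spike smooths out over time.
profile = [0.0, 0.0, 10.0, 0.0, 0.0]
print(diffuse(profile, kappa=1.0, dx=1.0, dt=0.1, steps=50))
```

Any process happening at scales finer than the grid spacing simply does not exist for this scheme; in a real model its bulk effect must be supplied by hand, which is the parametrization problem described above.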
In other words, a climate model is a numerical model: a simplified mathematical representation of the Earth’s climate system, or parts thereof. It incorporates data from real-world observations and introduces parameters or variables to stand in for unresolved or unknown processes.
The ability of a model to simulate interactions within the climate system depends not only on the level of understanding of the physical, geophysical, chemical and biological processes that govern the climate system, but also on how accurately those processes are expressed as algorithms within the model, and on how closely the results match real-world data.
These models do contain some well-established science, but they also contain implicit and explicit assumptions, guesses, and gross approximations, referred to as parameters (the parametrization problem mentioned above); a mistake in any of these can invalidate the model outputs when compared with real-world observations.
In other words, computer models are just concatenations of theoretical calculations; as such, they do not constitute evidence.
Climate models are data- and parameter-dependent. Data are based on direct or indirect observations of the environment; parameters (or parametrizations) are defined by the IPCC as:
… typically based in part on simplified physical models of the unresolved processes … Some of these parameters can be measured, at least in principle, while others cannot. It is therefore common to adjust parameter values (possibly chosen from some prior distribution) in order to optimise model simulation of particular variables or to improve global heat balance. This process is often known as “tuning”. [Author’s emphasis.]
Tuning is considered justifiable if two conditions are met: first, that parameter ranges do not exceed observational ranges where applicable (though this does not necessarily constrain parameter values, which could lead to problems with model output); and second, that adjusted (or tuneable) parameters are allotted fewer degrees of freedom than the observational constraints used in the model’s evaluation. The IPCC states that:
… if the model has been tuned to give a good representation of a particular observed quantity, then agreement with that observation cannot be used to build confidence in that model. However, a model that has been tuned to give a good representation of certain key observations may have a greater likelihood of giving a good prediction than a similar model … that is less closely tuned. [Author’s emphasis.]
Herein lies a problem with modeling: that last sentence implies a subjective judgment on the part of the modeler that the closely tuned model is more likely to provide a good prediction than a similar, less closely tuned model. In other words, there is the possibility that tuneable parameters can be used as a “fudge” factor, in either model prediction or hindcasting (making the model fit already-observed data).
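The mechanics of tuning can be sketched in miniature. Below, a single free parameter of a deliberately trivial toy model is adjusted until the model output best matches a set of observations; the model, the parameter, and the numbers are all invented for this illustration and represent no real climate model or dataset.

```python
def toy_model(forcing, sensitivity):
    """A hypothetical one-parameter model: response = sensitivity * forcing."""
    return [sensitivity * f for f in forcing]

def misfit(predicted, observed):
    """Sum of squared differences between model output and observations."""
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

def tune(forcing, observed, candidates):
    """'Tuning': pick the candidate parameter value that minimises the misfit."""
    return min(candidates, key=lambda s: misfit(toy_model(forcing, s), observed))

forcing = [1.0, 2.0, 3.0, 4.0]                   # invented inputs
observed = [0.8, 1.9, 3.2, 3.9]                  # invented "observations"
candidates = [0.5 + 0.1 * i for i in range(11)]  # trial values 0.5 .. 1.5

best = tune(forcing, observed, candidates)
print(best)
```

The point of the sketch is the circularity the IPCC passage warns of: once the parameter has been chosen precisely because it reproduces these observations, agreement with those same observations says nothing about the model's skill elsewhere.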
Copyright Ian Read. All rights reserved. Fair use provisions apply.