**Talking about climate risk first demands that we share a common vernacular**

If you read through the Intergovernmental Panel on Climate Change (IPCC) reports you’ll encounter phrases such as:

“It is *very likely* that the response to anthropogenic forcing contributed to sea level rise during the latter half of the 20th century.” (IPCC 2007), and

“There is *high confidence* that natural systems related to snow, ice and frozen ground (including permafrost) are affected.”

At this point we start to realise that different terms are being used to describe various types of uncertainty. In fact the IPCC uses three separate scales: a qualitative matrix for *confidence of understanding*, a qualitative scale for *uncertainty* and finally a qualitative scale of *likelihood* (IPCC 2005).

In turn, the last two qualitative scales are also quantitatively calibrated, using odds and percentage probability ranges respectively.

So what do these qualitative terms actually mean? Reading the IPCC guidance notes for uncertainty (IPCC 2005) you find that terms such as ‘likely’ (1), ‘very likely’ and ‘virtually certain’ are intended to express the likelihood of an event occurring in the future, and that the term ‘likely’ in this case represents a probability bin of 66% to 90%.

One, slightly mechanistic, problem with the IPCC quantitative likelihood scale is that the probability bin ranges are actually unequal: 9%, 23%, 33%, 24% and 9% respectively.

This sort of construction tends to bias probability selections, which may (or may not) be desirable and intended. Unfortunately the purpose of this biasing is not explained in the guidance notes.
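The unequal spacing is easy to verify. As a minimal sketch, assuming the scale’s interior cut points are the thresholds given in the guidance notes (1%, 10%, 33%, 66%, 90% and 99%):

```python
# Interior cut points of the AR4 likelihood scale, in percent,
# assuming the thresholds 1, 10, 33, 66, 90 and 99 from the guidance notes.
cut_points = [1, 10, 33, 66, 90, 99]

# Width of each probability bin between adjacent cut points.
widths = [hi - lo for lo, hi in zip(cut_points, cut_points[1:])]
print(widths)  # [9, 23, 33, 24, 9]
```

The middle (‘about as likely as not’) bin is more than three times as wide as the outer bins, which is the asymmetry at issue.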

So what should we then make of the term ‘high confidence’ as an expression of uncertainty?

Again referring to the IPCC guidance, we find that this represents an assessment of uncertainty based on expert judgement as to the correctness of a model, an analysis or a statement (IPCC 2005) (2). The term ‘high confidence’ in this case represents a point probability of about eight out of ten.

Note that uncertainty is here expressed as ‘odds’, versus the more traditional Bayesian approach of expressing degree of belief as a percentage. Again, the purpose of this difference is not explained, nor is what is meant by the term ‘correctness’.
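The two framings are interconvertible, of course. A minimal sketch (the function names here are mine, not the IPCC’s) reading ‘about eight out of ten’ as a simple chance and re-expressing it as odds in favour:

```python
from fractions import Fraction

def chance_to_probability(favourable, total):
    """A statement like 'about eight out of ten' read as a simple chance."""
    return Fraction(favourable, total)

def probability_to_odds(p):
    """The same degree of belief re-expressed as odds in favour, p : (1 - p)."""
    return p / (1 - p)

p = chance_to_probability(8, 10)
print(float(p))                # 0.8, i.e. an 80% degree of belief
print(probability_to_odds(p))  # 4, i.e. odds of 4:1 in favour
```

The point is that nothing of substance hinges on which representation is used; the inconsistency between the scales is purely presentational, which makes the lack of explanation more puzzling.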

Yet another problem with the uncertainty scale is that it is expressed as a point probability.

Experience with the subjective estimation of probabilities (read: expert judgement) shows that people find it easier to estimate when presented with a set of probability bins rather than point values.

One may also call into question the use of the term ‘correct’ in relation to the uncertainty of models. All models contain simplifying assumptions, so fundamentally all models are incorrect to some extent; it’s just that some are less incorrect than others.

As a result, while we can say one model is *better* at predicting an outcome, we can’t say it is more *probable*, so the probability scale does not make sense in this case.

A more general question arises as to what is meant by the term *uncertainty*. For example, are we talking about the variability due to sampling from a frequency distribution? Or are we talking about empirical (epistemic) uncertainty that results from incomplete knowledge? Or both? Unfortunately the guidance notes are silent on what is meant by this key term.

Finally, the IPCC guidance notes are, unfortunately, just guidance, and the implementation of the standardised terminology and definitions is left to the discretion of individual lead authors. Thus a document presumably intended to standardise key uncertainty definitions is not actually enforced as a standard.

The intent of introducing a standardised lexicon of terms to address uncertainty is a worthy one, and it has certainly improved the AR4 report relative to its predecessor.

But by introducing terminology based on a loose set of definitions, while allowing the various working groups to apply it as they see fit (or substitute their own), we still, to my mind, *do not* have a transparent, consistent and defensible approach to uncertainty.

As these reports will be the basis for decisions that affect every man, woman and child on this planet, shouldn’t we expect better?

**References**

IPCC, Climate Change 2007: Synthesis Report, IPCC Plenary XXVII, Valencia, Spain, 12-17 November 2007.

IPCC, Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties, June 2005.

Manning, M.R., M. Petit, D. Easterling, J. Murphy, A. Patwardhan, H-H. Rogner, R. Swart, and G. Yohe (Eds), IPCC Workshop on Describing Scientific Uncertainties in Climate Change to Support Analysis of Risk and of Options: Workshop report. Intergovernmental Panel on Climate Change (IPCC), Geneva, 2004.

Wallsten, T.S., Budescu, D.V., Rapoport, A., Zwick, R., and Forsyth, B., Measuring the vague meanings of probability terms, Journal of Experimental Psychology, 1986.

**Notes**

1. Because these are qualitative terms, the range over which people interpret them in a naturalistic setting varies and overlaps, leading to ambiguity.

For example, the median value attributed to the term ‘possible’ has in studies been interpreted as anywhere from a probability near 1.0 to one near 0 (Wallsten 1986).

2. The final guidance notes’ scale for uncertainty (Table 3) also deviates from that proposed at the IPCC workshop. The workshop scale was much closer to the quantitative probability bins of Table 4.