Practical risk management in the presence of uncertainty

18/06/2015


A tale of another two reactors

There's been much debate over the years as to whether various risk tolerance approaches actually satisfy the legal principle of reasonable practicability. But to my mind there hasn't been much consideration of the value of simply adopting the legalistic approach in situations where we have a high degree of uncertainty about the likelihood of adverse events. In such circumstances, basing our decisions on what can turn out to be very unreliable estimates of risk can have extremely unfortunate consequences.

Let's consider the Fukushima Daiichi disaster as an example. One of the clear points of escalation during the disaster was the series of hydrogen explosions, fuelled by the reaction of the overheated zirconium alloy fuel cladding with steam. This is a known and well understood scenario for a reactor loss of coolant event; in fact it had previously occurred during the Three Mile Island accident. So what should we do about it? Well, that depends on what decision-making model you apply. If you apply a purely risk-based assessment then, given that the likelihood of a loss of coolant accident is assessed as sufficiently low, you need do nothing more. Of course you also need to be aware that any such decision is based on your social perception of the risk and not the reality, and be comfortable with that position (and good luck).

Fukushima NPP March 17 (Image Source: )

If on the other hand you ask yourself what could reasonably and practicably be done, then clearly you could install catalyst-based hydrogen recombiner units that eliminate the explosive build-up. Given the potential severity of the consequences, it's very unlikely that the cost of such a control would be found to be grossly disproportionate to the benefit accrued. This is not to say that the reasonably practicable principle doesn't have its own peculiar disadvantages, but when we are operating in a region of high epistemic uncertainty for high consequence systems, perhaps we should pick the set of decision-making criteria that is most immune to the effects of such uncertainty.
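As a rough illustration of how the gross disproportion test can be framed, here's a minimal sketch in Python. The disproportion factor, the dollar figures and the function name are all assumptions invented purely for illustration; they are not drawn from any actual analysis of the Fukushima plant or of hydrogen recombiners.

```python
# Minimal sketch of a "gross disproportion" test as used in reasonably
# practicable (ALARP-style) arguments. All figures are illustrative
# assumptions only, not estimates for any real plant or control.

def grossly_disproportionate(control_cost, risk_benefit, factor=10.0):
    """A control can only be rejected if its cost exceeds the risk
    reduction benefit by more than the chosen disproportion factor."""
    return control_cost > factor * risk_benefit

# Hypothetical numbers: a hydrogen recombiner costing $5M against a
# probability-weighted accident cost reduction of $20M.
control_cost = 5e6      # cost of installing the control
risk_benefit = 20e6     # expected value of the harm averted

if grossly_disproportionate(control_cost, risk_benefit):
    print("Cost grossly disproportionate - control could be rejected")
else:
    print("Not grossly disproportionate - control should be implemented")

# The catch the post is pointing at: when the likelihood estimate is
# highly uncertain, the risk_benefit term is unreliable, and the whole
# comparison inherits that uncertainty.
```

Note that in the reasonably practicable framing the burden falls on showing that the cost is grossly disproportionate before a control can be rejected, which is what makes it more robust when the likelihood term is little better than a guess.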

Perhaps another example will better illustrate the second approach. In 1957 the United Kingdom suffered the worst reactor accident in the history of its nuclear program. The Windscale fire, as it came to be known, occurred within the graphite core of Pile No. 1 after the build-up of Wigner energy in the graphite of the pile. The fire burned for three days and there was a significant release of radioactive contamination across the UK and Europe, leading to an estimated 240 additional cancer cases.

Sellafield (Image Source: Chris Eaton, CC BY-SA 2.0)

But the Windscale accident could have been much, much worse than it was. Initial plans for the air cooled reactor used a straight-through air flow, with the hot air from the core being exhausted straight up the stack. All good as long as the integrity of the fuel cladding was not breached; but if it were to be damaged then the fuel could catch fire, with catastrophic consequences. Terence Price, one of the physicists on the project, raised the issue during a design committee meeting, only to find that it was dismissed as too difficult to deal with and not even recorded in the minutes. As Leonard Owen, the chair of the committee, put it, "Don't be so silly lad… Two tons of air go up that stack every second. Can't filter that." Luckily, Sir John Cockcroft was sufficiently alarmed by the potential consequences to order that filters be installed, contrary to the committee's position. Given the chimneys had already been built, this meant the filters had to be placed at the top of the stacks at considerable expense, earning them the moniker of "Cockcroft's Folly" among workers and engineers alike. Until the day of the fire…

Clearly the committee's inaction was informed by a sense that such an event was so unlikely that nothing need be done; how else could a position of doing nothing be sustained by a group of otherwise rational professionals? In other words, at some level they had engaged in probabilistic thinking. Just as clearly, had that position been allowed to stand, a good portion of the English countryside would now be as heavily contaminated with fallout as Chernobyl is today. Fortunately Cockcroft's imagining of what might be possible ensured that this wasn't the case. Another interesting fact about this second example is that it was not the original failure mode identified by Terence Price that triggered the fire, but (most likely) a Wigner energy hotspot, the risks of which were then poorly understood.

Albeit with the benefit of hindsight, it's difficult to see how a realistic appreciation of the overall risk could have been arrived at, given the developmental nature of the design, not to mention that certain parts of it were shrouded in secrecy. For example, the reactor's core also contained so-called outer loops used for the production of polonium, a vital component of the UK's H-bomb program, which were highly classified. Similarly the committee were not, nor could they have been, aware of the subsequent progressive removal of safety measures to increase the productivity of the reactor. In such circumstances any probabilistic reasoning, whether implicit or explicit, is inherently unreliable, and a precautionary possibilistic design, based on the possibilities inherent in the situation, will be more effective than gambling that an event will not occur.

The take-home is that as the severity of the hazards increases, the uncertainty around our estimates of both likelihood and severity also increases. Likewise, when we develop new technologies we need to be aware that the degree of epistemic and ontological risk we expose ourselves to is that much greater. Finally, we need to be aware of the significant adverse effects that biases and heuristics can have upon the assessment of such risks. In such circumstances, identifying what the worst outcome is and what we may reasonably and practicably do to mitigate it is probably the most robust approach that we can muster.

Note

I'd originally included this discussion in the preceding post, but in retrospect I think it deserves separate treatment.
