Archives For Uncertainty

NASA safety handbook cover

Way, way back in 2011 NASA published the first volume of its planned two-volume epic on system safety, titled, strangely enough, “NASA System Safety Handbook Volume 1, System Safety Framework and Concepts for Implementation”. Catchy, eh?

Continue Reading…

Process is no substitute for paying attention

As Weick has pointed out, to manage the unexpected we need to be reliably mindful, not reliably mindless. Obvious as that truism may be, those who invest heavily in plans, procedures, process and policy also end up perpetuating and reinforcing a whole raft of expectations, and thus investing in an organisational culture of mindlessness rather than mindfulness.

Continue Reading…

An articulated guess beats an unspoken assumption

Frederick Brooks

A point that Fred Brooks makes in his recent work, The Design of Design, is that it’s wiser to make specific assumptions explicitly, even if that entails guessing their values, than to leave an assumption unstated and vague because ‘we just don’t know’.

Brooks notes that while specific and explicit assumptions may be questioned, implicit and vague ones definitely won’t be. If a critical aspect of your design rests upon such fuzzy unarticulated assumptions, then the results can be dire. Continue Reading…
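
To make the point concrete, here’s a minimal sketch in Python (every name and number in it is invented for illustration) of what an articulated guess can look like: a value stated explicitly alongside the basis for guessing it, visible enough for a reviewer to challenge.

```python
# A minimal sketch of articulating design assumptions explicitly.
# All names and figures here are invented for illustration.

from dataclasses import dataclass

@dataclass(frozen=True)
class Assumption:
    """A stated guess: a value plus the basis for guessing it."""
    value: float
    basis: str  # recorded so a reviewer can challenge the guess

# Articulated guesses: visible, questionable, revisable.
PEAK_USERS = Assumption(10_000, "marketing estimate, unvalidated; +/-50%")
MSG_RATE_HZ = Assumption(5.0, "engineering judgement; no field data yet")

def design_load() -> float:
    """Sizing load derived from the stated assumptions above."""
    return PEAK_USERS.value * MSG_RATE_HZ.value

print(f"design load: {design_load():,.0f} msg/s")
```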

While I’m on the subject of visualising risk, the Understanding Uncertainty site run by the University of Cambridge’s Winton Group gives some good examples of how visualisation techniques can be used to present risk.

In June of 2011 the Australian Safety Critical Systems Association (ASCSA) published a short discussion paper on what they believed to be the philosophical principles necessary to successfully guide the development of a safety critical system. The paper identified eight management and eight technical principles, but do these principles do justice to the purported purpose of the paper?

Continue Reading…

One of the canonical design principles of the nuclear weapons safety community is to base the behaviour of safety devices upon fundamental physical principles.

Continue Reading…

I’ve recently been reading John Downer on what he terms the Myth of Mechanical Objectivity. To summarise John’s argument, he points out that once the risk of an extreme event has been ‘formally’ assessed as being so low as to be acceptable, it becomes very hard for society and its institutions to justify preparing for it (Downer 2011).

Continue Reading…

Did the designers of the Japanese seawalls consider all the factors?

In an eerie parallel with the Blayais nuclear power plant flooding incident, it appears that the designers of tsunami protection for the Japanese coastal cities and infrastructure hit by the 2011 earthquake did not consider all the combinations of environmental factors that determine the height of a tsunami.

Continue Reading…

Out of the loop: aircrew and unreliable airspeed at high altitude

The BEA’s third interim report on AF 447 highlights the vulnerability of aircrew when their usually reliable automation fails in the challenging operational environment of high altitude flight.

This post is part of the Airbus aircraft family and system safety thread.

Continue Reading…

Why more information does not automatically reduce risk

I recently re-read the article Risks and Riddles by Gregory Treverton on the difference between a puzzle and a mystery. Treverton’s thesis, taken up by Malcolm Gladwell in Open Secrets, is that there is a significant difference between puzzles, in which the answer hinges on a known missing piece, and mysteries, in which the answer is contingent upon information that may be ambiguous or even in conflict. Continue Reading…

In a previous post I discussed how in HOT systems the operator will inherently be asked to intervene in situations that the designer did not plan for. As such situations are inherently not ‘handled’ by the system, this has strong implications for the design of the human machine interface.

Continue Reading…

The past is prologue to the present

I’m currently reading a report prepared by MIT’s Humans and Automation Laboratory on a conceptual design for the Altair lunar lander’s human machine interface. Continue Reading…

Why taking risk is an inherent part of the human condition

On the 6th of May 1968 Neil Armstrong stepped aboard the Lunar Landing Research Vehicle (LLRV) for a routine training flight. During the flight the vehicle went out of control and crashed, with Armstrong ejecting to safety seconds before impact. Continue Reading…

For the STS-134 mission NASA estimated a 1 in 90 chance of loss of vehicle and crew (LOCV), based on a probabilistic risk assessment (PRA). But should we believe this number?

Continue Reading…
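
For a rough cross-check on that figure (my own back-of-envelope arithmetic, emphatically not NASA’s PRA method), compare it with the fleet’s actual record up to STS-134: roughly 133 flights and two LOCV events, Challenger and Columbia.

```python
# Back-of-envelope check of the 1-in-90 PRA figure against the shuttle's
# actual record: roughly 133 flights before STS-134, 2 LOCV events
# (Challenger and Columbia). A sanity check, not NASA's PRA method.

flights, losses = 133, 2

# Raw frequency (maximum-likelihood) estimate of per-flight LOCV probability
p_mle = losses / flights                    # ~0.015, i.e. about 1 in 66

# Simple Bayesian estimate using a Jeffreys Beta(0.5, 0.5) prior
p_bayes = (losses + 0.5) / (flights + 1.0)  # ~0.019, i.e. about 1 in 54

for label, p in [("raw frequency", p_mle), ("Jeffreys posterior mean", p_bayes)]:
    print(f"{label}: p = {p:.4f} (about 1 in {1 / p:.0f})")
```

Both estimates sit closer to 1 in 60 than 1 in 90. That doesn’t make the PRA wrong, since it credits post-Columbia fixes that the raw record can’t see, but it does suggest the number deserves scrutiny.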

In a series of occasional posts on this blog I’ve discussed some of the pitfalls of heuristics-based decision making, as well as the risks of making decisions on incomplete information or under time pressure. As an aid to the reader I’ve provided a consolidated list here.

Continue Reading…

People tend to seek out and interpret information that reinforces their beliefs, especially in circumstances of uncertainty or when emotion attaches to the issue. This bias is known as confirmation, or ‘myside’, bias. So what can you do to guard against the internal ‘yes man’ echoing back your own beliefs?

Continue Reading…

This railway crossing near miss, due to a driver ‘racing the devil’, is on the face of it a classic example of the perversity of human behaviour. But on closer examination it illustrates the risk we introduce when transitioning from a regime of approved operational procedures to one that is merely accepted or tolerated.

Continue Reading…

The Titanic effect

So why did the Titanic sink? The answer highlights the role of implicit design assumptions in complex accidents, and the interaction of design with the operation of safety critical systems.

Continue Reading…

Prior to its fourth assessment the IPCC issued a set of lead author guidance notes on how to describe uncertainty. In them the IPCC laid out a methodology for dealing with various classes of uncertainty. Unfortunately the guidance also fell into a fatal trap.

Continue Reading…

One of the traditional approaches to reducing risk is to work on reducing the severity (S) of an accident rather than the likelihood (L) of its occurrence. As the classical definition of risk is R = L × S, reducing S reduces the risk. Simple really… Continue Reading…
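
The arithmetic, for what it’s worth, in a trivial Python sketch with illustrative numbers only: halving severity halves the risk exactly as halving likelihood would.

```python
# The classical risk formula R = L x S, with illustrative numbers only.

def risk(likelihood: float, severity: float) -> float:
    """Point-estimate risk: likelihood multiplied by severity."""
    return likelihood * severity

baseline  = risk(likelihood=1e-3, severity=100.0)  # untreated hazard
mitigated = risk(likelihood=1e-3, severity=50.0)   # severity halved

print(f"baseline:  {baseline:.3f}")   # 0.100
print(f"mitigated: {mitigated:.3f}")  # 0.050, half the risk
```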

One of the tenets of safety engineering is that simple systems are better. Many practical reasons are advanced to justify this assertion, but I’ve always wondered what theoretical justification, if any, there is for such a position.

Continue Reading…
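
One candidate justification is the standard series-reliability argument, sketched below in Python (my illustration, not necessarily the line the post takes): if all n independent parts must work for the system to work, system reliability is the product of the parts’ reliabilities, and so decays geometrically with part count.

```python
# Series reliability: a system needing all n independent components to work
# has reliability r**n, so complexity is punished geometrically.

def series_reliability(r: float, n: int) -> float:
    """Reliability of n independent components in series, each of reliability r."""
    return r ** n

for n in (1, 10, 100, 1000):
    print(f"n = {n:4d}: system reliability = {series_reliability(0.999, n):.3f}")

# n =    1: 0.999
# n =   10: 0.990
# n =  100: 0.905
# n = 1000: 0.368  (a thousand 'three nines' parts fail more often than not)
```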

If you read through the Intergovernmental Panel on Climate Change (IPCC) reports you’ll strike qualitative phrases such as ‘likely’ and ‘high confidence’ used to describe uncertainty. But is there a credible basis for these terms?

Continue Reading…
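
For reference, the AR4 lead-author guidance does map those phrases onto probability bands, paraphrased in the sketch below (check the guidance note itself before relying on my rendering). Notice that the bands overlap, which hints at the ambiguity in question.

```python
# A rough rendering of the IPCC AR4 likelihood scale (paraphrased; the
# guidance note is the authority). Note the deliberately overlapping bands.

IPCC_LIKELIHOOD = {
    "virtually certain":      (0.99, 1.00),
    "very likely":            (0.90, 1.00),
    "likely":                 (0.66, 1.00),
    "about as likely as not": (0.33, 0.66),
    "unlikely":               (0.00, 0.33),
    "very unlikely":          (0.00, 0.10),
    "exceptionally unlikely": (0.00, 0.01),
}

def describe(p: float) -> str:
    """Return the first matching term, in the order listed above."""
    for term, (lo, hi) in IPCC_LIKELIHOOD.items():
        if lo <= p <= hi:
            return term
    return "unclassified"

print(describe(0.95))  # 'very likely', though 'likely' would also be true
```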

Risk - what is it?

What do an eighteenth century mathematician and a twentieth century US Secretary of Defence have in common?

The answer is that both men thought about uncertainty and risk, and the definitions of risk and uncertainty that they separately arrived at neatly illustrate that there is a little more to the concept of risk than just likelihood multiplied by consequence. Continue Reading…
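
As a point of departure, here’s the textbook likelihood-times-consequence calculation as a worked sum in Python (numbers illustrative only). Note that the total is dominated by the rare, severe outcome, exactly where the probability estimate is shakiest.

```python
# Risk as expected loss: the sum of probability x consequence over outcomes.
# Illustrative numbers only.

outcomes = [           # (probability, consequence) pairs for one hazard
    (0.10, 1.0),       # frequent, minor
    (0.01, 50.0),      # rare, serious
    (0.001, 5000.0),   # very rare, catastrophic (dominates the sum)
]

risk = sum(p * c for p, c in outcomes)
print(f"expected loss: {risk:.1f}")  # 0.1 + 0.5 + 5.0 = 5.6
```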