Archives For Safety culture

Fraud and framing

21/10/2013 — 1 Comment

In a slight segue, I was reading Bruce Schneier’s blog on security and came across this post on the psychology behind fraud. Bruce points to this post on why, yes I know, ‘good people do bad things’. The explanation that researchers such as Ann Tenbrunsel of Notre Dame offer is that, just as we are boundedly rational in other aspects of decision making, so too are we boundedly rational in our ethical decisions.

In particular, the way in which decision problems are framed seems to have a great impact upon how we make decisions. Put simply, if a problem is framed without an ethical dimension, decision makers are much less likely to consider that aspect.

In addition to framing effects, researchers studying collusion in fraud cases found that most people seem to act from an honest desire simply to help others, regardless of any attendant ethical issues.

What fascinates me is how closely such research parallels the work on system safety and human error. Clearly, if management works within a frame based upon performance and efficiency, they are simply going to overlook the downside completely, and a desire to be helpful explains why everyone else ‘goes along for the ride’.

There is, as I see it, a concrete recommendation that comes out of this research and that we can apply to safety: fundamentally, safety management systems need to be designed to take account of our weaknesses as boundedly rational actors.

From Les Hatton, here’s how, in four easy steps:

  1. Insist on using R = F x C in your assessment. This will panic HR (People go into HR to avoid nasty things like multiplication.)
  2. Put “end of universe” as risk number 1 (Rationale: R = F x C. Since the end of the universe has an infinite consequence C, then no matter how small the frequency F, the Risk is also infinite)
  3. Ignore all other risks as insignificant
  4. Wait for call from HR…

A humorous note, amongst many, in an excellent presentation on the fell effect that bureaucracies can have upon the development of safety-critical systems. I would add my own small corollary: when you see warning notes on microwaves and hot water services, the risk assessment lunatics have taken over the asylum…
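For what it’s worth, here’s a minimal sketch of the arithmetic Hatton’s joke turns on (the risk names and numbers below are invented purely for illustration): once a register contains an entry with an infinite consequence, R = F x C ranks it above everything else, no matter how small its frequency.

```python
# A rough illustration of the R = F x C joke above. All risk names and
# numbers here are hypothetical, chosen only to show the ranking effect.
risks = {
    "end of universe": (1e-12, float("inf")),  # (frequency per year, consequence)
    "plant outage": (0.2, 5_000_000),
    "slip on wet floor": (3.0, 10_000),
}

# Rank the register by R = F x C: any non-zero frequency multiplied by an
# infinite consequence gives an infinite risk, which swamps every other entry.
for name, (f, c) in sorted(risks.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"{name}: R = {f * c}")
```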

I’ve just finished up the working week with a day-long Safety Conversations and Observations course conducted by Dr Robert Long of Human Dymensions. A good, actually very good, course with an excellent balance between the theory of risk psychology and the practicalities of successfully carrying out safety conversations. I’d recommend it to any organisation that’s seeking to take their safety culture beyond systems and paperwork. Although he’s not a great fan of engineers. :)

In a recent NRCOHSR white paper on the Deepwater Horizon explosion, Professor Andrew Hopkins of the Australian National University argued that the Transocean and BP management teams who were visiting the rig on the day of the accident failed to detect the unsafe well condition because of biases in their audit practices.

Continue Reading…

In an article published in the online magazine Spectrum, Eliza Strickland has charted the first 24 hours at Fukushima. It’s a sobering description of the difficulty of the task facing the operators in the wake of the tsunami.

Her article identified a number of specific lessons about nuclear plant design, so in this post I thought I’d look at whether more general lessons for high-consequence system design could in turn be inferred from her list.

Continue Reading…

Planes and Trains

05/07/2011 — 1 Comment

I attended the annual Rail Safety conference for 2011 earlier in the year, and one of the speakers was Group Captain Alan Clements, the Director Defence Aviation Safety and Air Force Safety. His presentation was interesting both for where the ADO is going with its aviation safety management system and for the historical perspective and statistics it provided.

Continue Reading…

Why more information does not automatically reduce risk

I recently re-read the article Risks and Riddles by Gregory Treverton on the difference between a puzzle and a mystery. Treverton’s thesis, taken up by Malcolm Gladwell in Open Secrets, is that there is a significant difference between puzzles, in which the answer hinges on a known missing piece, and mysteries, in which the answer is contingent upon information that may be ambiguous or even in conflict. Continue Reading…

Fukushima NPP, March 17

There are few purely technical problems…

The Washington Post has discovered that concerns about the vulnerability of the Fukushima Daiichi plant to potential tsunami events were brushed aside at a review of nuclear plant safety conducted in the aftermath of the Kobe earthquake. Yet at other plants, the Japanese Nuclear and Industrial Safety Agency (NISA) had directed the panel of engineers and geologists to consider tsunami events.

Continue Reading…

Why do safety-critical organisations also fail to respond to sentinel events?

Continue Reading…

The HAL effect

09/09/2009 — 2 Comments

Do we automate our cultural biases, and can this have an effect upon the safe coordination of crew and automation?

Continue Reading…