Archives For Risk


In case anyone missed it, the Ebola outbreak in West Africa has entered the explosive growth phase of the classic logistic curve; see this article from New Scientist for more details.
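For anyone hazy on the term, the "explosive growth" is the near-exponential early phase of the logistic function, which only flattens out as it approaches its ceiling. A minimal Python sketch, with purely illustrative parameter values (nothing here is fitted to the actual outbreak data):

```python
import math

def logistic(t, K=10_000, r=0.3, t0=30):
    """Logistic growth: N(t) = K / (1 + e^(-r(t - t0))).
    K is the ceiling, r the growth rate, t0 the inflection point.
    All values are illustrative, not epidemiological estimates."""
    return K / (1 + math.exp(-r * (t - t0)))

# Early on the curve looks exponential -- the explosive phase --
# and it only saturates as it nears the ceiling K.
for day in (0, 10, 20, 30, 40, 60):
    print(f"day {day:2d}: ~{logistic(day):,.0f} cases")
```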

Here in the West we get all the rhetoric about Islamic State as an existential threat, but little to nothing about the big E, even though this epidemic will undoubtedly kill more people than that bunch of crazies ever will. Ebola doesn’t hate us for who we are, but it’ll damn well kill a lot of people regardless.

Another worrying thought: the more cases there are, the more generations of the disease clock over, and the more chance there is for a much worse variant to emerge, one with global legs. We’ve been gruesomely lucky to date that Ebola is so nasty it tends to burn out before going too far, but that can change very quickly. This is a small world, and what happens inside a village in West Africa actually matters to people in London, Paris, Sydney or Moscow. Were I PM, that’s where I’d be sending our defence force, not back into the cauldron of the Middle East…

Monument to the Conquerors of Space, Moscow (Copyright)

Engineers as the agents of evolution

Continue Reading…

The igloo of uncertainty (Image source: UNEP 2010)

Ethics, uncertainty and decision making

The name of the model made me smile, but this article, The Ethics of Uncertainty by Tannert, Elvers and Jandrig, argues that where uncertainty exists, research should be considered part of an ethical approach to managing risk.

Continue Reading…

From Les Hatton, here’s how, in four easy steps:

  1. Insist on using R = F x C in your assessment. This will panic HR (people go into HR to avoid nasty things like multiplication).
  2. Put “end of universe” as risk number 1. (Rationale: R = F x C, and since the end of the universe has an infinite consequence C, then no matter how small the frequency F, the risk R is also infinite; see the sketch after this list.)
  3. Ignore all other risks as insignificant.
  4. Wait for the call from HR…
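Tongue firmly in cheek, here’s a minimal Python sketch of why step 2 works; the register entries and figures are invented for illustration and aren’t drawn from Hatton’s presentation:

```python
# Hypothetical risk register: name -> (frequency per year, consequence in $).
RISKS = {
    "end of universe": (1e-100, float("inf")),  # vanishingly rare, infinite consequence
    "server outage": (0.5, 10_000),
    "data breach": (0.05, 2_000_000),
}

def risk_score(frequency, consequence):
    """Classic risk formula: R = F x C."""
    return frequency * consequence

# Sort by descending risk: any non-zero frequency times an infinite
# consequence is infinite, so it swamps every other entry.
for name, (f, c) in sorted(RISKS.items(), key=lambda kv: -risk_score(*kv[1])):
    print(f"{name}: R = {risk_score(f, c)}")
```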

A humorous note, amongst many, in an excellent presentation on the fell effect that bureaucracies can have upon the development of safety critical systems. I would add my own small corollary: when you see warning notes on microwaves and hot water services, the risk assessment lunatics have taken over the asylum…

In June 2011 the Australian Safety Critical Systems Association (ASCSA) published a short discussion paper on what it believed to be the philosophical principles necessary to successfully guide the development of a safety critical system. The paper identified eight management and eight technical principles, but do these principles do justice to the purported purpose of the paper?

Continue Reading…

I’ve recently been reading John Downer on what he terms the Myth of Mechanical Objectivity. To summarise his argument, he points out that once the risk of an extreme event has been ‘formally’ assessed as being so low as to be acceptable, it becomes very hard for society and its institutions to justify preparing for it (Downer 2011).

Continue Reading…

Why something as simple as control stick design can break an aircrew’s situational awareness

One of the less often considered aspects of situational awareness in the cockpit is knowing what the ‘guy in the other seat’ is doing. This is a particularly important part of cockpit error management because, without a shared understanding of what someone is doing, it’s kind of difficult to detect their errors.

Continue Reading…