How to Make Bounded Decisions on Risk (Part IV)


Groupthink and thinking about technology risk

Sometimes we are asked to make decisions as part of a group, and in that social environment the dynamics of interpersonal relations may lead members to value reaching a consensus more than actually critically evaluating and testing ideas, often with disastrous consequences.

The technical term for this ‘consensus uber alles’ effect is Groupthink, which the psychologist Irving Janis formally defined as:

‘A mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.’

Irving Janis

Groupthink can be an insidious effect, as conforming with, or influencing, the opinion of a group is an inherent part of most people’s makeup. As with other forms of human error, its occurrence rate is related to the environment in which the task is performed: it’s much more likely to occur when decisions are being made within a flawed organisation (1) and/or a challenging decision context (2). Put these two performance shaping factors together and you need to be very careful about how group decisions are made.

Janis’s Groupthink model (Image source: Wikimedia Commons)

Unfortunately, decisions on technological risk are highly conducive to the Groupthink effect, as they often involve challenging judgements about complex technological questions with an inherently high level of uncertainty, and almost inevitably have an ethical component. Such decisions are also predominantly made on behalf of society by small groups of technocrats who tend to possess a highly homogeneous world view.

As I see it there are three principal ways that Groupthink can bias such technology risk decisions. First, it may accentuate the confirmation bias of group members in selecting and presenting information, leading in turn to a failure to identify safety risks (hazards). Second, where safety risks are identified, the risk component may be downplayed or ignored to make them acceptable. Finally, the group is less likely to develop contingency plans to deal with an accident should it occur, due to the combined effects of the first two factors.

So what to do about it?

The key point to note is that Groupthink is not an inevitable outcome; alternative and more constructive ways to make decisions as a group can be fostered. As a case in point, after the Bay of Pigs fiasco the Kennedy administration learned from its collective decision-making failure, and during the subsequent Cuban Missile Crisis sought to safeguard the decision-making process against the effects of Groupthink.

To start with, consider the environment in which the decisions are being made and whether it’s conducive to the appearance of Groupthink (see notes 1 & 2 below). If you then start to see the following behaviour patterns emerging in the group, be on your guard:

  • shouting down of contrarian views and shunning of members who hold such views,
  • the spiral of silence effect,
  • critical thought being done ‘in meeting’, with issues rarely taken away for reflection ‘outside’ the group,
  • a directed verdict given up front by the sponsor or leadership,
  • a suspiciously rapid convergence of opinion,
  • certain team members spending inordinate time ‘defending’ a group position from criticism, and
  • little in the way of facts being provided, or opinion being proffered as fact.

If you’re a manager and have tasked a team or group to provide you with an answer, be careful not to bias their decision, indirectly or overtly, before the fact. Try to spend less face time with the group to further reduce this effect.

If a group is split on a risk assessment, ask them to develop the competing alternatives (best and worst case scenarios, for example) and present a majority and minority report that will allow you to choose. When reviewing the report, discuss and identify key uncertainties and assumptions; oftentimes assessments are of necessity based on critical assumptions that may in turn have been unconsciously selected to support a particular outcome.

If you’re leading or involved in a group decision, rather than side-lining or marginalising those with contrarian views, use them as a critical resource for independent review. Accept the valid elements of that review and leave the rest. Involve external parties in discussion of the group’s work and, if possible, open up the risk assessment to an independent red team or murder board review.

If you’re a member of a group, ask yourself and others in the group ‘Are we going to Abilene?’ (3) to check whether you all actually believe in the decision or are simply going along for the ride on the coattails of social conformity.

Another simple challenge question is to ask yourself and others ‘Are we drinking our own bath water?’ (4) as a check on whether there is sufficient independent and convergent evidence to support the conclusions drawn.


1.  That is, an environment that is isolated, lacking impartial leadership, overly homogeneous in background and/or ideology, and lacking methodological norms.

2.  For example, time pressure, previous failures, moral or ethical dilemmas, complexity of the problem, etc.

3.  A form of Groupthink in which the members of a group decide on a collective course of action that individually none of them prefers or supports, sometimes termed the Abilene Paradox.

4.  A term that has evolved to describe a person or group who have lost perspective and started to believe their own hype.