
When a crisis happens, 'group think' and the 'denial curve' can make matters go from bad to worse. David Davies describes the psychological processes that affect crisis teams and how to combat them

No matter how many times it occurs, it is always cause for surprise when big, respected organisations make serious errors in the way that they handle a major crisis. In some cases those errors have severely damaged reputations, caused loss of life or inflicted economic damage at a national level. Examples that spring immediately to mind include product contamination and product safety (Perrier, Coca-Cola Belgium, Mercedes 'moose test'), serious contract over-run (QE2 refit), loss of life (NASA mission failures caused by O-ring and foam damage), government reaction to epidemics (UK: BSE and foot and mouth; China: initial response to the SARS virus) and major scandals (Andersen, paedophile priests). These are just a handful of examples; the list could be expanded to fill several pages with no difficulty.

It gets interesting when we move the camera in more closely and look at the performance of the people entrusted to manage the crisis. In all of these cases, and hundreds more, the pattern of error is very similar, yet the circumstances, the type of organisation and the countries involved are very different. What we are observing arises from the dynamics and pressures that are unique to crises. So what do we know about this strange environment, and why is it so different to the normal world in which the same people presumably perform with less catastrophic consequences?

We should start by defining a crisis. In this context, a crisis is an unplanned (but not necessarily unexpected) event that calls for real-time, high-level strategic decisions in circumstances where making the wrong decisions, or not responding quickly or proactively enough, could seriously harm the organisation. Because of the type of issues involved and the penalties of failure, the crisis response group, typically assembled ad hoc, usually includes board members, plus relevant technical specialists. A catastrophe that merely triggers a technical response, such as an IT resumption plan or a business continuity plan, is not a crisis.

A crisis, in this context, is characterised by:

  • the need to make major strategic decisions based on incomplete and/or unreliable information
  • massive time pressures
  • often, intense interest by the media, analysts, regulators and others
  • often, allegations by the media and others that have to be countered.

    Handling a crisis with these characteristics is usually completely outside the experience of those involved.

    In addition, there are two other characteristics that are common to the way in which crises are handled. The first, known as 'group think', was first linked to crisis handling in the enquiry into the NASA O-ring disaster. The engineers in the decision-making group were pressured by the commercial majority into agreeing to a launch that they felt could be unsafe. Group think has been defined as 'the tendency for members of a cohesive group to reach decisions without weighing all the facts, especially those contradicting the majority opinion'.

    The circumstances conducive to group think are:

  • a cohesive group
  • isolation of the group from outside influences
  • no systematic procedures for considering the pros and cons of different courses of action
  • a directive leader who explicitly favours a particular course of action
  • high stress.

    The unfortunate consequences of group think, present in the handling of so many crises, are:

  • incomplete survey of the group's objectives and alternative courses of action
  • not examining the risks of the preferred choice
  • poor/incomplete search for relevant information
  • selective bias in processing information at hand
  • not reappraising rejected alternatives
  • not developing contingency plans for the failure of actions agreed by the group.

    These outcomes are, indeed, common to the way in which many crises are handled. The only factor that is difficult to judge from the outside is the leadership. Did the crisis involve a strong leader who needed the group to reach a consensus that the scenario was much closer to best case than to worst, thus avoiding the need for a costly recall, a mission abort, or an admission of fraudulent or unethical practices? It is only because of public enquiries in a small number of cases that the dynamics within a crisis group have been revealed at all.

    The central feature of most crises is that at the outset very little information is available, and even that may be unreliable. To quote one crisis team member, "We started with very few facts - and most of those turned out to be wrong". In most crises, there are a few fundamental questions to which answers are essential if the crisis is to be handled, but which are invariably left unaddressed in the knee-jerk reactions of a traumatised crisis team:

  • CAUSE: What is the cause of the crisis? Often, even this basic factor is unknown. Is the contamination deliberate or accidental? Is it the work of an extortionist? Are the allegations against us true, exaggerated, or totally false?
  • BREADTH: How widespread could it be? Does it involve one batch or the global product, one farm, a county, or the entire country?
  • REPEATABILITY: Could it happen again? Is it an isolated incident, or is it deliberate and therefore vulnerable to repetition? Is there a design or systemic defect? Are we being targeted? Could it trigger a string of copycat attacks or hoaxes?
  • BLAME: Are we/will we be seen to be at fault, and if so, how offensive or blameworthy will our actions be seen to have been? Are our stakeholder relationships and reputational capital strong enough for us to be forgiven?

    Despite initially having no reliable answers to these questions, action has to be taken; the media may be making allegations and demanding answers, and there simply is not the luxury of waiting until all becomes clear. It is at this point that the denial curve often takes over (See Chart 1).

    Denial curve
    In crisis after crisis, the common denominator is that the team handling the crisis initially plays it down. The four key questions above may never have been systematically addressed, but the best-case answer to each question is likely to have been assumed, even if not actually expressed. At this stage the crisis handlers are merely interpreting the meagre information that they have, but they are doing it under the strong influence of trauma and denial, often with group think into the bargain.

    Next, the team tends to ignore, or at least heavily discount, any new information that contradicts the optimistic view they first formed. By now the team will be communicating their optimism to the media, perhaps presenting alternative 'no fault' or 'not widespread' theories before they have been substantiated, for example: not bad maintenance but vandalism (Jarvis/Hatfield rail disaster); not contamination of the source but cleaning fluid spilt by a cleaner into just one production line (Perrier); no design fault but a fluke (Mercedes 'moose test').

    Finally, when the weight of evidence is overwhelming, the crisis group swings into line in one of three ways. It may underplay the significance of the change of direction (Perrier). More commonly it introduces changes in a low-key fashion. Or it grossly over-compensates (foot and mouth, BSE). It may not be a coincidence that over-compensation seems to take place most often where the government or other organisation concerned will not suffer financially for the over-compensation.

    Denial is the first stage of the grieving process, a process triggered by any traumatic event. (The other stages are anger, bargaining, depression and acceptance. Anger also features in some of the ways in which crises are badly handled: anger with the media, regulators, and even customers, the very groups of people with whom impeccable relations are needed during a crisis.)

    In the words of a consultant who helped one organisation recover from crisis, "For several days the senior management was incapable of making cogent decisions, because of the shock of seeing their colleagues killed or maimed and their business destroyed". While a reputation-threatening crisis may not be as traumatic as one that involves loss of life, trauma will still be present. With it comes denial and, in the early stages, the inability to act.

    The reputation damage from underplaying the crisis and avoiding responsibility in the early stages can be considerable. The organisation appears to be arrogant, uncaring, and unsympathetic. Where there is loss of life, relatives with their own emotional and psychological needs are given a bureaucratic response. They want sympathy, facts, expressions of regret and apologies, but often get a buttoned-up lawyer reading out a formal press statement. Where denial delays or minimises remedial action, the tragic consequences can be loss of life (both NASA mission failures) or massive costs. In the enquiry into the UK foot and mouth crisis, one scientist estimated that the three-day delay in introducing a national ban resulted in the scale of the disease being between two and three times as great. The crisis cost the UK economy £8bn.

    Combating denial
    How can these unfortunate, very human, traits be avoided? Crisis management planning alone will not help: in crisis simulations where realistic pressure is applied, plans are often ignored in the heat of the moment. Crisis management plans can be valuable, but they do not address the human reaction to the type of crisis described here. Three things do, however, make a difference:

  • CRISIS EXPERIENCE: Experience of handling a full-blown crisis can be invaluable. As such crises are comparatively rare, simulations can be used.
  • CHOICE OF LEADER: One of the salutary lessons of 9/11 is that some splendid leaders in a 'normal' situation lost their leadership qualities in the trauma of the event, whereas others not identified as natural leaders coped admirably. In the absence of a genuine crisis, only a crisis simulation will identify the best crisis leaders. It is important that the leader is able to take a holistic view, embracing reputation, corporate values and stakeholder imperatives, rather than the more blinkered view that might possibly be imposed by, for example, a finance director or a corporate lawyer.
  • STRUCTURED REAPPRAISAL: A process that imposes a structured reappraisal of the key questions (those outlined above, varied according to the nature of the crisis) each time new key information is received is important. Of course, such an approach will only succeed if the crisis group is encouraged, or led by example, to seek unbiased decisions, and not allowed or encouraged to perpetuate its belief in a cause-severity-blame scenario that is commercially desirable, but less than likely.
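    To make the idea of a structured reappraisal concrete, the sketch below shows one way the discipline could be enforced in software. It is an illustration only, not a description of any particular product or plan; the question labels, the 0-to-1 likelihood scale and the class names are assumptions made for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# The four key questions every reappraisal must revisit (from the article).
KEY_QUESTIONS = ("cause", "breadth", "repeatability", "blame")

@dataclass
class Appraisal:
    """One judgement on one key question, kept with its justification."""
    question: str        # one of KEY_QUESTIONS
    scenario: str        # e.g. "single batch" vs "global product"
    likelihood: float    # team's estimate: 0.0 (ruled out) to 1.0 (certain)
    justification: str   # why the team believes this, given the new information
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ReappraisalLog:
    """Forces every key question to be re-scored whenever key information arrives."""

    def __init__(self) -> None:
        self.history: list[tuple[str, Appraisal]] = []   # (triggering information, appraisal)

    def reappraise(self, new_information: str, appraisals: dict[str, Appraisal]) -> None:
        missing = [q for q in KEY_QUESTIONS if q not in appraisals]
        if missing:
            # The discipline is the point: no question may be quietly skipped.
            raise ValueError(f"Not reappraised for: {missing}")
        for appraisal in appraisals.values():
            self.history.append((new_information, appraisal))
```

    The value is not in the code but in the discipline it imposes: each new piece of key information produces a fresh, recorded answer to all four questions, which is exactly the step a group in denial tends to skip.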

    In addition to crisis simulations, what other steps can organisations take to prepare? Crisis management plans are useful for handling the logistics of a crisis, but very few organisations have crisis management plans of any quality (as opposed to business continuity plans), and even good plans tend to concentrate on crisis logistics rather than on the psychological behaviour of the crisis team.

    There is, however, a new type of process, Dynamic Crisis Management, that I believe responds to the denial syndrome. It applies particularly to those types of crisis (the majority) where the key facts surrounding the crisis are unknown at the outset. It supplements and complements a crisis management plan, producing a modular approach, and, if there is no plan, makes a significant contribution in its absence to the performance of the crisis management team, particularly in overcoming the denial curve and group think. It is capable of being run even with no pre-planning, training or preparation: not a desirable situation but, regrettably, a realistic one. Chart 2 illustrates this modular approach to crisis management.

    Initially, the process was developed as a paper-based collection of forms and instructions, so that it could be run without a computer. This initial version had the capacity to:

  • Support a brainstorming session that would establish or fine-tune the scenario alternatives for the four key questions outlined above (cause, breadth, repeatability, blame)
  • Provide a process for readdressing and monitoring the likelihood of the scenarios within the four key questions, each time new critical information was received
  • Track the allegations of the media, grouped if necessary (for example, the national quality press, international press, national tabloids), and the opinions of one or more stakeholder groups (customers, employees, shareholders), displaying these on a series of time lines so that movement in opinion could be tracked, extrapolated and pre-empted.
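    As a rough illustration of the third capability, a time line of media allegations or stakeholder opinion could be kept in a structure like the one sketched below. The group names, the sentiment scale and the crude trend calculation are assumptions for the example, not a description of the actual forms.

```python
from collections import defaultdict
from datetime import date

class OpinionTimeline:
    """Dated observations per group (e.g. 'national tabloids', 'customers'),
    so that movement in opinion can be tracked and extrapolated."""

    def __init__(self) -> None:
        # group name -> list of (date, sentiment, note), sentiment in [-1.0, +1.0]
        self.series = defaultdict(list)

    def record(self, group: str, day: date, sentiment: float, note: str = "") -> None:
        self.series[group].append((day, sentiment, note))

    def trend(self, group: str) -> float:
        """Crude direction of travel: latest sentiment minus the earliest."""
        points = sorted(self.series[group])
        if len(points) < 2:
            return 0.0
        return points[-1][1] - points[0][1]

# Example: tabloid coverage hardening over three days
timeline = OpinionTimeline()
timeline.record("national tabloids", date(2003, 5, 1), -0.2, "speculative piece")
timeline.record("national tabloids", date(2003, 5, 3), -0.7, "front-page allegation")
print(timeline.trend("national tabloids"))   # roughly -0.5: opinion moving against us
```

    Plotting several such series side by side is what allows movement in opinion to be extrapolated and pre-empted rather than discovered after the event.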

    It became evident, however, from observing the way in which crises are handled, that, in addition to the denial curve and group think, there was a third problem - poor record keeping. This was highlighted in the official enquiry into the 2001 UK foot and mouth crisis, which criticised the poor record keeping and confused accounts of events. 'While some policy decisions were recorded with commendable clarity, some of the most important ones taken during the outbreak were recorded in the most perfunctory way, and sometimes not at all.'1) Even in the trauma of simulated crises, basic management skills, such as delegation and orderly record keeping, can quickly disappear. Completely reliable information tends to become confused with the speculative. The reasons for decisions, the assumptions on which they are based, and the actions that they trigger, go unrecorded.

    This suggested that a process was needed to impose a structure on record keeping during the crisis, and to do this as unobtrusively and transparently as possible. Therefore the paper-based process of Dynamic Crisis Management was converted into a computerised version, incorporating automated record keeping. This allows information to be recorded in a structured way, and also provides a clear method for recalling, sorting and searching all of the known information, the outstanding unknowns, the decisions, the assumptions and, most importantly, the links between them.
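    A minimal sketch of what such structured, linked record keeping could look like is given below. The record types, field names and example entries are my own illustration of the idea, not the actual computerised tool.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from itertools import count

_ids = count(1)

@dataclass
class Record:
    """A single crisis-log entry: a piece of information, an assumption,
    a decision or an action, linked to the records it was based on."""
    kind: str                    # "information" | "assumption" | "decision" | "action"
    text: str
    based_on: list = field(default_factory=list)   # ids of supporting records
    record_id: int = field(default_factory=lambda: next(_ids))
    logged_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class CrisisLog:
    def __init__(self) -> None:
        self.records: dict[int, Record] = {}

    def add(self, kind: str, text: str, based_on=()) -> int:
        rec = Record(kind, text, list(based_on))
        self.records[rec.record_id] = rec
        return rec.record_id

    def basis_of(self, record_id: int) -> list:
        """Why was this decision taken? Follow the links back."""
        return [self.records[i] for i in self.records[record_id].based_on]

# A decision and the assumption it rests on remain connected:
log = CrisisLog()
info = log.add("information", "Contamination confirmed in batch 714")
assumption = log.add("assumption", "Only one production line affected", [info])
decision = log.add("decision", "Recall batch 714 only", [assumption])
print([r.text for r in log.basis_of(decision)])
```

    Because every decision points back to the information and assumptions it rests on, the record that enquiries so often find missing is produced as a by-product of running the crisis rather than reconstructed afterwards.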

    Pressurised teams lose the ability to view the complete picture, focusing only on what is most pressing. The ability to pull the camera back or to undertake a structured interrogation of all of the facts and issues can be invaluable.

    However, the key success factor is the behaviour of the crisis team leader. There can be advantages in using an experienced external facilitator who will have specific skills, as well as being distanced from both the emotions of the event and the hidden commercial and personal objectives that are so often imposed on the rest of the team.

    Whether imposed by the team leader or by a process, the choice of openness and realism over denial will depend upon the extent to which the crisis team accepts that denial will, in the long run, cause significant harm to reputation. This is clearly demonstrated by the list of casualties among organisations that tried the denial route. Had the techniques outlined become universal a decade ago, companies would have avoided huge losses in share and brand value, the UK would have avoided massive damage to its economy, and many lives would have been saved.

    1) Anderson report, July 2002

    David Davies is managing director of Davies Business Risk Consulting Ltd and head of the reputation risk and crisis management stream of idRisk, a network of independent risk consultants. E-mail: ddavies@dbrc.co.uk, www.dbrc.co.uk, www.idRisk.com