It is vital that risk managers understand how people think; how ‘head logic’ and gut instinct can work against each other. Nathan Skinner delves into the psychology of risk
Understanding how people make decisions is an important part of risk management. Risk managers are often required to convince people to think differently about risks. The problem is that human behaviour is complex, sometimes unfathomably so. People regularly misinterpret risks, and sometimes they interpret them correctly but take huge gambles anyway, usually because for them the rewards are worth it. Not only does this depend on an individual's personality - are they a risk taker or not? - but also on their interpretation of the external environment. One person's idea of a reckless risk not worth taking is another's idea of fun. I'd never consider single-handedly trekking to the North Pole, for instance. For someone else, though - a thrill-seeking explorer with a penchant for polar bears and 24-hour daylight - it would be a reasonable undertaking. It all depends on where you're coming from.
A deeper understanding of what drives behaviour is particularly useful in the insurance industry. That's why Lloyd's of London, the insurance market, recently produced a report looking into how subconscious thought can cloud people's judgements. The hope is that, with a better understanding of these issues, its underwriters will be able to make better business decisions. “Behavioural theory is very relevant to insurers but many of the issues raised in our report are relevant to businesses of any sort,” says Trevor Maynard, head of emerging risks at Lloyd's, who was involved in the research. “It is all about people, individuals and groups and the kind of biases and cognitive issues that we all face, being at heart animals as well as thinking beings.”
He continues: “We came to the conclusion that behaviour is driven by perception of risk. There are some risks that people really fear or dread, and they will rate those higher than they really are. Things that loom large, like earthquakes or asteroids - people fear them irrationally even though there is only a slim chance of them happening.”
The psychology of risk
It does not come without its critics, but Dan Gardner's book “Risk: The Science and Politics of Fear” provides a helpful illustration of how the mind works, and of why people jump to some of these irrational decisions. His main point is that when it comes to risk we are of two minds. Drawing on the work of several leading psychologists, sociologists and behavioural scientists, including Paul Slovic, Gardner explains how the brain is split into two competing systems, which he calls “Head” and “Gut”. Head describes the logical reasoning we apply to decision making. Gut refers to the inbuilt animal instinct that we all possess. Head's reasoning improves over time as we learn. Gut instinct is much stronger because it was embedded millions of years ago; its roots go back to our early ancestors, who evolved as hunter-gatherers on the plains of Africa. One of these mindsets is suited to the modern, 24-hour, media-saturated world we live in today. The other proved far more useful in the harsh Africa of the Australopithecines.
Don't forget that in the entire history of the human species, modern-day civilisation - say, the last century - is a tiny microsecond (scientists believe we branched off from apes between five and seven million years ago). Gut instincts are still very useful, teaching us to recoil from pain and vicious animals, but in a busy modern world where we have to deal with multiple complex messages they sometimes lead to confusion. In most cases Head's logical reasoning is able to balance out Gut's knee-jerk reaction. But sometimes the emotion is too strong.
Research shows that dread risks are among the most powerful influencers: risks that people fear, such as terrorism, cancer and nuclear power, are given a very high priority by Gut. Culture, and in particular the media, plays a very powerful role. For instance, opposition to nuclear power is much stronger in Britain and the US than in France, where nuclear power is more widely used. Despite the fact that modern nuclear power stations are very safe, the public, at least in the US and Britain, is still haunted by the image of the Chernobyl disaster, and the overriding culture, reinforced by the media, is anti-nuclear.
Another example which helps explain how Gut still exerts significant influence over our behaviour is people's response to the September 11 terrorist atrocities. Gut evolved to assume that if we saw something with our own eyes, or could vividly imagine it happening, it was probably more likely to happen. Unfortunately, we live in a TV and media-saturated world; we hear horror stories on the news daily and sometimes we are not good at distancing ourselves from them. In the case of September 11, research has shown that the blanket media coverage of the terror attacks led to a decline in air travel in America, because people were worried that their plane would be hijacked. This encouraged people to take to their cars, and the number of fatalities on American roads after September 11 soared. In fact, research showed that 1,595 more people died as a result of the fear of flying after September 11. That is more than half the number of people who actually died in the terror attacks.
David Hancock, group risk manager for London Underground, is a strong advocate of behavioural risk management. He has worked on many large-scale projects and partnerships, including Heathrow's Terminal 5 when he headed risk at infrastructure firm Halcrow, and has used techniques to engage with people and change the way they approach risk.
“Behaviour is the hardest thing to change,” he says. “It is about hard work and education. You have to take time to sit down and explain to people how we can make this better. But that's the hard part.” It is very difficult to explain to a business owner that what they're doing is wrong, he says. “Understanding behaviour is about understanding people.”
Maynard agrees: “Changing behaviour in the organisation is very difficult. An example has to be set by senior management. If they rate something highly themselves then the rest of the organisation is more likely to do so. The converse of that is that if the actions of the senior team do not back up what you're saying then you will not succeed in changing the culture.”
Both agree that personality is a key factor in people's ability to judge risks. Venturesomeness has been identified as a personality trait that leads to a reduced perception of risk, which could explain why hardy explorers are able to boldly go where others dare not. For most people, the less familiar they are with something, the more they rate it as a risk. “There's a flip side to that,” adds Maynard. “You should remain careful about things you have become familiar with. If you become over-familiar with something you can stop regarding it as a risk and that can be quite dangerous.” Knowing what type of person they are dealing with is very relevant for risk managers, particularly if they are trying to convince people to treat risks differently.
Emotion is another strong driver of risk perception, worth remembering when communicating risk. Educating people about risks is very important and it can help people adopt the right attitude, but trying to convince them with statistics alone won’t work. “You would think people would listen to the facts and they would agree but the reality is images and emotive language are much more important,” says Maynard. “You can use scenarios to challenge established views about risk,” says Hancock. They are a useful way of putting events into context and making them seem real. He often demonstrates risks using real-life examples to get his point across.
People will change their behaviour depending on how they feel about a risk. If they feel comfortable that it won't affect them, they'll probably adopt a more relaxed approach to it. If a person knows a risk is insured - in other words, the consequences of it going wrong will land in someone else's lap - they tend to treat it differently. “Insurers call this moral hazard,” adds Maynard. “It is something we are very aware of and we try and design policies which encourage the other party to stay engaged with risk management, like deductibles.”
If the anticipated benefits of taking a risk are high, some people are more likely to rank the risk lower and push ahead, either through self-interest or foolhardiness, even if the evidence points the other way. This can make the risk manager’s job harder, particularly if they are cautious about a particular strategy or investment. Faced with this, risk managers can actively seek out disconfirming evidence to try to restore a more objective balance, but they need to be convincing.
Gender, ethnic background, religious viewpoint and social class have all been shown to affect risk perception. Diversity in the workplace, and actively encouraging alternative opinions, can help mitigate some of these potential biases. Rotating people on and off committees is a good way to seek fresh ideas and approaches, says Maynard, who also advocates the Six Thinking Hats technique, an approach well-trialled by consultancies that purports to help groups think together more effectively.
What does this mean for risk managers in their efforts to weigh, judge and sometimes rein in the behaviour of people within their organisations? An awareness that perception of risk drives behaviour is a good place to start. After that it's a question of understanding what influences people's perception of risk and recognising that biases exist. If risk managers understand what affects people's behaviour - what makes them do certain things or take certain risks - then they will be able to give some meaningful advice. Trying to enforce rules with no explanation or reasoning could easily backfire, because if people don't understand why they are being told not to do something they will do it anyway. But with some careful explaining, risk managers may be able to provide useful advice that will help organisations and their people approach risk differently, or deal with the consequences more effectively.