Fear, uncertainty and doubt: it sells papers and boosts viewing figures, but what are the consequences of a misperception of risk?

It’s a scary world out there: muggers at the bus stop, terrorists on the Underground. And that person coughing in accounts, do they have swine flu? It’s all worrying stuff. But just how dangerous is everyday life really?

We live in a society high on fear but with almost no understanding of risk. Stirred up by a sensationalist media that plays on deep-seated human anxieties, many people respond to their fears in highly irrational ways. And that can put business in a very tricky position, especially those professionals whose job it is to assess risk and decide how to deal with it.

“Most people just don’t react rationally to facts,” says British Airways head of risk Philip Osmond. “For example, news of an airliner crash can heighten fears of air travel, even though, in reality, people are far, far safer in the air than in their cars.”

And recent events have shown just how lethal an irrational response to fear can be.

After watching the horror of the 9/11 terrorist attacks unfold on television, many Americans sharply reduced their air travel. This led to increased car use and, with it, more road traffic accidents. As a result, an estimated 1,500 additional people died on US roads trying to avoid the fate of the passengers killed in a devastating – but so far unique – attack, according to a study published in 2006.

“I don’t know anyone who is frightened of getting in a car, but it’s actually bloody dangerous,” Osmond says. “Yet there is this common feeling that flying is unnatural; that hurtling along in an airborne metal tube at 38,000 feet and 600 miles per hour doesn’t feel right. In the same way, hurtling along at 80 miles per hour with controlled explosions going on under the bonnet doesn’t make much sense either. But people just don’t think about that.”
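Osmond’s comparison is easy to sanity-check with some back-of-the-envelope arithmetic. The short Python sketch below uses illustrative per-distance fatality rates of the rough order reported in UK transport statistics – the exact figures are assumptions made for the sake of the example, not numbers taken from this article:

# Back-of-the-envelope comparison of car and air travel fatality rates.
# The rates below are illustrative assumptions of the rough order found
# in UK Department for Transport statistics; they vary by year and source.

CAR_DEATHS_PER_BILLION_KM = 3.1    # assumed: car occupants
AIR_DEATHS_PER_BILLION_KM = 0.05   # assumed: commercial passenger flights

ratio = CAR_DEATHS_PER_BILLION_KM / AIR_DEATHS_PER_BILLION_KM
print(f"Per kilometre travelled, driving is roughly {ratio:.0f} times "
      "as likely to kill you as flying.")

On these assumed figures the gap is around sixty-fold – the kind of margin Osmond has in mind when he calls cars “bloody dangerous”.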

The effects of risk avoidance

Irrational fear is a particular problem in situations where the consequence of a negative outcome is so terrible that people are reluctant to tolerate any level of risk whatsoever.

A good example is the MMR scare in the UK. In 1998, Dr Andrew Wakefield published a study in The Lancet suggesting a link between autism and the combined vaccination for measles, mumps and rubella. This was widely reported in the media and, as a result, vaccination rates plummeted, leading to a rise in measles. Despite the study subsequently being utterly refuted – the General Medical Council ruled that he had acted “dishonestly and irresponsibly” in conducting his research, though Dr Wakefield maintains the allegations against him are “unfounded and unjust” – many people are still unwilling to give their children the vaccination.

“People are just not willing to take a risk when the consequences – in this case their child becoming autistic – are too awful for them to contemplate,” says head of risk for London Underground, David Hancock.

In circumstances like this, companies can find themselves having to be seen to respond to public fears, even if rational analysis would perhaps demand a different reaction. “We see this all the time on the railways,” Hancock says. “Statistically, train travel is very, very safe, but because of the fatalities that have occurred in the rare instances of a crash, we endeavour to make the public aware of the huge amount of money we spend on safety. Perception is everything. Risks don’t actually exist – hazards exist. What risk managers do is just future gazing. In that sense, perception is actually the ‘truth’.”

And it isn’t just in response to life-threatening events that risk perception can distort reality and shape the future – it also influences how employees work in more subtle ways. “Concepts of risk can be self-fulfilling,” Hancock says. “If you imagine that you won’t manage a project well, the likelihood is, you won’t.”

Going on gut reaction rather than facts can also cause employees to downplay real risks. Factory floor workers may not be as worried as they should be about safety, because they are constantly exposed to a risk and become complacent, or because they are influenced by media stories portraying health and safety as a waste of time.

The picture is further complicated because many companies depend on a degree of risk as a way of gaining a competitive advantage. “Often you have to break away from the herd, and that always involves taking risks,” Hancock says. “Certain industries, such as banking, depend on it, and that has to be managed.”

Other sectors, like education, face the opposite problem: they are expected to operate on something as close as possible to zero risk, despite this being almost impossible to achieve. “We see bizarre situations, like parents bringing their children to school on a motorbike and then expecting teachers to keep them safe indoors and not even allow them to play outdoors,” Hancock says.

Learning to live with risk

It would seem that, across the board, human beings are predisposed to irrational reactions to risk, placing many companies in an awkward situation. But there may be a straightforward solution.

“If you read a typical psychological explanation of why this happens, it will argue that there is something wrong with people’s brains,” says Professor Gerd Gigerenzer, director of the Max Planck Institute for Human Development and the Harding Centre for Risk Literacy. “But I don’t think this is true. There is now a whole industry in the form of the media that has set out to deliberately distort and create misunderstanding. We have to educate people to deal with this. That is the key to helping them understand risk.”

According to Gigerenzer, the way information – particularly statistics – is presented is often deliberately confusing. For example, the results of a survey may reveal that the risk of death or injury associated with a particular activity has risen from 1:1,000 to 2:1,000 – in absolute terms, one extra case per 1,000. Not so dramatic, perhaps. But phrased another way – in terms of relative risk – that is a 100% increase. That’s the way to grab headlines and shift your agenda or product from a scientific journal to the mainstream.
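The arithmetic behind the two framings is trivial, which is exactly why the choice of framing matters. A minimal sketch, using the 1:1,000 and 2:1,000 figures from Gigerenzer’s example:

# The same change in risk, framed two ways (Gigerenzer's example).

baseline = 1 / 1000    # risk before: 1 in 1,000
after    = 2 / 1000    # risk after:  2 in 1,000

absolute_increase = after - baseline               # 1 extra case per 1,000
relative_increase = (after - baseline) / baseline  # a "100% increase"

print(f"Absolute: {absolute_increase * 1000:.0f} extra case per 1,000 people")
print(f"Relative: a {relative_increase:.0%} increase")

Both lines describe exactly the same change; only the relative framing produces a headline.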

“We live in a society where we have the right to clean water, but not clean information,” Gigerenzer says. “There is an example of a study into the health risks associated with third-generation contraceptive pills that was skewed and created huge fear. Women in the UK stopped using them and there were around 14,000 more abortions that year. We cannot trust our media, and there is no society without trust. We need to teach people how to be more risk literate.”

Hancock agrees that education is key to helping people better understand and think through fear. “The problem is partly that people are reacting emotionally,” he says, “but more importantly they are reacting in this way because they don’t have the tools to make an informed decision. We need to change this. Risk is seen as a controlling thing by both businesses and individuals, but I think a proper understanding of it can be very empowering.”

Gigerenzer agrees. “We need to educate a new generation to deal with the modern world,” he says. “We need to teach them how to live with uncertainties and not look for certainties where they don’t exist. There will be more surprises this century, and people need to know what questions to ask – whatever comes along.” SR

Top 10 factors affecting risk perception

1. The media

The media can increase the sense of threat, and decide what we should be worrying about. Foreign criminals, teenage gangs, and avian flu are treated differently in different news outlets.

2. How risk is explained

The statistical tools used to explain data in scientific journals can influence how it is interpreted, and how the public and media react to it.

3. Personal experience

If an individual has had negative experiences, they are much more likely to expect those things to happen to them again.

4. Entertainment

The success of particular films – like disaster movies – can influence how people perceive the risk of certain activities, such as air travel.

5. How you see the world

How you perceive risk is shaped by your views. For example, a left-wing person is unlikely to view industrial action as a ‘risk’ in the same way as a more right-wing person. A success-driven person will be more afraid of failure than someone more laid-back.

6. Familiarity of the risk

The more often you safely experience a ‘risky’ activity, the less likely you are to fear it. To the uninitiated, rock-climbing may look dangerous, but it will seem much less so to the seasoned climber.

7. Necessity of the risk

If calculated risk is part of your day-to-day world – as it is for a trapeze artist or a stock market trader – you are much less likely to be frightened of the consequences.

8. Recent events

A commuter stepping onto the Underground the day after a terrorist attack will probably see another attack as more likely than they did the day before the bombing.

9. Education

The better able an individual is to analyse a situation, the more rational their response to any risk will be.

10. The scale of the risk

Individuals are more likely to overreact to risks – however unlikely – if the consequences of a negative outcome are unbearable to them.

The millennium bug that didn’t bite

At the close of 31 December 1999, our world was due to end. As the clocks struck midnight, the great silicon meltdown would begin as computers across the world, unable to cope with the new date, flickered and went quiet, taking with them our civilisation’s life support systems: health, medicine, power, water, transport.

We all knew that the ‘Millennium Bug’ had the power to do this because we had read endless articles about it in the run-up to New Year’s Eve 1999 – not just in the sensationalist tabloids, but in the pages of the serious press too. Yet, despite all the warnings, when dawn broke on 1 January 2000 it revealed a world with a bit of a hangover, for sure, but otherwise pretty much fine.

The Millennium Bug is one of the great irrational risk reactions of our time. From a single paragraph in a 1993 issue of the Canadian Financial Post, headlined ‘Turn of century poses a computer problem’, to a 1999 headline in London’s Evening Standard – ‘Life-saving hospital equipment and 999 services in London face total breakdown on January 1’ – the media stirred a self-perpetuating storm of hysteria.

The first sources were computer scientists who thought there might be a problem but – crucially – had no idea how serious it would be. Pretty soon, though, the real risk posed by the Bug was lost as business, IT and the media abandoned the facts and hyped up the fear. In the end, hundreds of millions of dollars were wasted preparing for a disaster that was never actually going to happen.

Nuclear safety scale

01: Anomaly: variation from permitted procedures.

02: Incident: incident with potential safety consequences on-site. Insignificant release of radioactivity off-site.

03: Serious incident: very small release; public exposure at a fraction of prescribed limits. Local protective measures unlikely.

04: Accident without significant off-site risk: minor release; public exposure of the order of prescribed limits. Local protective measures unlikely except for some food controls.

05: Accident with off-site risks: limited release of radioactivity. Partial implementation of local countermeasures.

06: Serious accident: significant release of radioactivity. Full implementation of local countermeasures.

07: Major accident: major release of radioactivity. Widespread health and environmental effects.