Donella Meadows, who died in 2001, was a systems analyst, a believer in sustainability and a popular columnist, best known as the lead author of The Limits to Growth. At a NAFTA meeting in the early nineties, horrified by the incipient creation of a complex system whose behaviour was merely being guessed at, she spontaneously came up with the ideas that were to become Places to Intervene in a System.

Less a recipe than a stimulus for thought, her essay identifies and ranks 12 leverage points at which the behaviour of a complex system might be altered - places where a small change in one thing can produce big changes in everything. As Meadows points out, such leverage points are nothing new; they even reside deep in our favourite myths: the secret password or the magic ring that provides the key to solving a problem. Furthermore, says Meadows, people have an instinctive feel for where these leverage points lie, but are often to be found pushing at them in the wrong direction - believing, for example, that higher spending will solve a problem when the opposite may be the case.

How is her thinking important to the risk manager? Firstly, the risk manager's raison d'être is to modify the behaviour of complex systems in such a way as to minimise risk and promote opportunity. Secondly, it is likely to be at the leverage points - call them sensitive spots - that risks or opportunities congregate. Meadows' thinking may not solve the risk manager's problems directly, but it may provide a stimulus towards identifying where they are likely to lie. Here, then, is my gloss on Meadows' 12 places to intervene:

12. Constants, parameters and numbers

Interestingly, Meadows cites these as the least important of her intervention points, while recognising that perhaps 99% of our attention goes to adjusting them. 'Diddling with the details', she calls it. Her point is that although people care deeply about numbers and parameters, changing them very rarely changes behaviour, and a chronically stagnant system is unlikely to be revived simply by focusing on the numbers. Shaving marginal costs here and there, even setting boundaries, limits and targets, can only get you so far. The system as a whole does not change.

11. The size of buffers and other stabilising stocks relative to their flow

On the one hand, a large buffer or reserve gives a system resilience and smooths out fluctuations. On the other, it leads to inflexibility and inefficiency, and is itself a possible danger point. A warehouse full of product can shift from being a reassurance against supply-chain disruption to being a major risk if there is a sudden drop in demand. Equally, insufficient duplication of key skills within a critical function may suddenly become a problem in the face of an epidemic or other event leading to the loss of staff. Buffers are important, but they rank low on Meadows' scale because changing them is usually a slow, and often costly, process.
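
Meadows' ratio of stock to flow can be made concrete with a little arithmetic. The sketch below is mine, not hers, and uses invented figures: 'days of cover' - the stock divided by the daily outflow - turns the same warehouse from a comfort into an overhang the moment demand slumps.

```python
# A minimal sketch of buffer-to-flow arithmetic, using invented figures.
# "Days of cover" is simply the stock divided by the daily outflow.

def days_of_cover(stock_units: float, daily_demand: float) -> float:
    """How long the buffer would last if inflow stopped."""
    return stock_units / daily_demand

stock = 12_000           # units sitting in the warehouse (hypothetical)
normal_demand = 400      # units shipped per day
slumped_demand = 100     # the same warehouse after a sudden drop in demand

print(days_of_cover(stock, normal_demand))   # 30.0 days: a reassuring buffer
print(days_of_cover(stock, slumped_demand))  # 120.0 days: capital locked up in stock
```

The numbers are trivial, but the ratio is the point: the buffer's meaning depends entirely on the flow it sits against.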

10. The structure of material stocks and flows and nodes of intersection

Infrastructure, in a word. The infrastructure may be physical and entirely outside an organisation's control, such as the road network on which delivery depends, or less tangible but equally outside control, such as the demographics behind an ageing workforce. Tenth place on the list reflects the difficulty of intervening in this aspect of a system, simply because intervention so often requires expensive rebuilding from scratch. That is why a clunky IT system or London's ageing Underground may just have to be lived with. The key leverage point, as Meadows notes, is to get the design right in the first place.

9. The lengths of delays relative to the rate of system changes

Plenty of risks cluster around this point, the most obvious of which follow from the inability of a system to change as fast as those pushing the levers may wish it to. The lag may be purely material, such as the time taken for a new plant to be built, or less tangible, such as the time taken for cultural change in an organisation to become established. One crucial area for the risk manager concerns information flow. Is feedback about the state of the system reaching decision-makers fast enough for decisions to be based on accurate information? Or is it already out of date? Meadows cites this intervention point as an example of where people push levers in the wrong direction, attempting to make the system respond faster rather than slowing the rate of change and allowing the system to catch up. Again, she says, intervention here is not easy, although it can have critical effects.
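
The danger of acting on stale information can be shown with a toy model - again mine, not Meadows', with invented numbers. A controller steers a stock towards a target but sees the stock as it was several steps ago; lengthen that delay and a system which would otherwise settle smoothly instead overshoots and oscillates.

```python
# A minimal sketch (invented, not from Meadows' essay) of delay in a
# feedback loop: corrective action is based on information `delay` steps old.

def simulate(delay: int, steps: int = 30, gain: float = 0.5) -> list[float]:
    target = 100.0
    stock = 0.0
    history = [stock]
    for _ in range(steps):
        # decision-makers see the stock as it was `delay` steps ago
        perceived = history[-delay - 1] if len(history) > delay else history[0]
        stock += gain * (target - perceived)   # corrective action
        history.append(stock)
    return history

print([round(x) for x in simulate(delay=0)])  # settles smoothly on 100
print([round(x) for x in simulate(delay=4)])  # overshoots, then oscillates
```

Nothing about the corrective rule changes between the two runs; only the age of the information does.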

8. The strength of negative feedback loops, relative to the impacts they are trying to correct against

A crucial sensitive spot for the risk manager, since at this intervention point lie all those controls designed to prevent or manage risks which might otherwise spin into chaos. The emphasis lies on the second part of the sentence, which effectively asks whether the controls are strong enough. An example would be a crisis management plan which exists but is not kept up to date, or is left too long untested. Relative to the impact a crisis might have on the business, the weakness of such a negative feedback loop becomes a critical point for intervention. Many potential risks can cluster at this intervention point, ranging from internal fraud (is the whistle-blowing policy effective?) to matters such as safety (is there an adequate safety culture?) or reputation risk. This, too, is the area where many governance regulations are designed to provide negative feedback loops. Are they properly implemented? Are they working?

7. The gain around driving positive feedback loops

This is the sensitive spot where intervening in a system can produce opportunities or benefits. There are also, of course, positive feedback loops which can rapidly lead to disaster if left unchecked; in mechanical systems this is their most familiar aspect. The business equivalent might be an inept PR response to negative media comment - it makes things worse, the story is picked up more widely, and things become worse still, until what was once a cloud on the horizon becomes a full-blown tornado. It is crucial to look at the many things which might spin out of control in this way, and to identify the corresponding negative feedback which can slow them down before they reach crisis point.

But positive feedback loops also exist in a 'success to the successful' aspect, where an intervention brings benefits, which in turn bring more benefits. A successful health or rehabilitation policy may not simply be a negative feedback loop to counter spiralling absenteeism; it may become a positive feedback loop, where a healthier workforce leads to greater productivity, to improved morale, to better long-term staff retention, and to a better reputation with stakeholders. The same kind of gain could be sought in numerous areas, from accessing the creativity of personnel to promoting good corporate social responsibility.
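
The notion of 'gain' can be reduced to a single number in a toy model (once more mine, with invented figures, not Meadows'): each time round the loop, the effect re-enters the system multiplied by the gain. Above 1, the loop compounds, for better or worse; below 1, it dies away.

```python
# A minimal sketch of loop gain, using invented figures: each cycle the
# effect feeds back into the system multiplied by the gain.

def run_loop(initial: float, gain: float, cycles: int) -> list[float]:
    level, trace = initial, []
    for _ in range(cycles):
        level *= gain          # the feedback: output re-enters as input
        trace.append(round(level, 1))
    return trace

print(run_loop(10, 1.3, 8))  # gain 1.3: snowballs from 13.0 to ~81.6
print(run_loop(10, 0.7, 8))  # gain 0.7: fades from 7.0 to ~0.6
```

Whether the snowball is a virtuous circle or a media storm depends only on what is flowing round the loop; the arithmetic is the same.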

6. The structure of information flows

Meadows cites the example of identical houses, some of which had electricity meters in the basement and some in the hallway. Electricity use in the houses where the meter was always visible was 30% lower than in the others. Is information being provided in the right way to the right people at the right time? Properly used levers here can directly change behaviour - drink-drive campaigns are an example.

Indeed, so powerful are the levers here that we often see struggles arising around them at a political level. Look at the Freedom of Information Act, or the UK Government's proposed ID cards, to see how strongly people can feel about where and how information is provided. For the risk manager, successfully intervening in the system here is potentially one of the fastest and most productive ways of changing behaviour. But as always, there is the risk of pushing levers in the wrong direction and achieving changes directly opposite to those intended.

5. The rules of the system

All systems have rules. Some of them, such as the second law of thermodynamics, are absolute. Others can be changed, often with startling effect. Two swift changes to the rules of the Soviet Union (perestroika and glasnost) and a whole system of government disintegrated. Should risk management seek to intervene here? It is arguable that what we think of as 'silo mentality' is the effect of unspoken but inflexible rules. Who has made them? Are they beneficial? If we seek to change them (perhaps by altering the structure of information flows), is the effect predictable? Rules can be changed from the outside too. Consider how public opinion is changing the rules about the importance of the bottom line: making huge profits at the expense of people or the environment is no longer acceptable. Part of risk management is thinking not only about how rules might be changed, but also about how they may be changed from the outside.

4. The power to add, change, evolve, or self-organise system structure

The power of this point is best illustrated by quoting Meadows directly: 'Insistence on a single culture shuts down learning. Cuts back resilience. Any system, biological, economic or social that becomes so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on this highly variable planet.' In business terms, I would paraphrase this by saying that any organisation which entirely loses its appetite for risk is doomed to extinction by its competitors. This intervention point directly challenges the creativity of the organisation, and this is where risk management has to ask itself whether it is contributing to a loss of the power to evolve or initiate, or whether it is stimulating it.

3. The goals of the system

Not low-level goals such as managing risk (say) within acceptable parameters, but what the whole system is about. What is the point and purpose of your organisation? What is it trying to achieve? Is it world domination, or merely keeping going from day to day? What does the risk manager have to do with this kind of high-level thinking? Someone has to think about the goals of the system (which are not the same as strategies or objectives), and it might as well be someone who understands what risks and opportunities the system holds. If you are a large pharmaceutical company, what would happen if you gave up the search for a continuing pipeline of low-level drugs and threw every last resource into seeking a cure for cancer? What would happen if your organisation decided to become the first in the world to give up using all petroleum products by a certain date? Trying to change the goals of the system may lose you your job overnight - or it may change things for ever.

2. The mindset or paradigm out of which the system arises

What we think. What our ideas, beliefs and desires are, out of which we form our social and economic systems. Having an ever-higher standard of living. Believing that things can be owned. Thinking that growth is good. Why does Meadows put this so high on the list, when it may seem virtually impossible to intervene? Because, she argues, a paradigm shift can be cheap and swift, and unarguably powerful. There is something to be said for the idea that European society is undergoing a collective paradigm shift which Meadows did not live to witness, stimulated by concerns such as global warming, peak oil, the reduction of biodiversity and the excessive consumption of natural resources. If there is indeed such a paradigm shift, is your organisation going to survive it? How, as risk manager, do you respond?

1. The power to transcend paradigms

Enlightenment. The state of 'not knowing'. Knowing that there is no certainty in any worldview. The knowledge that if no paradigm is right you can choose whatever one will help to achieve your purpose. What does this have to do with risk management? Nothing. And everything.

A final caution

I shall quote Meadows at length, because I can put it no better:

'The higher the leverage point, the more the system will resist changing it - that's why societies tend to rub out enlightened beings. Magical leverage points are not easily accessible, even if we know where they are and which direction to push on them. There are no cheap tickets to mastery, whether that means rigorously analysing a system or rigorously casting off your own paradigms and throwing yourself into the humility of Not Knowing.'

Meadows' original essay can be found at:

www.sustainabilityinstitute.org/pubs/Leverage_Points.pdf
- Andrew Leslie is deputy editor, StrategicRISK.
