The insurance industry is continuing its efforts to agree a standard for catastrophe exposure data. By Puneet Bhara

2005 was a reawakening for catastrophe modelling and data, after the losses the US hurricanes inflicted on their victims and on the insurance industry. Insurers and reinsurers complained that their catastrophe models had not prepared them for the scale of the claims they received, and quickly cast an accusing eye at the models they had relied on.

With hindsight came a more reasoned reaction: perhaps the models themselves were not solely to blame for letting the industry down. Also at fault was the underlying exposure data being fed into them. Garbage in, garbage out. The key challenge, then, was how to improve the exposure data that goes into the models.

Whose problem is it?

It is possible to say glibly that the problem is for the insurance industry to resolve. Although that is true, it is worth looking at the issue from all angles.

From a commercial lines perspective, we sometimes forget that risk managers are the masters of risk in their companies. They do not use insurance alone to manage risk. They decide how much risk to retain and how much to transfer, and they use multiple tools for transferring it. They need quality data and effective models as much as all the other constituents of the risk management chain. Poor-quality exposure data leads to inappropriate levels of cover and introduces the risk of coverage disputes in the event of a loss.

From a personal lines perspective, the data is aggregated by the agent, managing general agent (MGA) or broker, and their role in providing quality data is crucial.

Without standardised exposure data, brokers are unable to market their risks to a wide array of carriers; quotations are slower and coverages less reliable. Their clients suffer a diminished service as a result.

Without quality exposure data, risk carriers are unable to manage their exposures appropriately in terms of setting aside reserves and planning outward reinsurance and investment allocations. In an increasingly regulated environment, with regimes like Solvency II, Basel II and Sarbanes-Oxley, these risks need to be managed better.

Each constituent passes data through the insurance value chain, and with each step, another layer of inefficiency and inaccuracy is added to the picture. In short, the data degrades, and the problem worsens each time the data moves. Every constituent in the value chain has a stake in that data. Each constituent needs to be able to add value to the data without taking anything away or diminishing consistency or accuracy. So whose problem is it? It is an industry problem because no one constituent can fix it alone. It is a problem that requires standardised data and an industry solution.

A healthy insurance industry

With accurate data, reinsurers will be able to quote and provide terms which are appropriate for the risks being underwritten. “We work in an industry of averages and probabilities. Whenever we can be more precise, we need to strive to exploit that ability. Standardising catastrophe information improves our modeling capabilities and expected loss estimates, thereby improving pricing and aggregate management strategies. As information becomes more transportable, we become more efficient as an industry. This in turn helps the individual companies, the policyholders and ultimately society in general,” according to Rich Ruggiano, vice president, DBG Reinsurance, AIG.

While the health of the industry as a whole is not necessarily a key target for buyers and sellers of insurance, there are emerging opportunities of strategic importance to all of the players in the value chain. Capital risk transfer, the selling of risk as tradable commodities and bonds on the open financial markets, is a growing area of interest.

Risk trading depends on high-quality exposure data. Frankly, if the insurance industry does not capitalise on the opportunity, the banking sector will. We are already moving in this direction with cat bonds, shifting from pure risk mitigation against catastrophic losses to insurance-linked securities. If catastrophic losses do not get the CEO’s attention, risk transfer to the capital markets will.

No matter what, the end goal remains the same: a healthy insurance industry, able to take risk and serve its customers. Time is of the essence. Although 2007 was fraught with catastrophes worldwide, they did not have a significant impact on the insurance industry. Another year or two like this, and catastrophe exposure data will again be relegated to a lower priority as people revert to their 100-year models and other comfort factors. Now is the time to implement the standards and prepare for whatever may come.

A call to action

ACORD is a non-profit association which has been promoting the development and use of standards for the insurance, reinsurance and related financial services industries since 1970. From early 2007, ACORD received numerous requests to develop an industry standard for catastrophe data. From Bermuda and New York to London, Zurich and Munich – and from insurers, reinsurers and cat modellers to risk managers and brokers – there is clear demand.

Of course, this is not a new problem, nor is it the first time that the request has come to ACORD. In 2003, ACORD developed and released a catastrophic exposures reporting standard that was comprehensive and thoroughly detailed. That strength, however, was also its weakness: the standard looked too comprehensive and too ambitious, and before effort was invested in implementing it, priorities fell back to business acquisition and fears of disasters once again receded.

Today, things are quite different. There are approximately 60 participants on the ACORD working group dealing with catastrophe data, representing 40 companies from across the insurance chain and from around the world, all working towards an international industry standard for capturing and sharing exposure data. “Catastrophe risks, like natural and manmade disasters or terrorist attacks, pose a major threat to businesses and insurance companies. The increasing complexity of risks and coverages requires ever more sophisticated modeling techniques,” stated Markus Aichinger, senior accumulation risk manager, group risk division, Allianz, adding that “a global exposure data standard would serve the whole industry by fostering modelling capabilities, increasing process efficiency and accuracy in risk evaluation. Allianz is fully committed to supporting ACORD to develop and implement global exposure data standards to reduce uncertainty in the risks we face day to day.”

Peter Arbenz, managing director, head of reinsurance information management for Swiss Re is similarly an advocate. “Exposure data plays a crucial role in the areas of cover pricing, underwriting and risk management … Swiss Re is fully committed to working with ACORD and its peers in the industry to agree and implement global standards that bring transparency and certainty to the risks we cede and underwrite.”

The group continues to grow and welcomes new members to help in the initiative. Alongside the effort to develop the standard, work is underway across the industry to get it implemented, following a three-pronged approach.

Standards development is the first part, where the group, chaired by David Keeton of Willis Re in the United States, is working to develop the ACORD data standard for sharing exposure data. By April 2008, the group expects to agree the financial details of the standard and then develop the XML message. A guiding principle of the work is ensuring that the standard is achievable and takes into account the data from its initial capture at first submission through to the additions made at subsequent steps by agents, brokers, cedents and reinsurers.
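To make the shape of such a message concrete, the sketch below shows, in Python, how a single location exposure record might be serialised to XML. It is a minimal illustration only: the element names (ExposureReport, Location, TotalInsuredValue and so on) are assumptions for this example, not the actual ACORD schema, which the working group is still defining.

```python
# A minimal sketch of serialising one location exposure record to XML.
# The element names here are illustrative assumptions, NOT the real
# ACORD schema, which is still under development by the working group.
import xml.etree.ElementTree as ET

def build_exposure_message(locations):
    """Build a hypothetical exposure-report XML document from a list of dicts."""
    root = ET.Element("ExposureReport", version="0.1-draft")
    for loc in locations:
        node = ET.SubElement(root, "Location", id=loc["id"])
        ET.SubElement(node, "Address").text = loc["address"]
        geo = ET.SubElement(node, "GeoCode")
        ET.SubElement(geo, "Latitude").text = str(loc["lat"])
        ET.SubElement(geo, "Longitude").text = str(loc["lon"])
        ET.SubElement(node, "ConstructionClass").text = loc["construction"]
        tiv = ET.SubElement(node, "TotalInsuredValue", currency=loc["currency"])
        tiv.text = str(loc["tiv"])
    return ET.tostring(root, encoding="unicode")

print(build_exposure_message([{
    "id": "LOC-001",
    "address": "1 Example Plaza, Miami, FL",
    "lat": 25.7617, "lon": -80.1918,
    "construction": "Reinforced concrete",
    "currency": "USD", "tiv": 25_000_000,
}]))
```

The point of the exercise is exactly what the guiding principle describes: once a record is in an agreed structure, each party along the chain can add to it without degrading what is already there.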

Pilots are the second aspect. Here the group is testing the standard against business requirements and for feasibility in real business scenarios. Markus Speuhler from Swiss Re in Switzerland chairs this portion of the working group. This pilot work is already whetting the industry’s appetite and increasing its willingness to provide location information in a standardised manner.

At the same time, another pilot is being evaluated to develop an industry utility that will assist professional re/insurance buyers in producing a standard exposure dataset. In theory, risk managers, brokers or cedents could upload their exposure data to a free-to-use web-based utility, map their data to the ACORD standard and produce an ACORD XML file which they can then forward to their brokers and re/insurance underwriters or, conceivably, capital markets.
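In spirit, such a utility is a mapping exercise, and the sketch below shows the core of it: read a cedent’s in-house spreadsheet, translate its column names onto the standard’s field names, and hand the resulting records to a serialiser such as the one sketched above. The column names and the mapping here are hypothetical, chosen only to illustrate the idea.

```python
import csv

# Hypothetical mapping from one cedent's in-house column names to the
# standard's field names; in practice each user would define their own.
COLUMN_MAP = {
    "Street": "address",
    "Bldg Value": "tiv",
    "Constr": "construction",
    "Ccy": "currency",
    "Lat": "lat",
    "Long": "lon",
}

def map_to_standard(csv_path):
    """Translate a proprietary exposure spreadsheet into standard records."""
    with open(csv_path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f), start=1):
            # Keep only the columns we know how to map; rename them.
            record = {std: row[src] for src, std in COLUMN_MAP.items() if src in row}
            record["id"] = f"LOC-{i:03d}"
            yield record

# The resulting records could then be passed to a serialiser such as
# build_exposure_message() from the earlier sketch to produce the XML file.
```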

The third prong of the approach is advocacy and marketing. This group, chaired by Paul Nunn of Lloyd’s, is working to ensure that the industry is aware of the standard and encouraged to implement it.

It is no coincidence that the chairs of the ACORD working group represent the North American, London and Continental European markets as well as the broking and underwriting constituents of the industry. This is truly a cross-border, cross-industry initiative.

What next? Catastrophe exposure standards are unique in that so many constituents along the value chain have a stake in this initiative’s success. By developing, agreeing and implementing exposure standards, everybody wins.

The issue is timing. We know 2005 reawakened this sleeping giant, but we do not know how long it will remain awake. A few more years of relative calm and it could be back in hibernation, until the next time a catastrophe surprises the industry.

The overall success of the standard depends on everyone this data touches. It takes input, participation and discussion. The working group and its leaders are open to more assistance and opinions as well as pilot participation. There is no limit to how you can use the standard, so the group welcomes ideas for innovative applications.