More open sources of data and open, common modelling standards will facilitate climate-transition decisions

I’ve spent my entire career developing and improving catastrophe and weather-related models and analytics. So from a personal perspective, if there’s one thought I’d like to get across, it’s a sense of the opportunity to enhance decisions on climate risk mitigation and resilience if we use our ever-improving scientific understanding and modelling capabilities wisely.

There’s one thing that holds an important key to unlock that opportunity – openness. I don’t mean a willingness from governments and others to set out and stick to commitments to reduce emissions – although that would be rather helpful, of course. I’m referring to two other things.

More open-access data

The first is more open-access data. Open-access, curated data can support and encourage better climate-based risk assessment and decision-making, and the public and private sectors should be working together to increase the world’s resilience to current and future climate risk.

It is essential that the climate analytics community, wherever its members sit, comes together to promote the win-win opportunity offered by publicly accessible, open and curated core data and risk models.

The goal should be a global, consistent source of real-asset data, such as the locations of industrial plants, energy transmission facilities and natural resources, that can be used both for the public good and to enable organisations such as WTW to augment it for commercial applications.

Examples of good work already being done in this area include the Spatial Finance Initiative’s GeoAsset Project, the Insurance Development Forum’s Global Risk Modelling Alliance, and the Nasdaq-designed, and now publicly curated, Open Exposure Database (OED).

As a further indicator of the benefits this public/private approach can offer, WTW has been working in partnership with the National University of Singapore and flood modellers JBA, using open-source public data on real assets, to improve understanding of future flood risk in South East Asia.

Common standards for common understanding

My second point on openness, linked to the need for better and more open models and data, concerns model and data standards and metrics. Right now, many companies and organisations are producing climate hazard models, and because climate science is moving so fast, each is making different choices and assumptions in building them.

If a multinational business is trying to work out an international climate transition strategy, it almost certainly won’t be comparing like with like from region to region. But the last thing we want, in attempting to unify action across the globe to reduce emissions, is a VHS versus Betamax situation.

What’s needed is agreement on an open-standards framework for capturing key climate-conditioned modelling parameters. These should support consistent representation and use of forward-looking models developed by independent modelling organisations, to give confidence in the comparison and integration of models across hazards and regions.
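To make the idea concrete, here is a minimal sketch of what a shared record schema for forward-looking hazard model outputs might look like. Every field name and value below is an illustrative assumption, not drawn from any published standard; the point is only that two vendors emitting the same fields become directly comparable.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HazardModelRecord:
    """One standardised output record from a forward-looking hazard model.

    All field names are hypothetical placeholders for the kind of
    parameters a common open standard might capture.
    """
    hazard: str        # e.g. "flood", "tropical_cyclone"
    region: str        # e.g. an ISO 3166 country code
    scenario: str      # e.g. an IPCC SSP scenario label
    horizon_year: int  # projection year
    metric: str        # e.g. "annual_exceedance_probability"
    value: float       # the modelled estimate
    provider: str      # the modelling organisation

# Two vendors expressing results in the same schema:
a = HazardModelRecord("flood", "SG", "SSP2-4.5", 2050,
                      "annual_exceedance_probability", 0.010, "vendor_a")
b = HazardModelRecord("flood", "SG", "SSP2-4.5", 2050,
                      "annual_exceedance_probability", 0.012, "vendor_b")

# Same hazard, region, scenario, horizon and metric means a
# like-for-like comparison of the values is meaningful.
comparable = (a.hazard, a.region, a.scenario, a.horizon_year, a.metric) == \
             (b.hazard, b.region, b.scenario, b.horizon_year, b.metric)
print(comparable)  # True
```

Without an agreed schema, each vendor would encode scenario, horizon and metric differently, and the comparison above could not be made mechanically.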

Similar things have been achieved before, but someone needs to take the lead. Nasdaq did it with catastrophe model exposure data through the OED, which is now used as a common format across a wide range of models from different vendors.

We’re all in this together

Openness is a subset of collaboration. And if I hear one word used more than any other to describe what’s needed to combat climate change, it’s collaboration.

Let’s make sure it’s applied in practice to open-source data assets and standards that can give us all better and more consistent knowledge of the impact of different future climate transition scenarios.

Matt Foote is senior director, Climate and Resilience Hub, WTW