Swiss bankers and fire ants: predicting extreme risk

This scientist’s leg was stung over 250 times in 10 seconds when he knelt on a fire ant nest. Image credit: US Department of Agriculture
Tuesday, 5 August, 2008

What do Swiss bankers and fire ants have in common? Not much at first glance, but an unusual new study notes that the banking industry has learnt a lot from bitter experience with rogue traders, internal fraud, tsunamis, computer meltdowns and even incompetent CEOs.

As a result, banks have a lot to teach quarantine and environmental authorities about predicting very rare but very costly real-life events, such as invasions by exotic diseases and pests like fire ants, says the study, "Evaluating extreme risks in invasion ecology: learning from banking compliance", published in the journal Diversity and Distributions.

Globalisation and new technologies have left the banks vulnerable to new and sometimes highly costly operational risks, says a research team led by Professor James Franklin, a UNSW mathematician, with colleagues from the Australian Centre of Excellence for Risk Analysis (ACERA) in Melbourne.

Despite using highly sophisticated computer programs to recognise abnormal patterns of transactions, for example, banks have still fallen prey to massive scams. Barings Bank infamously collapsed in 1995 after A$1.6 billion in losses to rogue trader Nick Leeson; the National Australia Bank (NAB) lost about A$330 million to rogue trading in 2004.

Similarly, international trade agreements have exacerbated the risks of ecological damage due to the entry, establishment and spread of invasive species, which are likewise not only hard to prevent but fiendishly hard to predict.

Failing to do so, however, can be costly indeed. The discovery in 2001, for example, that stinging South American fire ants had somehow established themselves in Queensland prompted a massive control and eradication program over thousands of hectares of land; it has involved hundreds of people and so far cost almost $200 million.

Part of the problem in analysing and evaluating the probability of such extreme risks is that, by definition, they happen so rarely: on graphs they appear as "outliers", so out of kilter with the rest of the data that they look more like measurement errors than real events.

"A bank's business is to take in funds and lend most of them out for profit while reserving some against risks," the paper says. "The regulator, on the contrary, wishes to ensure that the bank fully states its risks and reserves against them, so that the bank and the whole banking system remain stable."

In the past, resolving that conflict often came down to expert opinion, sometimes of questionable validity. To get around this, the Basel II compliance regime for banks has been developed: a package of mathematical and legally inspired methods from which other areas such as biosecurity can learn, the authors say.

Regulators and bankers meet with mediators to press their respective cases, assess the data and understand where expert opinion goes beyond the data. There has been progress in fraud detection, where the aim is to identify by automatic methods "unusual" or "fringe" data points in large data sets that warrant further investigation.

"Pattern recognition has been applied to such problems as money laundering detection and intrusion in computer systems, but far less to such potential areas as environmental monitoring, epidemic alerting, or quarantine inspection. Second, having made the most of automatic methods to delete outliers, one must consider one by one the most extreme remaining cases with a critical eye. A characteristic of extreme risk is the existence of individual data points whose relevance is itself a matter of dispute."

The compliance style works most naturally when there are definite events of very low probability to quantify, such as the spread of fire blight from an infected imported apple or massive internal fraud in a bank, says Professor Franklin.
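The kind of quantification involved can be seen in a toy calculation (all numbers hypothetical, not from the study): even when each individual consignment poses a minute risk, the aggregate risk over a large trade volume can be substantial.

    # If each imported consignment independently carries a small probability p
    # of causing an establishment, the chance of at least one establishment
    # across n consignments is 1 - (1 - p)**n.
    p = 1e-6         # assumed per-consignment probability (hypothetical)
    n = 2_000_000    # assumed annual import volume (hypothetical)

    prob_at_least_one = 1 - (1 - p) ** n
    print(f"P(at least one establishment) = {prob_at_least_one:.3f}")  # ~0.865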

"A basic problem with evaluating extreme risks - risks of catastrophe beyond the range of available data, like quarantine, terrorist, nuclear explosion risks - is that there is no relevant data, " according to Professor Franklin. "It hasn't happened yet, so how do you know the chance that it will? It is necessary to use expert opinion, but then how can you keep the opinion honest?

"The paper argues that compliance regimes in banking have made progress with this problem and their methods ought to be used more widely, for example in quarantine risk. The idea is to use expert teams in an adversarial situation like a court of law, joined with the statistical methods of Extreme Value Theory used in calculating once-in-a-hundred-year floods, which extrapolate beyond the range of data.

"In both cases there are existing stakeholders with opposite interests: a bank compliance regime keen for the bank to hold more reserves, versus a gung-ho bank lending division wanting to lend out the money; farmers keen to ban imports, versus World Trade Organization and overseas farmers wanting to allow imports."

The paper adds: "We suggest that an 'advocacy model' is ideal. It replaces the ideal but unavailable feedback of real experience with the 'virtual' feedback provided by the scrutiny of experts' assessments by a neutral panel of 'judges', informed by the scenarios and reasoning put forward by potentially hostile stakeholders.

"It gives human intuition the last word in combining the evidence to reach a final conclusion, while allowing maximum space for the use of technical methods to support it. The advocacy model works because of the psychological pressure it applies. True accountability requires that the people to be held accountable fear their judges. They must be motivated by anxiety as to what the judges' views might be."

Media contacts:

James Franklin - 0410 625 306

UNSW Faculty of Science: Bob Beale - 0411 705 435