Thursday 6 November 2008

Moore's Law and the Credit Crunch

Steve Lohr's comment in The New York Times on Tuesday, November 4, titled "In Modeling Risk, the Human Factor Was Left Out", makes for an interesting read. The blame for the financial meltdown is placed on those who thought they had covered all the angles in modelling financial markets, but who had forgotten the human factor.

However, he does not mention the providers of the technology. If guns had not been invented, people wouldn't get shot; and without the development of computer technology and software, the financial engineering could not have taken place.

In 1987, the quant guys couldn't handle massive, multivariate models on their slow PCs, let alone e-mail relatively small SuperCalc spreadsheets to all and sundry.

Now SQL, SPSS and Excel, combined with Intel chips and the internet, give people tremendous computing power to produce models and an array of accompanying graphs that prove in glorious Technicolor that all outcomes have been tested and that the computer's output is therefore right.

In addition, thanks to e-mail, wi-fi and search engines, multiple recipients anywhere in the world can now readily see these 50MB models 24/7 and be similarly enticed.

As ever, though: garbage in, garbage out. Or, as Mr Lohr more accurately points out, despite the size of these models, not all the data needed to validate all the outcomes were fed into them. Forecasting the daily returns of a large portfolio of multi-year mortgages would test even today's laptops.
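To put rough numbers on that last claim, here is a back-of-the-envelope sketch in Python; every figure in it (portfolio size, remaining term, scenario count) is an illustrative assumption rather than anything from Mr Lohr's article:

```python
# Back-of-the-envelope: why daily forecasts of a large mortgage book are heavy.
# All figures below are illustrative assumptions, not data from the article.

mortgages = 1_000_000  # assumed portfolio size
years     = 25         # assumed average remaining term
days      = 250        # trading days per year
paths     = 10_000     # assumed Monte Carlo scenarios

cashflow_points = mortgages * years * days * paths
print(f"Simulated daily cash-flow points: {cashflow_points:.2e}")
# ~6.25e13 points; at even a few floating-point operations per point,
# that is a workload measured in hours, not seconds, on a single machine.
```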

Perhaps the computer simply discourages critical, common-sense thinking. People tend to believe everything they see on the internet; by contrast, a generation ago, people were told not to believe everything they read in newspapers.

But we've been here before. This financial crisis, caused by misbehaving markets (as the models would have us believe), is a re-run of past crises of illiquid securities. The spectacle of great minds and their models baffling mere mortals was epitomised by LTCM, which collapsed in 1998, just one year after two of its partners, Messrs Merton and Scholes, received the Nobel economics prize for option pricing theory.

Unfortunately, the financial regulators chose not to learn from this close call for the integrity of the financial system; financial engineering, with its inherent weaknesses, was left to continue unabated.

Now, 10 years on from LTCM, and according to Moore's Law (chip capacity doubles roughly every 2 years), we have around 2^5 times the computing power that was available to the LTCM team. This should have given us the chance to make a 2^5-times greater mess than the $4.6bn loss covered by the Fed-orchestrated bailout in 1998.

So will we get away with just a 32x increase in bailout cost? It doesn't look like it at present, as markets talk in terms of trillions of dollars of CDS exposures and bank writedowns already total $702bn (Bloomberg).

But who knows, the net net cost might just be $147bn (32 times the 1998 figure). If we do escape with that low an amount, then after the crisis financial engineering can start again, happy in the knowledge that in 2018 the cost of a bailout will be no more than 2^10 times that of 1998, at $4.7 trillion.
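For what it's worth, the arithmetic behind the 32x and 2^10 figures is just compounded doubling. A minimal sketch in Python, using only the two-year doubling period and the $4.6bn 1998 loss quoted above:

```python
# A quick check of the Moore's Law arithmetic used above.
# The doubling period and the 1998 LTCM loss are as quoted in the text.

DOUBLING_YEARS = 2
LTCM_LOSS_BN = 4.6  # 1998 Fed-orchestrated bailout, in $bn

def scale_factor(from_year: int, to_year: int) -> int:
    """Computing-power multiple implied by doubling every two years."""
    return 2 ** ((to_year - from_year) // DOUBLING_YEARS)

for year in (2008, 2018):
    doublings = (year - 1998) // DOUBLING_YEARS
    factor = scale_factor(1998, year)
    print(f"{year}: 2^{doublings} = {factor}x -> implied bailout "
          f"${LTCM_LOSS_BN * factor:,.0f}bn")

# 2008: 2^5  = 32x   -> implied bailout $147bn
# 2018: 2^10 = 1024x -> implied bailout $4,710bn (about $4.7 trillion)
```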