Saturday, January 31, 2009

Bad banking drives out good

I heard this quote from the CEO of Standard Chartered Bank on Radio 4's Today programme this morning, and it summarises the problem very well. (It's also a reference to Gresham's Law.)

It's become commonplace to simply blame the credit crunch on "stupid, greedy bankers". The trouble is, suppose you were a smart, frugal banker five years ago. Your exceptional brains let you see that subprime self-certified mortgages were a bad idea, and that anything built on them was therefore going to collapse. So you weren't going to invest in that part of the market.

The trouble is, you would have had years of below-average results. Your clients and employers would be looking at your competitors and pointing out how bad your results were. Even if your clients were prepared to listen to your explanations, many are under a legal obligation to maximise their investment returns, and their customers will be less sophisticated and correspondingly less willing to listen to complicated explanations about why their pension is so much smaller than their neighbours'.

So you get removed from your job, and someone else is put in who is willing to do what it takes to deliver market returns.

Looked at from an abstract point of view the game resembles a "Tragedy of the Commons". A banker who gets good returns while taking hidden risks is like the commoner who puts an extra cow on the common pasture. He gets richer while all his neighbours get a little poorer. So everyone else is forced to do likewise, even though everyone can see that collectively this is going to lead to ruin. And so the game goes.
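The incentive structure can be sketched in a few lines of code. This is a toy model with made-up numbers, not a description of any real market: grass per cow shrinks as the commons fills up, and each commoner only counts his own payoff.

```python
def payoff_per_cow(total_cows, capacity=10):
    """Grass available per cow shrinks as the commons gets crowded."""
    return max(0.0, 1.0 - total_cows / capacity)

def my_payoff(my_cows, others_cows):
    """What one commoner earns, given everyone's herd sizes."""
    return my_cows * payoff_per_cow(my_cows + others_cows)

# Five commoners, one cow each: the pasture is healthy.
baseline = my_payoff(1, 4)

# Adding a cow raises MY return, so the individual incentive is clear...
defect = my_payoff(2, 4)
assert defect > baseline

# ...but if everyone adds a cow, everyone ends up worse off than at the start.
everyone_defects = my_payoff(2, 8)
assert everyone_defects < baseline
```

The point of the sketch is that each step is individually rational even though the collective outcome is ruin, which is exactly the bind the prudent banker is in.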

The traditional solutions to the Tragedy of the Commons are:

  1. Property rights. Divide up the commons into individual plots and allocate them to the commoners. Failing that, give each commoner a tradeable right to put exactly one cow on the commons.
  2. Regulation. Make a law that each person can only put one cow on the commons.
The question is how to apply this to the finance industry. In this case cows are equivalent to risk. Banks have had "risk management" departments for years, but risk turns out to be very difficult to measure. It's as if our commoners have invisible cows: verifying the numbers is impossible, although I've previously posted a suggestion for making the cows more visible.

So without this, what can we do to prevent the next big bust? It doesn't look like there is anything. As memories of this bust fade and the people who experienced it retire, a new generation of bankers will come along with a new ingenious method for hiding risk, and once again bad banking will drive out the good until the whole thing collapses once more.

Monday, January 26, 2009

Wikipedia and Britannica are Converging

Is there a network equivalent of market forces? The crowdsource equivalent (or perhaps generalisation) of Adam Smith's "Invisible Hand"?

When I was in my early teens my school got the new edition of the Encyclopaedia Britannica in over 30 volumes costing around £1,000. It was a huge wealth of authoritative information on just about everything. I spent quite a few lunch hours just reading about stuff.

In the 90s Microsoft brought out Encarta on CD-ROM. It contained a lot less information, but it cost a lot less than £1,000. Britannica has been in trouble ever since. Now there is Wikipedia, which is free, a bit less authoritative than Britannica, but has even more information and a great deal more mindshare.

So Britannica has responded by taking a page out of Jimbo Wales's book; it's going to start accepting user-generated content, albeit with a stiff editorial barrier to acceptance into the core Encyclopaedia.

Meanwhile the latest controversy on Wikipedia is about whether user-generated content needs more controls. I say "more" because Wikipedia has always had some limits; locked pages, banned editors, and in extreme cases even the deletion of changes (as opposed to reversion, which preserves them in the page history). Increasingly sophisticated editorial policies have been put in place, such as "No Original Research" (originally put in to stop crank science, but now an important guiding principle).

When you look at Wikipedia's history there is a clear trend; every so often Wikipedia reaches a threshold of size and importance where its current level of openness doesn't work. At this point Jimbo Wales adds a minimal level of editorial control. Locked pages, for instance, were added because certain pages attracted unmanageably high levels of vandalism.

It seems that Wikipedia and Britannica are actually converging on the same model from different ends of the spectrum: Wikipedia started out completely open and is having to impose controls to manage quality. Meanwhile Britannica started out completely closed and is having to allow in user-generated content because commissioning experts to write it is too expensive. Somewhere in the middle is a happy medium that gets good quality content without having to charge subscribers.

Not charging is important. In these days of the hyperlink, an encyclopedia is not just a source of knowledge, it is a medium of communication. If I want to tell people that they should read about the history of the Britannica then I refer to the Wikipedia page because very few people who read this are going to be able to follow a link to a Britannica article (even if I were a subscriber, which I am not).

Wikipedia is often said to have been inspired by the "open source" model in that anyone can edit Wikipedia just as anyone can edit the Linux source code. In fact the cases are not parallel. The GPL allows me to download a copy of the Linux source code, hack it to my heart's content, and give copies of my new version to anyone who wants them. What it does not do is authorise me to upload my version to kernel.org and pass it off as the work of Linus Torvalds. Getting my changes into the official kernel means passing through a strict quality assurance process including peer review and extensive testing on multiple architectures.

So I think that this proposal to create "flagged revisions" for editorial review moves Wikipedia towards the open source model rather than away from it. Anyone will always be able to fork Wikipedia if they wish: the license guarantees it. But the official version at wikipedia.org will earn increasing trust as the quality assurance improves, just as the official version of the Linux kernel is trusted because of its quality assurance.

Wednesday, January 21, 2009

The next challenge for Linux

I was in the local branch of "Currys" (UK electrical and electronic goods chain) recently and had a look at the line-up of computers. They had some little netbooks, and taped next to each one was a little note saying something to the effect of "This runs Linux, so it won't run Windows software". It was a local version of a wider story about Linux:

  • People buy netbooks and then discover that Windows isn't part of the bundle, and they don't like it.
  • A teacher found a student handing out Linux CDs and accused him of pirating software.
  • A student accidentally bought a Dell laptop with Ubuntu instead of Windows. She complained that she couldn't submit the Word documents required by her college (which says it is happy to accept OpenOffice.org documents), and that the installation CD for her broadband service wouldn't install, so she couldn't get on the Internet (Ubuntu connects to the Internet just fine without it). The local TV station picked up the story as a "consumer rights" case and was amazed to find a virtual lynch mob chasing them for being less than enthusiastic about Ubuntu. They quoted an expert saying that Ubuntu "isn't for everyone" and is more suited to tinkerers than to people who just want to get stuff done.
Behind all this is what "everyone knows" about computers (everyone who isn't interested in computers for their own sake, that is):
  • There are two sorts of computers: Apple Macs, and PCs.
  • Apple Macs run Apple software. PCs run Windows software.
  • Windows is bundled with PCs. On expensive PCs the bundle might include Office as well.
  • If you want extra software, you buy it at a store, take it home and stick the CD in the drive.
This is how it's been for almost 20 years (25 if you count the MSDOS era). An entire generation has grown up knowing that if it's a PC, it runs Windows. They know it in the same way they know the sky is blue: it's always been blue.

Of course those of us who use Linux know different. But for most people, Linux is something they might have heard about once or twice, but they didn't pay any attention and couldn't tell you anything about it. Outside its current user base and their immediate circle of friends and family, Linux has zero mindshare.

This is not another lament about Joe Sixpack being too stupid to understand Linux. The problem is not that Linux is too complicated, it's that Linux and Windows do things differently. Imagine someone who was raised on Linux; how would they react to Windows? Software installation would seem complicated and geeky, the single desktop would feel claustrophobic, and as for the idea of paying for software...

So I think we need to sort out a message to broadcast to the world and then focus on it. I suggest the following:
  • Linux is an alternative to Windows and Apple. It comes in several flavours.
  • Linux isn't easier or harder than Windows, but it is different, and it takes a while to get used to those differences.
  • Linux is bundled with a huge range of downloadable free software, including office, graphics and photographic packages.
So next time you see a story that misrepresents Linux please send a correction with these three points.