Sheep don’t read barcodes

Glen Newey

  • Thinking, Fast and Slow by Daniel Kahneman
    Allen Lane, 499 pp, £25.00, November 2011, ISBN 978 1 84614 055 6

Habit, Samuel Beckett says in his essay on Proust, substitutes the ‘boredom of living’ for the ‘suffering of being’, and he has a point. Human existence is an acquired taste, and many of us get through it with the aid of what Vladimir in Waiting for Godot calls the ‘great deadener’. Blank simian rote – the round of feeding, grooming, ablution, slack-jawed vacancy – serves to block out tracts of time that might otherwise get colonised by anxious thought. And who wants that? Bertrand Russell said that people will do almost anything rather than think. Despite one’s best efforts, though, thoughts still sometimes come. Then, as Beckett says elsewhere, thinking can do proleptic duty, ensuring that rogue thoughts are repeated over and again, till they sink at last into the mud of oblivion.

Is thinking, pace Beckett, a good thing? A thoughtless response might be ‘yes’, but after a moment’s reflection it becomes plain that thinking is a human activity as prone to miscarry as Greek debt restructuring. Many endeavours go wrong not through lack of thought, but through our having the wrong thoughts. Human cogitative failure is a many-splendoured beast, which Daniel Kahneman has devoted his life to studying. Some goofs prove popular enough to put paid to any very sanguine view of evolutionary cognitive ascent. Humans are dab hands at some tasks, such as acquiring language and matching patterns. But we suck at others, including many that involve statistical inference. In the UK currently, the statistical likelihood of suffering serious injury from al-Qaida is many times lower than that of suffering a similar fate at the hands of one’s fridge. But few enter the kitchen cowed by the looming menace posed by their Smeg. This effect, which Kahneman likens to perceptual illusions such as the Müller-Lyer, can’t be sloughed off simply by realising that it is illusory – though some people, including some in government, don’t even get that far.

As with perceptual illusions, certain cognitive snafus seem immune to willed control. The base rate fallacy, targeted here by Kahneman, is another notorious instance. The fallacy lies in wrongly inferring from the fact that an investigative procedure has a certain statistical accuracy in relation to a defined group – for example, those who have a disease – that the same accuracy will obtain for the entire sample population. Suppose there is a test for a rare form of cancer. The test is pretty good, but less than perfectly accurate. When taken by those who will get the disease, in 99 cases out of a hundred it will be positive. But one test in a hundred will produce a ‘false negative’: it will give a clean bill of health to someone who will in fact get the cancer. Among those who don’t have the cancer, the test gives a correct negative diagnosis 99 times out of a hundred, and a false positive in 1 per cent of cases. Suppose you take the test, and it comes back positive: how worried should you be? The test might be said to be 99 per cent accurate; so does that mean there is a 99 per cent chance that you have the cancer? No. To draw that conclusion would be to commit the base rate fallacy. The arithmetic becomes clearer when the cancer is very rare indeed. Suppose it’s so rare that only one person in a million will get it, and that the total population is one hundred million. This means that of the hundred or so people who will get the cancer, some 99 should be correctly identified as having the disease. But if the whole population takes the test, the number of people mistakenly identified as having the cancer will be much larger: around 1 per cent of a hundred million, or one million. Of the two groups, the correct positives and the false positives, you are much more likely to be in the second group. If your test is positive, your chance of being in the first group – those who have the cancer – is roughly 99 set against a million, or about one in ten thousand. That’s worse than the rate for the population as a whole, but it’s a whole lot better than the 99 per cent chance of having the cancer you’d imagine if you succumbed to the base rate fallacy.
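The arithmetic here is just Bayes’ theorem. A minimal sketch in Python, using the figures from the example above (the function and variable names are mine, for illustration, not Kahneman’s):

    # P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
    def posterior_given_positive(prevalence, sensitivity, false_positive_rate):
        """Chance of having the disease given a positive test result."""
        true_positive = prevalence * sensitivity                  # has cancer, tests positive
        false_positive = (1 - prevalence) * false_positive_rate   # healthy, tests positive
        return true_positive / (true_positive + false_positive)

    # One-in-a-million cancer, 99 per cent sensitivity, 1 per cent false positives.
    p = posterior_given_positive(prevalence=1e-6,
                                 sensitivity=0.99,
                                 false_positive_rate=0.01)
    print(p)  # ~0.000099: about one in ten thousand, not 99 per cent

Run with the example’s numbers, it returns roughly 0.0001 – the one-in-ten-thousand figure above, nowhere near the 99 per cent the fallacy would suggest.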

In underlining the pervasiveness of the fallacy, Kahneman doesn’t spare himself: he recounts his own jittery road use in Israel during heightened alarms over bus bombers, despite the much higher probability of perishing in a run-of-the-mill car crash. If there is a surprise here, it is not so much that people unaware of the base rate fallacy misjudge the accuracy of a diagnostic test. It’s that even societies without much overt censorship amplify certain threats over others without statistical warrant. Some of this is down to snake-oil salesmen calling forth the ailments they pretend to cure, but they couldn’t pull it off if they weren’t exploiting a well-ingrained predisposition. Politicians seize on statistics in the hope that they will defuse opposition to their schemes: it would be uncharitable to David Blunkett to suppose that he really was as naive as he pretended to be in pronouncing biometric ID foolproof. The more disturbing fact is that he could say it and expect to be believed.