Faith, Hope and Probability
- The Taming of Chance by Ian Hacking
Cambridge, 264 pp, £27.50, November 1990, ISBN 0 521 38014 6
The author of The Emergence of Probability (1975) has written another formidable book on the history of probability theory. The first described the development in the 17th and 18th centuries of a new way of legitimating knowledge: a mathematical theory of predictability under uncertainty based on observed frequencies of numbers on thrown dice. From its origins in gambling, probability theory began to meet the demand for a reliable form of authority that would release the Renaissance and the Age of Reason from religious claims to control knowledge. When it had seeped from games and mathematics to marine insurance, and thence moved on to produce what Ian Hacking calls an avalanche of numbers in every kind of public concern, established theories of causality were ready to be toppled. As he said in the earlier book, the world was about to become safe for future Galileos. But the change was slow, and not in time to divert Descartes from his project to establish in reason itself an independent arbiter for truth (and, we can add, not in time to save ourselves from a Cartesian world divided radically between primary and secondary qualities).
The present volume takes the 18th-century developments in statistical theory to the end of the 19th century. Now we are so used to thinking statistically that we hardly notice how much we are besieged by politically serviceable numbers, averages and chances. The process Hacking traces has produced a sea change for our culture. It would be good to be able to look forward to a third volume, bringing the same critical clarity to bear upon the uses of probability in present-day politics. But from what he tells us, that is evidently not on the immediate agenda. The Taming of Chance is not offered as a history, though it has a chronological framework and a narrative to unfold. As a book of philosophy it is evidently part of a deep-planned and evolving programme on the legitimation of knowledge. The programme has already included an edited volume on Scientific Revolutions (1981), together with Representing and Intervening: Introductory Topics in the Philosophy of Natural Science (1983) and writings on classification (‘Making up people’, 1986) and on language. The next major work, he tells us, is to be on styles of reasoning.
This volume therefore deals with philosophical issues using the development of statistics as its main illustration. The central issue is induction. He is going to argue, he tells us at the beginning, that a style of reasoning is self-authenticating. The idea sounds innocent enough: a proposition can only be assessed as true or false when there is some style of reasoning that sets the questions and provides ways of answering them. We learn that there is no way of settling the truth of a proposition without an established style of reasoning with its accompanying style of investigation. On closer acquaintance, this idea is far from innocent: we have to recognise that a style of reasoning cannot be either right or wrong, because it makes its own rules for fixing the sense of what it investigates. It is the actual form of the reasoning process.
The laws of nature underlie the causes which science from its beginning sought to uncover. But as the 19th century progressed, causes were put into question by probability analysis. At first, statistics were in the ancillary role – what we would have once called the handmaid of science, but which with proper regard to gender decorum Hacking calls the role of a loyal Victorian valet. At the beginning of the story, statistics serve: they do not dominate the quest for the laws of nature, far less do they replace them. Half-way through, statistics turn out to have laws which seem to rival those of nature. By the end, the laws and causes have been toppled and quantum theory has arrived. We are faced with a world governed by pure chance. Committed as he is to the principle that change comes by a multitude of tiny trickles, Hacking understandably decides to take the long scenic route, citing an avalanche of names of thinkers who contributed to the eventual flood.
First the readers have to be convinced that an extraordinary change in the manufacture of numbers took place. The pressure to collect figures comes from a new need to bandy them in politics and administration. Evidently, once the resource has been discovered, like transport or electronic communications, everyone needs to use it. Much is at stake, so any official number is liable to be disputed, which further stimulates the industry of producing statistics and the work of arguing about what they may mean. Hacking insists that the nation states’ massive reliance on numbers was the necessary precondition for advances in statistical theory. Modern industrial society was the only platform from which the new style of reasoning could have emerged. It is interesting to note that the initial advances in theory so important to science came in response to social problems. This will partly account for the politically-fired metaphysical speculations foisted upon developments in methods. Statistics grew up in the centre of a public storm about whether the universe was a deterministic system of laws, or whether it was the product of the operations of blind chance.
Taking us nimbly through the thickets, the tour guide’s sprightly manner never flags; each answer produces a new problem, we travel swiftly from one panorama to the next, stunned by the speed of the turns. In such a well-crafted book a dazed reader is no doubt an intended result. Before the last chapter we have become numbed to the big issue that has interested serious thinkers for so long. Are we free, or are we determined? Is the universe subject to law or subject to chance? If at the end we find ourselves cool about a problem that somewhat interested us when we started on the tour, it is not because we no longer care. On the journey we have come to doubt whether the great metaphysical issues should ever have been loaded on to methodology. Now we know enough to wonder what sort of style of reasoning would be able to fix the meaning of the debate, let alone settle it. Tourists often become pilgrims: the journey bestows a new openness, preparing the readers for the heady dénouement at the end when chance is somehow tamed.
It is curious that in this history the more sophisticated questions and technically refined answers come first, while the crassest misconceptions come up later. Problems of constitution-drafting at the time of the French Revolution inspired Condorcet’s theory of majority voting and his famous paradox. Questions about the reliability of witnesses are central to civil society: credibility of witnesses had been much discussed in the 18th century in connection with accounts of miracles on which the claims of religion were made to rest. There was therefore a well-tilled ground for applying statistical science to conviction rates of criminals, the ideal size of a jury, and the right size of a majority vote. The 1789 constitution-makers rejected the English jury system of unanimous decision as another example of British hypocrisy: how could 12 jurors ever agree unanimously unless intimidated? They were alert to the difference between wrongful conviction and wrongful acquittal, but opinion differed as to which was the more reprehensible. Laplace felt that a 30 per cent chance of executing an innocent person was unacceptable; Poisson was tougher, feeling that executing two innocent out of seven accused was par for the course, and recommended decision by simple majority. Such questions are always with us: in 1967 the US Supreme Court declared it constitutional for jurors to decide by a majority, and in the same year England allowed conviction by a ten-to-two majority.
Famous conundrums enliven the story. Is a jury that splits 12:0 likely to give a more reliable verdict than one that splits 112:100? Intuitively yes: the evidence must have been clearer to produce the unanimity and more confused if 100 jurors could line up on each side. But how is it that the English House of Lords, when it constituted itself as a court to try one of its members, was content with a majority of merely 12 in a jury of 600? Would it make a difference to the outcome that in this case the jurors have long acquaintance with each other? The legal issues open the question of reliability beyond mere head-counting. The early 19th-century mathematicians realised that it is necessary to have a test of the difference between the actual rates of convictions and the proportion of objectively correct convictions.
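The force of these conundrums can be felt in a small Condorcet-style calculation. This is my illustrative sketch, not Hacking’s or Condorcet’s own formulation: assume each juror independently reaches the correct verdict with some fixed probability, and that the two verdicts are equally likely beforehand. Under those assumptions a surprising result falls out of Bayes’s theorem: the reliability of a split verdict depends only on the *margin* of the majority, not on the size of the jury — so a 12:0 jury and a 112:100 jury carry exactly the same evidential weight, which is just the intuition the story puts in question, and which makes the House of Lords’ majority of 12 in 600 less absurd than it first appears.

```python
# An illustrative Condorcet-style model of jury reliability (my sketch,
# not Hacking's). Each juror independently votes for the true verdict
# with probability p; guilt and innocence are equally likely a priori.

def verdict_reliability(k, m, p=0.7):
    """Posterior probability that the majority of a k:m split is correct.

    Likelihood of the split if the majority is right: C(n,k) p^k (1-p)^m.
    Likelihood if the majority is wrong:             C(n,k) p^m (1-p)^k.
    The binomial coefficients cancel, so the posterior odds are
    (p/(1-p))**(k-m) -- a function of the margin k - m alone.
    """
    odds = (p / (1 - p)) ** (k - m)
    return odds / (1 + odds)

# A unanimous 12:0 jury and a 112:100 jury share a margin of 12,
# so under this model they are exactly equally reliable, while a
# bare 7:5 majority is far weaker evidence.
unanimous = verdict_reliability(12, 0)
large_split = verdict_reliability(112, 100)
bare_majority = verdict_reliability(7, 5)
```

The model’s indifference to jury size is, of course, exactly what the intuition about unanimity resists: in real deliberation the jurors’ competences are neither equal nor independent, which is where the head-counting breaks down.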
The 18th-century subtleties in applying statistics to legal issues contrast with the 19th-century muddles in applying statistics to health and morals. The point is relevant to the secondary theme of the book: the influence of styles of reasoning. First, vulgarisation probably tends to retard theory. Second, a powerful difference between the legal and the medical profession shows up: it may not be surprising that the lawyers indulged and the medical profession systematically resisted the applications of probability. Lawyers love recondite logic-chopping. Medical men, however ready at first to apply statistics to public health, came to reject statistical applications on the familiar grounds that each individual case is different. But they had reason for caution once the numbers explosion had taken place.
Numbers were originally collected for the Government’s simple purposes of maintaining revenue and military strength. Then the collection of numbers created a new motive, what Foucault called bio-politics, information aimed at controlling classes of people or the whole society. It was important for private and government actuaries to know how to predict rates of sickness, and between 1820 and 1840 numerical regularities about disease became commonplace. As numerical data grew on all sorts of topics, a strange new fact appeared. Instead of changing from year to year, the returns were extraordinarily stable. The story starts with trying to understand why birth rates or crime rates should show so much constancy. Why should suicide rates, the numbers of insane and the numbers of emigrants and immigrants not fluctuate unpredictably? What did such constancy mean?
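The constancy that so puzzled the 19th century has a simple arithmetical face, which a minimal simulation can display. The numbers here are hypothetical (mine, not Hacking’s): give each member of a large population the same small independent yearly risk of some rare event, and the binomial law keeps the annual total close to its mean year after year, even though no individual outcome is remotely predictable.

```python
# Why should yearly totals of rare events be so stable? A minimal
# simulation with invented figures: 100,000 individuals, each running
# the same small independent risk per year.
import random

random.seed(19)  # fixed only so the sketch is reproducible

POPULATION = 100_000
RISK = 0.001  # hypothetical individual probability per year

def annual_count():
    """One year's total: each person independently succumbs with prob RISK."""
    return sum(random.random() < RISK for _ in range(POPULATION))

counts = [annual_count() for _ in range(10)]
# The expected total is 100 with standard deviation about 10: the yearly
# returns fluctuate by only around ten per cent, while any single
# individual's fate remains pure chance.
```

The stability, in other words, needs no law of society to explain it; which is precisely why its discovery tempted the century into finding such laws everywhere.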
On loose analogy with Newton’s laws, any discovered regularities were called laws; there were laws of the human body and laws of behaviour. Laws at this stage were any equations with some constant numbers in them: the motto was, collect more numbers, find more regularities, and discover more laws. As a caricature of the movement Hacking cites Adolphe Quetelet’s calculation that the lilacs in Brussels bloom when the sum of the squares of the mean daily temperatures since the last frost adds up to (4264°C)², a truly useless discovery. Crazy figures, without hypothesis or control, were bound to be discredited when powerful institutions needed to defend themselves against statistical attack. Hence perhaps the reticence of the medical profession towards the claims of the new science.
Two major themes in the history of statistics provide material for a rich philosophical commentary. One is the idea of ‘autonomy’ for the laws of statistics. Hacking makes the useful distinction between looking to statistical laws for prediction and looking to statistical laws as in themselves the explanation of what the numbers show. A mathematical theorem cannot be regarded as a law in nature. He provides many examples of attributing ‘autonomy’ to statistics: the reckless use of the law of large numbers for finding facts about human behaviour, the use of the central limit theorem and the distribution of error to discover new real existences, the use of the statistical norm to find the right behaviour, and of deviation from the norm to find pathological behaviour.
Styles of Reasoning
The other major theme is the contrast between styles of reasoning which arise in different communities. A style of reasoning or a thought style is a metaphysical infrastructure which has developed along with assumptions about politics and morals. The topic is of central importance to the history and philosophy of science because the thought style influences the very process of classification and argument. It is the seat of deep-seated bias, the source of prejudiced approval or rejection, the unnoticed principle of selection. In this volume Hacking follows a distinction that has been made between two styles of reasoning, Eastern (Prussian) statistics and Western (French and English) statistics, developed in two contrasting types of civilisation.
The Eastern civilisation is collectivist, holistic, group-centred in its thinking. It is based on a philosophy by which the group confers identity on its members and is responsible for them. From this tradition Prussia was able to create the first welfare state, and to introduce workmen’s compensation. For the Germans the idea of a law is essentially a social product. Laws are not the kind of things to be inferred from individuals: they belong on a different level of existence and cannot be distilled from individual behaviour. Consequently one is not surprised to find a German thought style predisposed to determinism in science.
The Western tradition by contrast is individualistic and libertarian, on the French side upholding revolutionary ideals against centralised oppression, and on the English side upholding the free market against state-imposed economic constraints upon industry. The contrast is flattering to the West: on the one hand, the communitarian commitment to the centre; on the other hand, the love of freedom. The Western political philosophy starts with individuals constituting the sovereignty of their king; law is the product of the will of individuals. It is much easier in the West to accept the idea of probabilistic laws. The facts about crime and conviction can be regarded in this tradition as statistical laws, because social laws are constituted by the acts of individuals.
The characteristically state-orientated idea of law in German thought is represented by Boltzmann’s deterministic theories in statistical mechanics, whereas on the individualistic Western side of the divide Maxwell was ready to find that statistical mechanics was indeterministic. On the other hand, there is the paradox that the liberal intellectual atmosphere is generally more congenial to the idea that there are statistical laws. Statistical laws have the effect of reducing pure chance to regularity. The style of reasoning that is open to statistical laws has to stomach a new deterministic theory.
The alleged connection between political and scientific thinking could be better spelt out. Hacking implies a kind of cognitive economy by which the acceptable political model provides an acceptable analogy for science. This would be convincing if the alignment of reasonings about law were simple, but it gets to be implausibly complex. Western statistics support the autonomy of statistical laws, Eastern statistics have been more wary. Is it plausible that the holistic, collectivist civilisation is more hostile to an idea about laws governing collections of numbers than the individualist, liberal civilisation? The match of politics and science has been achieved by postulating that the liberal regards law (in the political sphere) as the product of the will of individuals, and so is happy about finding laws of nature constituted by countless individual choices. The liberal is readier to think that statistical laws are laws of behaviour, the conservative more cautious. Hacking worries a little that the liberals look more kindly on a theory that is deterministic about individual choices. The fatalist determinism espoused in the Western camp is a surprise to the theory of styles of reasoning. Our guide takes a furtive peep at his map. He notes that statistical fatalism rides uneasily on the liberals’ view of free individuals as the source of their own laws. It would have been better the other way round.
Guides generally liven the trip with little jokes at the expense of some public monuments: Quetelet and Durkheim have been regular butts for this tour. Adolphe Quetelet, the Belgian Astronomer Royal, takes the prize for attributing to mathematical theorems reality in nature. Quetelet was dazzled by the idea of the bell-shaped curve displaying the mean of a distribution, and went ahead to measure the characteristics of all sorts of populations, races, nations. He and his followers often had no other authority than the bell-shaped curve itself for attributing homogeneity to the population being measured. His error was to use the measure of dispersion to decide whether he had got a unity to study, and thus he unleashed a rash of speculative statistics.
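Quetelet’s error can be put numerically. In this invented example (mine, not Hacking’s), a 50/50 mixture of two quite distinct sub-populations passes the crude check available to him — the pooled data are symmetric and bell-shaped about their mean — even though the population is anything but homogeneous; only the flattened, two-humped shape, visible in the fourth moment, betrays the mixture.

```python
# Quetelet's error, sketched with invented data: a 50/50 mixture of two
# distinct groups looks like one bell curve on a symmetry check, yet the
# population is not homogeneous at all.
import random
import statistics

random.seed(7)
# Two hypothetical sub-populations with different means but equal spread.
group_a = [random.gauss(160, 6) for _ in range(5000)]
group_b = [random.gauss(175, 6) for _ in range(5000)]
mixture = group_a + group_b

mean = statistics.fmean(mixture)
sd = statistics.stdev(mixture)

# Skewness: near zero -- the pooled data are symmetric about the mean,
# and on this evidence Quetelet would have pronounced them one population.
skew = sum((x - mean) ** 3 for x in mixture) / (len(mixture) * sd ** 3)

# Excess kurtosis: distinctly negative, betraying the flattened,
# two-humped shape that a genuinely homogeneous population would not show.
kurt = sum((x - mean) ** 4 for x in mixture) / (len(mixture) * sd ** 4) - 3
```

The moral is the one Hacking draws: the bell shape of the summary is no guarantor that there is a unity to be measured.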
From this account Quetelet would seem to be the landmark figure in the history of social statistics who opened the path to comparison of abstract quantities such as IQ, heredity and styles of reasoning. Whereas Quetelet depended on the clustering of features around the mean, Galton was more interested in explaining rare events in heredity, and postulated that physical traits were carried in an element that could be traced from one generation to the next. To find his invented hereditary feature he focused on the tails of the bell-shaped Normal curve. From this work developed the techniques of regression and correlation, and the law of deviation.
The exposition, generally so clear, becomes opaque as Hacking struggles to explain the nature of the great leap from measuring quantities that ‘really existed in nature’ to measuring all sorts of other entities. In these pages (107-109) we are constantly tripping over ‘real quantities’, ‘quantities that really exist in nature’, ‘objective properties of a population’, ‘real physical unknowns’, as distinct from something else, such as national longevity. The language makes it unclear whether Quetelet was wrong to use the measuring device as sole guarantor of invented social facts, or wrong to open the measurement game to invented social facts. The implied distaste for the latter, probably not intended, is justified when it came to inventing a Jewish threat to German national purity by reporting massive Jewish immigration into Germany from Russia without noting the simultaneous massive emigration. The reader gets the feeling that but for statistics used to make up people there would have been no such invention, or that it would have been ineffectual for lack of statistical authority. But this would be special pleading: witches and vampires did not depend on statistics for their invention, nor miasma nor pox, nor the discovery of a national style of reasoning addicted to hypocrisy.
Hacking is also very scathing about the misuse of the idea of the norm. Although he is fair enough in recognising that the association between the mean and the good goes back to Aristotle, he relies on his nose for smelling out improper use of statistics.
Normal and Pathological
Having a joke about the public monuments is one thing, defacing them is quite another. Durkheim’s image in Western sociology has already been rudely scrawled upon, especially in deriding his use of the concept of normal. One might have expected a full treatment from Ian Hacking to have corrected the mistaken reading. Hacking tries to locate Durkheim in the Western class of styles of reasoning, as a conservative Utopian who could not evade being immersed in the atomic individualism of the age. If the West is where Durkheim belongs, styles of reasoning can be anything and everything.
Yes, Durkheim was French, born in Alsace of a Jewish family, educated to be a rabbi. In spite of his later education, his thought places him incontestably with the holistic, conservative tradition associated by Hacking with Eastern statistics. Here is a thinker who deeply believed in laws of society. Moreover, he believed in styles of reasoning, and styles of categorisation. He actually compared two styles of reasoning, one holistic and conservative, the other empirical. Neither was normal; neither a deviation from a norm constituted by the other. He did not use statistics to find his types, but worked them out intellectually first. His initial inspiration came from his conviction that the typical style of reasoning of the English empiricists was wrong. His whole work was a consistent attack on the Western style of reasoning about society, committed to an atomistic, individualistic conception of the person.
Hacking quotes several sociologists who, like himself, pay selective attention to the book on suicide and its antecedents, but consider the work on classification and collective representations as belonging to a different corpus, and take licence to ignore it. He gives a laboured account of how Durkheim is supposed to have changed his mind radically in the one year between his first and second book, whereas the themes in the second had been clearly presented in the first. Durkheim’s central project was to explain solidarity: he postulated two kinds, one based entirely on shared collective representations (his term for styles of reasoning), and the other based on the division of labour. In disagreeing with the English free-market sociologists, he pointed out the extreme fragility of solidarity in a society based entirely on economic exchange, anticipating some of Schumpeter’s arguments on the internal weakness of capitalism.
No serious reading of Durkheim can give the impression that he thought the division of labour was normal and therefore good, even though he did unequivocally say that modern society would disintegrate without the division of labour. For explaining his theory of the social bond Durkheim used the statistics of suicide to demonstrate a point about how the public system of classification may hold the mind of the individual in its grip and how, with increasing division of labour in industrial countries, the grip of shared symbols is loosened. Durkheim failed to get the British empiricists to accept that explanations of individual cognition and behaviour would need to seek causes at the level of social interaction. The absurdities of individualistic analyses of risk perception in contemporary psychology demonstrate his failure to have his theory of styles of reasoning taken seriously. There is in the West a strong bias against mixing sociological grand theory with empiricism. Durkheim attacked ‘Queteletisme’, but here his work is paired with Quetelet’s failures. Analysing cultural bias, or styles of reasoning, calls for fairness, even for charity. Alas, after promising a voyage around different styles, Hacking goes home and invites us to be his guests. But then, he has told us frankly that styles of reasoning will be the object of his next research.
In default of other explanation Hacking implies that the differences between the Eastern and the Western styles of reasoning derive from the way the two civilisations have used law in politics as a metaphor for law in science. What is the guarantee that we have located the relevant metaphors that explain the bias of Eastern and Western statistics? Anthropologists influenced by Durkheim’s theory of the sacred would expect the top-down habits of the centralised paternalistic Prussian system to produce different kinds of figures from those analysed in the bottom-up practice of local lobbyists, Friendly Societies and actuaries, and to scrutinise them for different discrepancies. It is a matter of praxis, not idealism. A country that is organising for political control by popular representation will place its auditors at different key points in relation to a centralised bureaucracy. Durkheim taught that collective representations depend on what is held to be sacred, as shown by the pattern of accountability. Curiosity tends to flow where it is paid to go. It sounds obvious, but it changes the kinds of questions that can be addressed to styles of thought.
A long and absorbing chapter on Peirce winds up both major themes of the book. The argument has always been edifying, now it becomes mystical. The conflict between determinism and indeterminism ends in a moving reconciliation. All is chance, but because chance reveals approximate laws it is tamed. The possibility of sound induction is saved by explaining an apparent tautology. A style of reasoning cannot be judged right or wrong. Why did we resist the idea that truth and scientific method are linked by circular definition? It is only circular in the sense that the process of finding the facts is the same process as judging their correctness. Science is the product of the mind, and nature as we know it is the product of science. What else can it be? Peirce rested the claims of science on faith, hope and charity. Amen, and thanks for a splendid tour de force.