Turing’s Man: Western Culture in the Computer Age 
by J. David Bolter.
Duckworth, 264 pp., £12.95, October 1984, 0 7156 1917 9

A computer is a tool, working the intentions of its designer or user. It is no more malevolent than the village clock whose chimes wake us in the night, or the car whose failed brakes run us down. We invest it with personality because it is an instrument of the mind, rather than of the hand. It extends and mimics the very function that has always seemed to distinguish us biologically from other organisms – the capacity to reason. At times, it almost seems as if, inside the black box, there is one of us. Computers are humanoid, too, in their versatility. Almost any computer can be instructed to do almost any one of the enormous variety of different things that computers in general can do. There has been nothing to equal it since the abolition of slavery.

From the willing slave to the impassive robot is a small step for the imagination, but a big step for the art. Philosophers have insisted for centuries on the sovereignty of reason among the faculties of the mind. In 1936, Alan Turing proved that all formal reasoning could – at least in principle – be represented in terms of the operations of an extremely elementary device akin to a digital computer. That was just a piece of highbrow logic, for such devices did not then exist: but the way to ‘artificial intelligence’ – ‘AI’ – was theoretically open. By 1950, when the first primitive electronic computers were already ticking and buzzing, Turing had turned his old argument around, and asked by what practical test an immensely elaborated logic-chopping machine at the other end of a telex line could be distinguished from a human reasoner.

Bolter is somewhat scornful of the AI movement. He sees it as yet another instance of the age-old project of ‘making a human being by other than the ordinary reproductive means’, by which man seeks to ‘raise himself above the status that nature seems to have assigned him’. The Greeks, for whom the essence of being was in the outward form, dreamt up the fable of the sculptor, Pygmalion. Cartesian man, obsessed with mechanism, would fashion for himself a clockwork automaton. Nowadays it is the disembodied mens sana we would ape, torn from its rightful place in corpore sano. The ostensible goal of AI is clearly unattainable in a practical sense. Whether it even produces useful applications along the way is beside the point. In Bolter’s opinion, the serious consequences of the AI movement are not technological: ‘By promising (or threatening) to replace man, the computer is giving us a new definition of man, as an “information processor” and of nature as “information to be processed”. I call those who accept this view of man and nature Turing’s men.’

David Bolter is a Classical scholar, although he also has a degree in computer science. He writes engagingly, encompassing the sweep of history right back to the Greeks:

Computer technology is a curious combination of ancient and Western European technical qualities. Developing through modern science and engineering, it nonetheless encourages its users to think in some ways in ancient technical terms. Turing’s man has in fact inherited traits from both the ancient and Western European characters, and the very combination of these traits makes him different from either. Those of us who belong to the last generation of the Western European mentality, who still live by the rapidly fading ideals of the previous era, must reconcile ourselves to the fact that electronic man does not in all ways share our view of self and world.

The majority of readers of the LRB will no doubt sigh regretfully in sympathy, but if I read Bolter correctly, it is not only the humanist who is thus challenged. ‘The issue is not whether the computer can be made to think like a human, but whether humans can and will take on the qualities of digital computers.’ There is no reason not to replace ‘human’ by ‘scientist’ in this statement.

Most people believe that computers do already ‘think like scientists’ and that scientists are able to ‘think like computers’. Scientific pundits have a touching faith in the efficacy of early computer training as a basic education in scientific method. There is no doubt about the scientific origins of electronic computing. The hardware of all electronic technology came originally from pure science, guided into foreseeable applications. The systems and the software evolved to deal urgently with the very big numerical problems of science-based weaponry in the Second World War. The computer did not create the scientists’ hunger for gargantuan calculations, and it continually fails to satisfy that growing appetite. The most advanced computers are still those devoted to scientific problems.

The scientificity of a computer is popularly associated with the logical precision of the instructions that have to be given to it. It is that most infuriating of creatures, the perfectly literal servant, who fails to understand what one ‘really’ had in mind when one instructed it to clear all the dishes off the table. This perfect formalism is as intrinsic to its workings as the precision of the gears of a mechanical clock, or the rigid drilling of an 18th-century army. Every computation has the detailed transparency and overall opacity of algebra. Algebra is effective because each of its manipulative steps is perfectly clear and absolutely compelling. The logical validity of every step along the way makes the result of a long and complex calculation equally compelling. But that result cannot be immediately ‘obvious’, for otherwise we should not have had to rely on symbolic or electronic techniques to work through the argument. In so far as computing is strictly logical, and formal logic is mathematical, and mathematics is a science, then all computing is, indeed, ‘scientific’.

There is also a vast amount of science in the marvellous combination of physics, chemistry and engineering that makes up the chips and circuitry. But this ‘scientificity’ of the inner workings of the black box need be of no more concern to the user than the ‘scientificity’ that the physiologist can discern in our own heart and lungs. These organs, too, obey a natural logic, perfectly consistent with their material construction, but our only concern is that they should perform their functions unattended.

At the other extreme, considered as a component in a larger mechanical, industrial, commercial or military system, a computer is no more ‘scientific’ than the clerks and messengers who used to ‘process information’, ‘collect and collate data’ and ‘transmit decisions’. If it is ‘friendly’ enough to us, we see and act through it, as if it were an instrument of our will. Tapping away at this word processor, where every letter or paragraph can be rubbed out and corrected, is more like writing with a pencil than using a typewriter. Give Turing’s men another twenty years, and they will invent a speech-recognition device to revive the oral spontaneity of the Greek poet.

Simply operating a computer is not particularly ‘scientific’: but writing programs for it might be. This is a highly disciplined craft, in which there can be no breaking of rules, no transgressing of boundaries of time and memory space. If Euclid, algebra, and Latin grammar were good exercises for the intellect, then programming is the ideal training for the convergent scientific mind. Was that also characteristic of Greek pottery or temple architecture, as Bolter suggests?

However, this is to see programming solely as technique, as an entirely self-contained glass-bead game. Weave and embroider the electronic ‘messages’ as you will, they convey nothing unless given meaning by the programmer. It is the nature of the inputs and outputs that we are concerned with, and especially the language in which they are communicated. Bolter traces our attitudes in and towards language, and invokes the magic to be found in the diversity and uncertainty of reference of a natural language. As Plato appreciated, poetry is an oral medium, cooled by writing; Aristotle’s logical analysis froze the meanings into forms. A modern computer language such as FORTRAN or COBOL seems to realise Leibniz’s dream of an artificial language in which all ambiguity has been defined out and all relationships are as precise as they are in algebra itself. That project failed, because it could only be used between humans, about topics that humans took an interest in: now, with computers to commune with, just such languages have had to be devised. Is that how Turing’s scientist must talk and think, totally impervious to poetry, puns or passion?

Formal computer languages certainly affect our attitude to natural language. Chomsky would have every sentence parsed through the computer’s logical processor, and transformed into its deeper structure. Linguistics, the epitome of a humanistic science, is reduced to its scientistic skeleton. Bolter is right to rejoice at the failure of the project for artificially translating natural languages into one another, which exposed the nakedness of Chomsky’s pretensions. In reality, that project has to be turned on its head: the programmer ‘translates the language of the outside world into computer language, and then the computed results back into a language that the world can understand and use’. He or she is in the position of the scientific theorist, trying to bridge two domains that have to mirror one another in some important aspects. One is the domain of theory – a closed, formal, precisely charted domain of abstract mappings and models. The other is the open, undifferentiated continuum of the phenomenological life world, as we observe it and live in it. Like successful science, ‘successful programming is a movement between two modes of thought, modes so different that we cannot yet determine whether the most compelling human problems can ever be expressed in the formal language of computers.’

Turing’s scientist would generate a simulacrum of nature within the abstract domain of his or her computer and set it running to see how things turn out. The dynamism of computer action is fascinating – but is it true to life? It could not be. No mapping could represent reality in all its detail and extent, except reality itself. The life world is continuously variable, real, infinite and totally connected. A digital computer program lacks all these qualities. It is ‘discrete, conventional, finite, isolated’.

Bolter’s objections on this point apply equally to all abstract schemes – that is, to all scientific theories. There has had to be a selection of qualities and aspects to represent and model. Boundaries have to be drawn, approximations sharpened into exact formulae, and statistical associations ruled to be rigorous laws. In that respect, a computer model of an aspect of nature is not different in principle from any general hypothetical proposition, to be manipulated and tested for its implications. The scientist who sets up a computer simulation to explore the implications of a theory is acting within the same tradition as Newton, who had only his new mathematical calculus and long arithmetic to do the same. Turing’s scientist may, of course, be tempted to overelaborate his model, but that has always been the weakness of the zealous theoretician. More significantly, he is bound to set up models that are easily computable, and to represent nature in terms of such models. The ‘information processing’ aspect of a human being is now emphasised above the ‘mechanical’ aspect, because our computers are more powerful than our clocks. Terms of art from computer technology appear naturally in scientific discourse, naively as fashionable jargon, but more profoundly because some of these terms stand for novel themata out of which new scientific theories might be made. There is no reason why we should not explore the imaginative realm thus opened up.

Nevertheless, Bolter is right to say that all computer simulation is essentially superficial. This metaphor of lack of depth relates to hierarchies of pattern and meaning. Consider one of the successes of computational effort – weather prediction. Immense quantities of elementary data – temperature, pressure, wind velocity, measured at a network of points around the globe – are assembled and made to interact with one another according to their physical nature. The simulacrum lives its own life at the rate of, say, six hours per half-hour of computing (I am reminded of a bleary-eyed Fred Hoyle complaining of loss of sleep whilst running the life-history of a star on EDSAC II at the rate of a billion years per week). We soon get a remarkably good forecast of the weather map as it will be in a couple of days’ time. But look at that map, and see that all that has happened is that a ‘depression’ or a ‘ridge of high pressure’ or a ‘front’ has moved in from the Atlantic and across the country. Could the computer have told us just that? Perhaps – but only because humans, with their capacity to recognise such patterns as distinct entities, would have told it to do so. Computer thought knows only its own prescribed level of generality: it has no insight into larger schemes.

The ultimate challenge of AI is to construct a simulacrum of man. But even if such a golem were feasible, it would surely lack one of the primary characteristics of all biological organisms: individuality. ‘Memory,’ Bolter points out, ‘might be the key to artificial intelligence,’ for ‘we live in the world we remember.’ It is quite possible, as Bolter indicates, that Turing’s man, in our present conception, is facing in exactly the wrong direction. The more computers come to think like scientists, the less scientists need to think like computers. The practicable project is to generate more and more powerful versions of ‘synthetic intelligence’, where the computer is used as a tool, directed and manipulated by a human intellect, strengthening those capacities for encyclopedic knowledge, instant recall, unchallengeable logic and tireless assiduity which such instruments possess. The generation we are training in our schools and amusement arcades have learnt to use these tools as extensions of their brains, just as they ride their bicycles as extensions of their muscles and nerves. They may turn out to be freer than any previous generation to exercise their individuality and imagination.
