This book is about its subtitle: ‘A History of Explanations in Psychology and Physics’. To bring that history up to date, one should point out that this year’s Nobel Prize in Medicine went to three men honoured for their contribution to our knowledge of the brain: Roger Sperry from Cal Tech and David Hubel and Torsten Wiesel from Harvard. Their discoveries are stunning, counter-intuitive and of no immediate practical consequence. They are therefore largely unknown outside their fraternity. A further reason for their obscurity is that the hard facts they have skilfully revealed appear to run in exactly the opposite direction to the requirements of those who write books about the mind. They have studied the sequence of events which links the eye to the brain. They have tapped into the mechanisms by which the light from objects in the world is related to sensations and perceptions and behaviour.
Sperry began to look at the development of this apparatus in a school dominated by Paul Weiss and a romantic view of nerve cells. It had been thought that nerve cells gently argued out their final destiny in the manner of a Vienna café klatsch. A Yankee from Connecticut, Sperry repeated a very simple experiment which had been done by Stone at Yale. He removed eyes from amphibia, rotated them through 180 degrees and replaced them. All holistic, co-operative, interactionist predictions would insist that these rotated eyes would adjust to their new environment. The fact is that the animals see upside down for the rest of their lives. To prove his point, Sperry and Miner took a patch of skin from tadpoles and replaced it so that the white belly skin grew on the back. When these tadpoles metamorphosed into frogs, Sperry touched the skin on the back which months before had been on the belly, and the frogs scratched their bellies. Sperry and his collaborators extended these experiments to show in fine detail that nerve cells acquire, at a very early stage in the embryo, a specificity which determines their ability to connect to other parts of the brain, connections which in turn permanently influence overall behaviour in response to sensory stimuli. This conclusion is so rigid, and the mechanism which would bring it about so difficult to unravel, that the results remain under steady attack and yet have so far withstood the onslaught.
Hubel and Wiesel began to study the sequence of physiological events which follows the reception of light patterns in the retina, building on the initial work of their teacher Kuffler, who had shown how retinal cells react to light and send messages to the brain. Kuffler, born in Hungary, trained in Vienna and working mainly in the USA, was from 1940 until his death in 1980 the most innovative of brain researchers. They were also influenced by a seminal paper, ‘What the frog’s eye tells the frog’s brain’, by Lettvin, Maturana, McCulloch and Pitts from MIT, who had shown that the frog’s eye was extracting features from its visual world and not simply reporting on a mosaic of black or white dots. Hubel and Wiesel showed that cats and monkeys had an incredibly detailed apparatus in the visual cortex to extract features from the visual world. These features – lines, lengths, orientations – were extracted and separated in discrete, definable, exactly repeatable areas of cerebral cortex. In monkeys, where the two eyes view the same scene, the details of the one world seen by the two separate eyes are brought together again onto the same cells in the cortex. Hubel and Wiesel looked at the development of these connections in the new-born animal and found that the pathways were laid down before the eyes were opened, without reference to the world – as Sperry had shown in frogs. However, in mammals there was a short period of some weeks in which the connections were checked against the real world. If some disaster occurred, such as the closure of one eye, or a squint in one eye so that the expected convergence of the two eyes was incorrect, then one or the other message was rejected, after which the pattern settled down to its rigid, specific and fractional function.
It will be seen that these horrible facts are absolutely fundamentally unacceptable. They may win their authors a Nobel Prize, but those who discuss the mind know that something is seriously wrong. Sperry reappears on the scene to make matters worse. In a series of experiments, he examined the apparent mosaic representation of the world in a point-by-point way in the cortex. He knew that there were interconnections between the points and that these perhaps represented the basis of the predicted integration. Sperry simply diced the cortex by deep cuts and found no disturbance of form perception. He then proceeded to an investigation for which he is most famous, but which was a logical continuation of the conclusions of his previous experiments.
The most obvious connection in the brain is the corpus callosum, which connects the two cerebral hemispheres. There are also less obvious interconnections between the two sides. Sperry and Myers cut all the connections between the two sides of the forebrain and found that they had created animals which, in a crude sense, were two animals ignorant of each other’s detailed knowledge and even in conflict about meaning. Sperry, with Gazzaniga, studied humans who had had part of this cross-connection cut by neurosurgeons to control the spread of epilepsy, and found that they, too, showed signs of two independent brains in a single body, with ‘minds’ in conflict. Geschwind, now Professor of Neurology at Harvard, showed that this fractionation occurred quite often when parts of the brain were disconnected by disease.
All educated readers will know that these scientific discoveries of the workings of the brain must be nonsensical. They show a fractionation of function with a separate location for separated abilities. I have called this process ‘the new phrenology’. In the 19th century, different human qualities – love, hate, managerial competence – were assigned to different parts of the brain. These developed in different degrees in individuals and produced differential enlargements of parts of the brain which were mirrored in bumps on the skull. Now, a hundred and fifty years later, these damned scientists with their Nobel Prizes are telling us that the brain is specialised in a far more detailed way than was ever imagined.
Could there be a way out of this unpleasant dilemma, in which one set of facts found by scientists seem to conflict with another set of facts which seem totally apparent to humanists or other civilised people? There is a minor gleam of light. As human beings thinking about ourselves and looking at others, we seem to feel and behave in some integrated way with respect to the world around us, and yet these undoubted scientific discoveries seem to assure us that we are receiving the nature of the world in an isolated fractional fashion. Do we, in fact, sense and perceive the world as an independent series of messages pinned up on the noticeboard: ‘Injury to right big toe, line at 45° NNE 10 metres; injury to left foot, line at 160° SWS 10 metres ...’ – which means ‘Christ is crucified.’ That seems heresy in every sense of the word, and yet we must face the conflict. The word ‘integration’ sheds the gleam of light. Those of us who were not condemned to the far depths of the other culture learnt about differential calculus at school. Here one could learn how to analyse the trajectory of a point moving through space, a line. There were two ways to do this. The hard way was to analyse continuous uninterrupted movement. The easy way was to break the movement up into a series of steps. Then you made the steps smaller and smaller and approached the continuous. When computers were developed, there was a battle between analogue and digital computers. The digital process has won because it is easier to analyse admittedly continuous processes as though they were made up of a series of discrete little steps. The Columbia space shuttle displayed on the world map of Mission Control centre at Houston has its position changed every 20 seconds, even though the facts are that its position changes continuously. For NASA, 20-second digital sampling is adequate for its purpose. 
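The calculus point can be made concrete with a few lines of code – a sketch of mine, not anything from the book, with an arbitrary choice of curve. A point moves continuously round a quarter circle; we approximate its path by a chain of straight steps, and the sum of the steps approaches the continuous answer as the steps shrink:

```python
import math

def arc_length(step: float) -> float:
    """Approximate the length of a quarter circle of radius 1
    by breaking the continuous motion into straight-line steps."""
    total = 0.0
    t = 0.0
    while t < math.pi / 2:
        t_next = min(t + step, math.pi / 2)
        # positions of the moving point at the two sample instants
        x0, y0 = math.cos(t), math.sin(t)
        x1, y1 = math.cos(t_next), math.sin(t_next)
        # add the straight-line distance between the samples
        total += math.hypot(x1 - x0, y1 - y0)
        t = t_next
    return total

# The true, continuous answer is pi/2, about 1.5708.
for step in (0.5, 0.1, 0.01):
    print(step, arc_length(step))
```

Halving the step size roughly quarters the error: this is the sense in which the discrete ‘approaches the continuous’, and the reason a 20-second digital sample can be perfectly adequate for NASA’s purpose.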
It could be that we can gain insight as to how the brain has solved its problems by examining how humans have solved theirs. This is the first point of Richard Gregory’s book.
The book begins with a section called ‘Forging science from myth’. Gregory is a psychologist and his best-known contributions relate to optical illusions. In this book he shows the tremendous breadth and depth of his knowledge. This is not just showing off the extent to which he is a clever, widely-read, well-educated man. He is attempting to know his intellectual fathers. This is surely a noble pursuit, and necessary if one proposes to develop an attitude to the nature of mind. He is patient and reverent with these forebears even when they are talking rubbish. His first major thesis is that from the beginning of history our way of thinking about ourselves was influenced and even formed by the tools and machinery which were being invented. He extends this view up to modern times and proposes that the invention of computers brings with it a new way of thinking about thinking. This hypothesis will, of course, be condemned by those who see themselves and their precious culture as formed only by the purest of thoughts totally uninfluenced by the existence of mechanics and their grubby mechanisms. The self-appointed cardinal, Arthur Koestler, ever vigilant for the appearance of mechanistic heresy, exploded with vitriol all over the pages of the Sunday Times when reviewing this book. There are those who would take such an attack as a compliment and a recommendation that the book should be read. Koestler and his clan are being arbitrary and inconsistent in condemning technology as an educational force while accepting art as a praiseworthy route to self-awareness. The first sculptor who moulded clay into a man instead of a pot was exploring the nature of man. Those cave painters smearing pigment on their walls were finding out something about themselves and their visual world, as well as the road which led to IG Farbenindustrie. 
There have been those even more conservative than Koestler who have correctly identified such activities as a dangerous intrusion by man into the realm of the gods. Moses, the great reactionary, rightly forbade the making of graven images. Whitehead wrote: ‘It is a moot point whether the human hand created the human brain or the brain created the hand. Certainly the connection is intimate and reciprocal.’
Pursuing his theme, Gregory interweaves a history of philosophy and of science and of technology. For example, man became aware of the orderly regular procession of the stars through space. All continuous movements known to man at the time required some continuous pulling, pushing and shoving. It was therefore reasonable to require a great timekeeper in the sky who would move and regulate. Gregory writes: ‘It was the development of nearly frictionless clockwork in the 16th and 17th centuries that may have led to the concept of self-maintaining mechanisms having no frictional losses. This, at one stroke, removed the need or reason for a “Machine-minder” for the heavens.’ Such abrupt conclusions are repeated throughout the book once the history has been carefully laid out to achieve his ends. The ends themselves are certain to startle and to infuriate some people as they see their favourite philosophers or ideas placed in a new or surrealist context. Such sentences as ‘So Newton ended up thinking that God runs the show by permeating not matter but space’ are not calculated to endear Gregory to more formal historians of science.
Because of the huge scope of subjects interwoven, the book is difficult to read and judge. I was frequently completely out of my depth, having to trust the author to tow me on a life-raft through vortices of foreign seas. Even when one emerged into calmer waters, where familiarity allowed more thoughtful progress, one was likely to be shaken by sudden eruptions of the author’s wit. In a section on entropy, the following sentence appears: ‘Chickens can unscramble omelettes by eating them.’ Full stop, and on to the next subject. This sort of sentence is good fun if you get the point, but even if you do, it does not make for easy reading.
The book is divided into six parts. The first deals with the emergence of science from myth. Gregory goes on to discuss the ways in which the world links to the mind and how these links have to be links within the mind. Next, consciousness and mind-brain links are explored. Finally he must discuss the nature of self and of knowledge. A key theme and conclusion in all this is that perceptions should be considered as hypotheses. This is an important and subtle view which he has been developing for some time. It proposes that perceptions are essentially like predictive hypotheses in science. He thinks that he is not discussing an analogy but a deep identity. In other words, the process of forming a scientific hypothesis is the same as the process of forming a perception. To supply evidence for this, he returns to his favourite theme of perceptual illusions, which he proposes correspond to systematic errors occurring in science through loss of calibration or through misplaced assumptions or knowledge.
His scope is huge, and there are thin patches. For example, the role of society is mentioned – only to be dismissed in a quotation from Frankfort: ‘The ancients, like the modern savages, saw man always as part of society.’ The role of exploration, development and behaviour as a possible key element in perception is hardly mentioned. All those new machines, with their wonderful emergent properties, always seem to be made and manipulated by someone else. Gibson, who emphasised action as part of perception, is given a very rough ride in a book otherwise remarkable for the author’s gentle handling of others’ views. The author presents himself as one who receives images and mentally manipulates them: he does not himself move back into the world as part of the validation or invention of hypotheses.
One must also ask how powerful are the conclusions and the new insights to be gained from artificial intelligence generated by computers. The author has the honesty and humility to begin his last chapter with a quotation from Boscoe Pertwee: ‘I used to be indecisive, but now I’m not so sure.’ Samuel Johnson warned people not to make jokes at their own expense because others will use them seriously. I am impressed and excited by the great sweep of development over the centuries so far as the understanding of mind in science is concerned: but thoroughly unimpressed by the outcome of the latest episode of this search, where machines have been set the task of thinking. I did not find in this book a way out of the paradox whereby the mind integrates so beautifully over space and time with an apparatus which has been shown by this year’s Nobel laureates to disintegrate the sensory world into spatially separate, specialised multiple compartments.