I am not a computer

Owen Flanagan

Shadows of the Mind: A Search for the Missing Science of Consciousness 
by Roger Penrose.
Vintage, 457 pp., £17.99, September 1995, 0 09 958211 2

Years ago, a colleague of limited intellectual powers accosted me with the charge that I had been telling students that the ‘mind was meat’. This was my colleague’s way of putting things. I then fell for the question which the charge led up to: ‘So you’re a materialist?’ ‘Yes,’ I answered. To which my normally witless interlocutor responded: ‘Pray tell, what is the nature of the material world?’

Witless was right. We naturalists don’t remotely understand what makes up the world, let alone the world itself. Our materialism is promissory. If the world can be explained, it can be done without introducing non-natural or supernatural properties, phenomena or processes. But what principles demarcate natural properties, phenomena and processes from those too weird to be natural is not something that is much discussed, let alone agreed on. Whoever calls for a ‘deeper physicalism’, whether it be Witless or a leading mathematical physicist like Roger Penrose, has got to be right. Deep physicalists look to biology, chemistry and physics for depth. They are sceptics – and rightly – about the resources of both a priori philosophy of mind and computer science to explain nature and mind.

Computationalism continues to grip the academic and popular imaginations, however. A recent issue of the New York Times led its science section with an article called ‘The Brain Manages Happiness and Sadness in Different Centres.’ Immediately beneath the first paragraphs of text were pictures of PET scans showing the different areas of the brain that are active when people are happy or sad. The pictures were introduced in large, boldface type with this headline: ‘How the Brain Computes Tears and Laughter.’

This might seem innocuous, but if so, it can only be because talk of the brain computing things is the dominant way of speaking about the mind. But in this case, at least, the pictures decidedly did not show the brain computing anything, let alone tears and laughter. What PET scans show are blood-flow patterns. So what the pictures showed were areas where blood flow, and thus oxygen, and thus, we presume, activity, was increased or decreased. The maps were correlated with phenomenological reports of the relevant feelings, and the plausible inference was made that such and such areas of the brain are importantly implicated in the relevant emotions. Nothing was revealed, however, about how anything was being computed.

Why do we speak this way? The simple answer is because the computer is the dominant metaphor for mind. For Penrose it is a metaphor which has overstepped its bounds. We forget that it is a metaphor, and take the claim literally, so that the mind/brain is a computer. Colin McGinn calls this view ‘pan-computationalism’. In its most extreme version the universe becomes a computer, whose parts run their own particular programs or sub-routines.

Penrose defines computation as the action of a Turing machine, a perfectly idealised computer, and takes ‘algorithm’ to be completely synonymous with ‘computation’. The appeal of ‘pan-computationalism’ comes both from the success of computer modelling across disciplines and from the construction of machines which display ‘intelligence’. But though we can model chemical reactions, photosynthesis, genetic transmission, the weather and so on computationally, it seems absurd to say that any of these things is a computer, or a computational child of the mother of all computers, the universe.

Penrose has, I think, just the right targets here: overconfidence that our current explanatory apparatus can give an account of reality – hence the need for a deeper physicalism; and a metaphorical model that is taken literally and thereby obstructs progress in the science of the mind. Model, metaphor and simile are relations of likeness, not of identity. The mind may function like a computer, but so do I function like other land-based mammals. But I am not a computer, nor am I a chimp, an orang-utan, or a deer.

Penrose rejects both the idea that the brain is a computer and the idea that it can be adequately modelled computationally. Why? Because certain things we humans do we do non-computationally. Since the brain is not a computer, its causal powers cannot be duplicated by one. So much for the program of strong Artificial Intelligence. And since the brain cannot be adequately modelled computationally, so much for the program of weak Artificial Intelligence. Computers will never pass the Turing test because we cannot even in theory build one that will simulate or mimic the input-output relations normally mediated by the human brain.

What is it that humans can do that computers can’t, and why think that ‘some, at least, of conscious activity, must be non-computational’ (his italics)? Penrose’s answer is simple and direct. Mathematicians can ‘see’ the truth of certain mathematical statements that we know, thanks to Gödel and Turing, cannot be proven by procedures known to be sound. This shows that our awareness of these truths is ‘demonstrably noncomputational’.

Penrose here responds to twenty earlier criticisms of his initial argument. But I still don’t see that his arguments succeed. The reasons are pretty straightforward. Take first the limitative results of Turing, establishing that we cannot guarantee that for every problem within arithmetic we can ascertain whether the program dedicated to its solution will come to a halt and tell us what the solution is, or even whether the problem has a solution. Imagine a primitive program designed to do whatever it is told – call it ‘Dopey’. Imagine we tell Dopey to find an odd number that is the sum of two even numbers. Now suppose that mathematicians just ‘see’ that this problem is a non-starter – there simply is no odd number that is the sum of two even numbers. Dopey, however, keeps searching for all eternity. What does this show? It establishes that when we ‘see’ there is no such number, we are not simply using whatever algorithm Dopey uses. Does this show we’re not using an algorithm? No. Penrose allows that connectionist systems, which learn as they go, are algorithmic. So imagine that we are not Dopey but a system like Dopey that will do what it is told, but only up to a point. Once we have tried the best known solution strategy, failed to find a solution and discerned a pattern of failure, we stabilise into a state of ‘high confidence’ that the problem is a non-starter. The most this example shows is that the human who discerns the relevant truth is not computationally equivalent, as far as the solution to this problem goes, to Dopey.
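The contrast can be put in code. What follows is an illustrative sketch only: ‘Dopey’ is the essay’s invented program, and the cut-off, the helper function and their names are my additions, not Penrose’s (a genuinely Dopey search would have no cut-off and would run for ever).

```python
def sum_of_two_evens(n):
    """Return True if n can be written as even + even."""
    return any((n - a) >= 0 and (n - a) % 2 == 0
               for a in range(0, n + 1, 2))

def dopey(limit=None):
    """Blind search for an odd number that is the sum of two even numbers.
    With limit=None this loops for ever; a cut-off is added only so the
    sketch can actually be run."""
    n = 1
    while limit is None or n <= limit:
        if n % 2 == 1 and sum_of_two_evens(n):
            return n          # never reached: even + even is always even
        n += 1
    return None               # ran out of patience; passes no verdict

def not_so_dopey(tries=1000):
    """Dopey plus a stopping rule: after an unbroken pattern of failure it
    stabilises into 'high confidence' that the problem is a non-starter."""
    for n in range(1, tries, 2):      # odd candidates only
        if sum_of_two_evens(n):
            return ('found', n)
    return ('high confidence: no such number exists', None)
```

The point, in code: the system that halts with ‘high confidence’ is not running Dopey’s algorithm, but it is still running an algorithm.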

The same point applies in Gödel’s case. Gödel tells us that for any formal system as complex as simple (Peano) arithmetic (P), if that system is consistent, there will be a true sentence that expresses P’s consistency which the system cannot prove. Suppose world-class human mathematicians agree that arithmetic is consistent; even suppose they ‘know’ this and are right. Does this show that ‘human mathematicians are not using a knowably sound algorithm in order to ascertain mathematical truth’? I think not. What it shows is that if the human is using a formal system to ‘see’ that P is consistent he is not using P. But Gödel’s theorem, as developed by Gentzen, shows that the consistency of P can be proved using axioms different from P; call them P1. So one possibility is that the mathematical insight displayed in the first case is due to the fact that we are not operating with a Dopey equivalent, and in the second case that we are not operating with P, but that in both cases we are operating with some formal system or other, with some Turing-computable apparatus.

Penrose opens himself up to this sort of response by his own expansive notion of computability. As long as you can get a computer to do it, it is computable. But surely you can program a computer to make guesses about a pattern, including the consistency of its own productions, based on assessment of its accrued knowledge base. Correct guessing in such cases would be computable because a computer does it, but it would not be computable in the sense that what the computer or person asserts with confidence, sees, or knows, is deductively derived from the axioms and transformation rules. Penrose never eliminates either the possibility that what mathematicians see, they see because they are in fact using some formal system with which they unconsciously prove what they see, or the possibility that they run a program whose heuristic principles cause them to assert or see things a certain way. Computational systems include ‘top-down’ systems that come with a pre-specified program and a store of knowledge, and ‘bottom-up’ systems like the artificial neural networks that learn from experience.

Penrose has another argumentative ploy up his sleeve. Mathematicians don’t think they are following any type of unconscious algorithms when they discern mathematical truth. Either they consciously use computable principles or they just consciously see the truth of those statements that are (or might be) noncomputable in some formal system or other. Penrose admits that mathematicians might be wrong in thinking that the insight they display is noncomputable insight pure and simple, but I don’t see that he takes this possibility seriously enough. For Penrose, mathematical reality exists independently of us, and mathematicians are sometimes able to detect it by noncomputable insight or intuition. Penrose’s Platonism runs deep, too deep.

His ultimate target is consciousness. What mathematicians consciously understand is key to the entire argument. It is a truism of contemporary mind science, however, that only certain mental states are conscious, and that more often than not what we are conscious of is product, not process. Indeed, almost all progress in the science of the mind to date has resulted from abandoning the assumption that people are capable of giving reliable reports on, let alone knowing, how the mind works.

Penrose lacks the warrant to say that humans demonstrably use noncomputational procedures in ascertaining mathematical truth, and without this conclusion in place the rest of his project, which involves speculation about brain structures that might support noncomputable thought, is unmotivated. Penrose’s tactic is first to secure the conclusion that we humans use noncomputable techniques or skills, then to locate brain structures that could do the job. But we don’t need what we don’t need. And something not worth doing is not worth doing well.

Still Penrose persists. Where might we find noncomputable processes? Quantum physics is queer enough to have promise. The trouble is that Penrose thinks that quantum physics, as we now know it, is computable. So we will need some new noncomputable quantum phenomena. Where might we find these? Certainly not in any structure as large as a neuron. He speculates that quantum gravitational effects – quantum gravity is not known to exist anywhere in the universe – might exist in the very tiny structures called microtubules inside the skeletal structures that hold all cells together – the non-neuronal cells of single-cell paramecia as well as the neurons of mammalian brains.

The next move, and it is key, is to claim that there is direct evidence that consciousness is related to activity in the microtubules of the cytoskeleton of neurons. This evidence comes from the fact that all general anaesthetics, their different chemical structures notwithstanding, interrupt action in the microtubules and render patients unconscious. Noncomputable quantum gravity to one side, this is not a very good argument for attributing a significant causal role in generating consciousness to the microtubules. First, the argument is hardly direct. Remove a spark plug and my car won’t start. Spark plugs are causally relevant, even necessary for a car to start, but they are not sufficient; they are one among a multitude of things that need to be working properly for my car to start. Second, it is not at all clear that people under general anaesthesia are unconscious in the relevant sense. We say of the person clubbed over the head that he was rendered unconscious; and we say of people in deep sleep that they are unconscious; and we speak of people under general anaesthesia as unconscious. But saying these things doesn’t make them true.

The fellow knocked unconscious might claim to have seen stars and heard voices and had strange dreams. And it is true, but not widely known, that people are mentating in non-REM as well as in REM (rapid eye movement) sleep. Finally, I have attended conferences, including a splendid one in Tucson, Arizona last spring organised by Stuart Hameroff, Mr Microtubules himself, in which anaesthesiologists worried aloud about whether their patients are conscious while anaesthetised. Despite the fact that patients don’t seem to be experiencing pain, and despite the fact that under general anaesthesia, more so even than in normal sleep, memory is lousy, there are reports of dreamlike states, and even of memories of occurrences in operating rooms. The point is that if patients under general anaesthesia are unconscious in the sense that they can’t experience pain, this hardly establishes that they are not having experiences of other sorts, even if these are difficult to remember. Penrose admits that quantum gravitational effects in the microtubules are not sufficient when he acknowledges that nothing he says requires that we attribute consciousness to other animals: ‘there may well be a great deal, in addition to properly functioning cytoskeletons, that is needed to evoke a conscious state.’ Indeed, there must be more to consciousness than properly functioning cytoskeletons since zombie-like paramecia, too, have these.

In the end, Penrose’s deep Platonism leads to a physicalism that is, as I’ve said, too deep. In the excited rush to solve the problem of consciousness Penrose passes over vast and promising terrain. It is called ‘the brain’. The right sort of deep physicalism involves paying attention to the whole brain, not just to neural connectivity but to connectivity plus the exciting new discoveries being made each day about the wash of neurochemicals that are implicated in different states of mind and consciousness.

The main problem with the computational model of the mind is that it makes doing neuroscience seem less important than getting at the abstract programs that allegedly support various aspects of mentality. Ironically, Penrose’s excessive enthusiasm about quantum level phenomena lands him in the same place. It makes solving the whole mind/brain problem seem less important than getting at noncomputable quantum effects in tiny portions of brain tissue, effects that we have no overwhelming reason to think exist, and which, even if they do, shed no light whatsoever on why some systems are conscious and some not.
