Earlier this year, in a London hospital on a dark afternoon at the end of winter, a neurosurgeon asked me to spell ‘world’ backwards. Behind him, an image of my skull floated on a monitor. On one side of it, there was a milk-white gob of tumour. It looked about as big as a golf ball. He wanted me to spell the word backwards because I am left-handed, and because I have a tumour lodged in my right temporal lobe. ‘Are you right-handed or left-handed?’ is often the first question a neurologist will ask. From the answer, it is possible to get an idea of how someone’s brain is organised – in particular, which hemisphere is dominant in certain aspects of language processing. Brain function is cross-wired: if I wiggle my left foot, the instructions are issued from the right side of my brain. The same is true if I wink, shutting my left eye – what I then see through my open right eye is processed by the left side of my brain.
In most brains, language function is also lateralised. For around 95 per cent of right-handers and 60 per cent of left-handers, the neurological hardware required for language cognition is located in the left hemisphere, in the regions known as Wernicke’s area, at the rear of the left temporal lobe, and Broca’s area, a little further forward in the left frontal lobe. If either is damaged, the implications for language expression and cognition can be serious, and a degree of aphasia can result. In the most severe instances, words simply become meaningless. Those affected cannot read, write or talk coherently, and often compensate by inventing elaborate neologisms and making educated guesses based on partial memories from a time when words had meaning (saying ‘chair’ for ‘table’ or ‘spoon’ for ‘fork’).
If the right hemisphere is damaged, language cognition tends to be affected in a different way. The right hemisphere is responsible for decoding the non-literal aspects of language, making it possible for us to understand irony and humour, sarcasm and metaphor. Our ability to understand the context of a sentence and to pick up on tone and other non-verbal cues also depends on the right hemisphere. Here, puns click into place, punchlines become funny and the morals of a story are drawn. If this hemisphere is damaged, it is often still possible for those affected to understand formal language when correctly used – but meaning implied in tone, for instance, is lost. Slang, informal or emotional speech becomes incomprehensible. Even the deadest of metaphors becomes unfathomable: it is a nonsense to sit at the foot of a bed or to thread the eye of a needle.
In 1985, Oliver Sacks investigated the response of patients with left-hemisphere and right-hemisphere damage to a speech given by Ronald Reagan. Most of the aphasic patients burst out laughing. Their left hemispheres were damaged; for them, Reagan’s words collapsed in a meaningless heap. What made them laugh? Sacks says that as a way of compensating for their inability to understand words, aphasics are often hypersensitive to non-verbal cues: tone of voice, emphasis, facial expression, gesture and so on. Based on his non-verbal cues – on the way he delivered his speech – Reagan created an amusing impression. On the same ward there was a patient who, like me, had a tumour in her right temporal lobe. Emily D. had been an English teacher and a poet. She had perfect command of formal language, but could no longer grasp the meaning inherent in tone and other non-verbal signs. She didn’t laugh at the president, but she didn’t understand him either. ‘He is not cogent,’ she said. ‘He does not speak good prose. His word use is improper. Either he is brain-damaged, or he has something to conceal.’
I failed to spell ‘world’ backwards. I could read the neurosurgeon’s gestures and tone well enough. He’d cooed a reassuring ‘Good’ each time I completed a task. I had squeezed one of his hands, then the other; I had followed his finger from one side to the other with my eyes; I had correctly called out, eyes closed, the letters he traced ever so lightly on the backs of my hands. But when I sounded the letters ‘D, L, O, R, W’ he sat up slightly, and instead of the ‘Good’ I had been expecting there was silence. He turned back to his monitor to consult the scans of my brain and zoomed in and out a couple of times. The tumour grew and shrank. ‘How left-handed are you? Do you brush your teeth with your left hand?’ he asked. I do brush my teeth with my left hand. I also throw, chop, hammer, wipe down surfaces and hold pool cues with my left hand. I thought about it: ‘I can only wink with my left eye, and I kick with my left foot – and come to think of it, my left foot is slightly larger than my right and my left eye slightly stronger than my right eye.’
It is possible that I misspelled ‘world’ backwards because of the position of the tumour in relation to the areas of my brain that are responsible for language cognition (so-called ‘eloquent areas’). In that case, I would be in the small minority of people whose language function is located more or less entirely in the right hemisphere. But it is also possible that my mistake was less significant: I had been caught off guard by the spelling question, expecting to be asked to puff out my cheeks or wiggle my eyebrows. The tumour may or may not have found its way into functionally useful tissue. In order to decide how best to treat it, the doctors would have to find out why I’d got the spelling question wrong.
Until the 1960s, ‘Are you left-handed or right-handed?’ was the surest way to get some idea of which hemisphere was dominant for language. Now there are other, more sophisticated ways to find out. The neurosurgeon recommended that I have my brain mapped using Transcranial Magnetic Stimulation (TMS) and functional Magnetic Resonance Imaging (fMRI). He explained that if the tumour was located in or encroaching on eloquent areas of the brain, it would be safest to operate while I was awake. Once my skull had been sawed open in a neat rectangle above my right ear, I would be dredged up from the depths of a general anaesthetic for a chat. I would guide the surgeon’s knife by my answers to simple questions (in fact, neurosurgeons tend to use a small vacuum-like instrument to slurp up unwelcome tissue). If my speech began to falter as the surgeon approached certain areas, he or she would know to retreat.
Some weeks later I went to meet another neurosurgeon, who would map my brain using TMS. He’d given me a time and a place, along with careful directions: 9 a.m. in the waiting room at the end of a long, pale blue corridor on the left as you exit the lifts on the second floor. Since the spelling incident, I had felt a gulp of anxiety each time I scuffed or misplaced a word, worried that my neurological wiring was beginning to break down under the pressure of the tumour. Now I killed time in the waiting room spelling backwards the words I could see on magazine covers.
The neurosurgeon led me into a warren of quiet corridors. His suit swished lightly. He had a swagger and an easy, louche demeanour at odds with the sterile hallways and the early hour. He seemed to belong in a casino rather than a hospital. I picked up on the same steeliness I had noticed in other surgeons – as if a jaw were being constantly clenched. He ushered me into a side-room, and invited me to take a chair in the middle of the room. It looked like the kind in which teeth are drilled and pulled. Before long a second man entered the room. He introduced himself – he was a neuroscientist – and asked whether I was right-handed or left-handed. The pair began to prepare equipment and boot up computers, trading a kind of med-school banter specked with neurological jargon.
Besides speech-mapping, TMS is used for diagnostic purposes and in treating depression. The neuroscientist was proud of the machine. ‘There are only two in the country,’ he told me. ‘Unlike proper countries, like America and Germany.’ He made a few adjustments, and began a brief explanation of how TMS works. The surgeon had taken a seat, and was blowing softly on his coffee. The neuroscientist held a device which he explained was a magnetic coil that would be used to deliver a small electric charge to a particular area of my brain, temporarily stunning it (or ‘tickling’ it, as he put it). In effect, the magnetic charge triggers a seizure in a few cells of the brain and briefly knocks them out. ‘It is not pin-point precise,’ he said, ‘but directing charges at either side of the brain should be enough to give an idea of which hemisphere is dominant for language.’ If I found myself unable to perform a certain task as one area of the brain was ‘tickled’, chances were that he had hit the general area responsible for performing that task. I didn’t much like the sound of a wayward magnet setting off micro-seizures in my brain, or the zeal with which I suspected it would be applied.
The neuroscientist held the coil over the left side of my head and fired: my right hand shot an inch or two into the air. He explained that he was trying to work out the right strength of charge. He toyed with my right hand until it was twitching and trembling to his satisfaction. I watched the hand curiously: a familiar, dependable instrument, the physical extension of my own will, but now the creature of a tall man holding a magnet. To set up a control, I had first to perform a task free from magnetic interference. Then I repeated the task while my brain was tickled. On a screen in front of me crude but clear line-drawn images of objects appeared, in a fairly rapid series. I had to name each object: ‘Boat. House. Needle. Banana. Clock.’ The images sped up, and soon I was getting near tongue-twister speed. I named each image successfully and the two men seemed satisfied. ‘You’re the only person to get “artichoke” first time,’ the surgeon said.
Now the neuroscientist readied his magnet again. Images flashed past, and he guided the coil around the right-hand side of my skull. He and the surgeon called out the names of regions of the brain as the magnet targeted them. Often the charge set off tremors down one side of my face. If my cheek or jaw started to go, the word for the image came out garbled and unclear. Once or twice the word I wanted disappeared. I reached for it but came back empty-handed. I felt it there, a lexical phantom. I was aphasic for an instant: a glimpse, perhaps, of my future. Moments later, the magnetic charge was redirected and the next word would be clear enough: ‘Fork. Piano. Elephant.’ It was time to switch sides. I drank some water and rubbed the left-hand side of my face. The scientist aimed the magnet at the left hemisphere. I was nervous: I was hoping for a significant impact on my language function this time round. That would suggest the tumour was far from the eloquent areas of my brain, safely tucked away on the other side of the corpus callosum – the great nerve tract which connects the two cerebral hemispheres. Instead it was much the same. I named most of the images successfully, and lost one or two. This time, the right-hand side of my face twitched.
It turned out that the entire exercise had been filmed. The surgeon pointed to a small camera in the plastic casing of the screen, on which my own face now appeared, zoomed in on tight at an unflattering angle. I watched as I twitched and trembled. Each time I garbled a word, or delayed or failed to answer, the neuroscientist made a mark on his clipboard. Often it was unclear. ‘Was that delay? Or just because of the twitching? What do you think, Harry?’ The three of us sat there, flicking back and forth between the tests. It seemed that the TMS had revealed little, except that there were areas of language cognition in both hemispheres. The atmosphere in the room felt flat. The surgeon downloaded the files and the footage onto a USB stick, and told me to pass it on to my consultant surgeon. He didn’t seem to trust the NHS computer systems to transfer the data any more than he’d trusted them to make an appointment.
A few days later I was in another hospital. Outside an MRI room, a sign in capital letters warned me that an extremely powerful magnet was running behind the door 24 hours a day, 365 days a year. I confirmed to the radiologist that my boots didn’t have steel toe-caps. I went through the airport security routine: I took off my belt and watch and patted down my pockets for change, checking for stray metal and feeling vaguely guilty. What happens if a forgotten two pence piece finds its way into an MRI scanner?
In an fMRI scan, magnetic imaging is used to reveal which areas of the brain are active during a specific task. Neurons demand more oxygen in order to perform a task, and the increased bloodflow lights up in bright colours on the scans, pinpointing neurological action to within millimetres. I have read that fMRI can detect lies. To tell a convincing lie, you have to be on top of a lot of things at once. Is it believable or consistent with what my interlocutor already knows? Does it fit into a wider story? Does it entail further lies? In most people this results in cognitive overload (the neurological wiring of psychopaths makes them exceptions and formidable liars). It has been suggested that this overload shows up in fMRI: the prefrontal cortex – a kind of Machiavellian command centre concerned with conflict, risk, error detection and administrative control – often lights up when we lie. Involuntary movements can also suggest cognitive overload. I know to doubt someone who touches their nose a lot as they talk.
I lay down on a hard plastic bench and was loaded into the long, narrow gullet of the scanner. I was to spend the next hour performing simple linguistic tasks inside the machine. Given a verbal chore of some kind, the eloquent areas of my brain would demand supplies of oxygen and glucose, and the resulting increase in bloodflow would flash up on the scans, revealing their location. I noted the funereal undertones of the set-up: I lay ramrod-straight, hands folded neatly across my chest, and stared straight ahead. If this machine could tell when I was lying or depressed, what other well-buried information might it find out? It was cold in the scanner. Some of the radiologists had woolly hats on; I shivered in a long-sleeved T-shirt. Someone had given me a panic button. It felt round, tactile and tempting.
I had been given ear plugs to muffle the beeping and whirring and grinding of the machine, but they fell out immediately – their yellow ergonomic foam expanded uselessly on the bench beside my head. I twitched for the panic button. The scanner purred, breaking out now and then in a sustained bombardment of sound. The noise – a metal rainforest, full of clicks and bangs and whines – can reach 125 decibels, as loud as the front row at an Iron Maiden concert. It’s created as current is applied to the magnetic coil, causing it to expand and contract rapidly.
I lay cocooned in the antiseptic plastic for some time before the linguistic exercises began. A mirror of some kind was slotted into place and a screen appeared. I viewed it through a periscope-like contraption. It seemed to be in another room, and gave me a kind of vertigo to look at. A voice crackled over the intercom and explained the first task: the screen would flash up a letter for a number of seconds, and I had to think of words beginning with that letter. The first letter was ‘C’. Swatting away the swearwords that crowded to the front of my mind, I tried to imagine the exercise as a psychological assessment – a word association test that would prove deeply and embarrassingly revealing. I could see the woolly-hatted radiologists chuckling and shaking their heads, bent over computers and clipboards in the next room. I managed to send my thoughts in the direction of harmless ‘C’ words: cacti and caterpillars and cauliflowers formed a bucolic scene in my mind. It took an almost physical effort. I found it slightly easier if I composed a picture: a canary burst from a canopy of coniferous trees and, twittering to its companion, took off into the clouds. I worried that the struggle I found myself engaged in (another revealing kind of cognitive overload?) might sabotage the exercise. Eventually a string of less common and less swearword-friendly consonants flashed up. I was sweating in spite of the cold.
It ended after about a third of the alphabet had been cycled through. I was given a short break between exercises; it felt eerily quiet as the machine fell dormant. Soon a voice over the intercom explained the second exercise: an image of a noun would appear on the screen, and I had to think of a corresponding verb. I could foresee problems here too, but the images were straightforward and flashed past at speed (there is a limit to what you can reasonably and rapidly think to do with a football, a spade or a shoe). Half an hour or so later I was spat out of the scanner, frozen and ears ringing. It felt as if someone had ransacked the inside of my head and left the windows open. I climbed down from the bench and unsteadily crossed the room, disorientated and sea-legged. I put my belt and watch back on, and scooped up the handful of change.
I saw the fMRI scans the following week. I sat in the same chair in which I had misspelled ‘world’ backwards a month earlier, facing the neurosurgeon who had asked the question. As before, ghost-like images of my skull hovered on computer monitors – cheered up, this time, by spots of blue and red. I looked at them: here were the eloquent areas of my brain, from which all the words I have ever spoken or written emanate, and all the words I have ever heard or read are transformed into meaning. It didn’t look like much – as if a child had absentmindedly taken her crayons to the scans and quickly got bored. The colourful regions were meagre in size compared to the colourless hulk of the tumour, which bore down threateningly.
On the scans, the delicate blobs of blue and red appeared on both sides of my brain. I was relieved to see that the coloured areas clustered together on the left, and that the few on the right had given the tumour a wide berth. The neurosurgeon turned to me with a smile, and told me I would be asleep throughout as he removed the tumour – and that I would wake up able to tell the tale.