On 10 September last year, protons – tiny particles ordinarily found deep inside atoms – completed their first lap around the inside of the Large Hadron Collider, the new particle accelerator near Geneva. Revved up to enormous speeds by supercooled magnets, the protons raced around the LHC’s huge ring, 27 kilometres in circumference. They criss-crossed the French-Swiss border more than ten thousand times a second before smashing into each other, releasing primordial fireworks.
Huddled with colleagues around a laptop, we watched the LHC come online: a thrilling moment, but also, for many of us, a rueful one. Fifteen years earlier, construction on a similar machine, even grander than the LHC, had ground unceremoniously to a halt. It was known as the Superconducting Supercollider, or SSC. As an undergraduate, back in 1992, I worked as an intern for a few months with one of the huge teams designing instruments for the SSC. The accelerator was being built outside Dallas, in the small town of Waxahachie. (The town’s other main attraction: Southwestern Assemblies of God University.) In a research article I wrote at the time, I predicted some features of the fleeting, exotic interactions among subatomic particles that the SSC was designed to observe. The first draft began confidently, in the matter-of-fact scientific prose that young students quickly learn to imitate: ‘The high energies and luminosities available when the Superconducting Supercollider comes online have intensified interest in probing various extensions of the Standard Model.’ The eyes of a generation of physicists were focused on the SSC, and on the riches it promised to reveal.
By the time I submitted a revised draft of my article, however, the SSC’s political fortunes had changed dramatically. I dropped all reference to it, substituting something about imagined generations of accelerators, off in the indefinite future. Not long after that, in October 1993, Congress took its final vote to kill funding for the SSC. A few days earlier, a well-meaning young professor had called me into his office. He advised me to leave graduate school if the vote went the wrong way. I stayed, but he didn’t: a year or so later he jumped ship to Wall Street, along with many other students and colleagues. With that vote to kill the SSC, Congress cut annual funding for high-energy physics in the United States by half. Support for the field continued to erode, losing ground against inflation, for the rest of the decade.
Since that time, scientists, policymakers and historians have spilled a lot of ink over the causes of the SSC’s demise. Some point to cost overruns; others focus on deeply felt differences over how to distribute limited resources across the full range of scientific research. All agree that the end of the Cold War was a major factor. Twenty years earlier, in a different political climate, a physicist defending a proposal to build a particle accelerator in the Midwest had answered Congress’s pointed questions about high costs and pragmatic ends by arguing that the laboratory would not aid the nation’s defence – it would make the nation ‘worth defending’. That kind of rhetoric worked then, but only when it was propped up by concerted back-room lobbying. Scientists and policymakers laboured to make sure that generations of nuclear scientists remained well trained and at the ready, should the Cold War ever turn hot. That coalition had dissolved by the time the SSC’s number came up. Nobel laureates promised Congress that the SSC would unlock the secrets of the universe and contribute to an epic adventure of discovery, but their words fell flat. By the early 1990s, with no Soviet menace (real or imagined) to face, the blank-cheque era of American ‘big science’ had come to an end.
A year after the SSC was abandoned, the governing board of CERN, the European Organisation for Nuclear Research in Geneva, approved the plan to build the LHC. CERN realised they could achieve similar goals to the SSC’s, but much more cheaply. Most important, they decided to use a tunnel left over from an older experiment to save on the huge excavation costs. The choice was not ideal: using the older facility meant sticking with a colliding ring that was only one-third as long as the SSC’s would have been. The size of the ring has a direct bearing on the energies that the colliding particles can attain; these too would be only one-third as high in the LHC as they would have been in the SSC. But the CERN machine could nevertheless reach enormous energies at one-fifth or one-quarter of the cost of the SSC. And so we celebrated on 10 September 2008 – 14 years after the commitment had been made – when the LHC spun to life and sent its first batch of protons circling around and around and around.
But what is it for? Why spend billions of euros to smash subatomic particles together? The physicist Frank Wilczek (a colleague of mine at MIT, though we haven’t worked together directly) shared the 2004 Nobel Prize for his contribution to the reigning theory of high-energy physics; his new book gives non-specialist readers a tour of the conceptual landscape. In the opening chapters, he introduces what physicists call the ‘Standard Model’, which describes the forces and interactions among all known subatomic particles, from garden-variety quarks and electrons to their exotic cousins, seen only in carefully controlled laboratory settings. Cobbled together during the 1960s and 1970s, the Standard Model has been tested to high precision since the 1980s; no experiment, to date, has managed to poke a hole in it. Yet physicists agree that the Standard Model cannot be the final word. For one thing, it has a number of arbitrary, unexplained features. Why does one particle, the muon (otherwise so similar to the electron), happen to weigh 206.7683 times more than its lightweight sibling? Why do two particular interactions have strengths in the ratio 0.2312, rather than, say, 1, or 0.25, or 17? By sticking those parameters into their calculations, physicists can match experimental results with unbelievable accuracy. But accounting for why those values must be used, just so, remains an open question. For several decades, the name of the game for high-energy physicists has been to account for these parameters in a first-principles sort of way; to plug the Standard Model into some larger framework, in terms of which its arbitrary features might seem natural, even necessary.
In fact, most physicists consider the Standard Model glaringly incomplete. It incorporates three of the four basic forces of nature: the forces that cause electric charges to attract or repel; those that cause nuclear particles to clump densely into atomic nuclei; and those that cause some atomic nuclei to disintegrate via radioactivity. But the Standard Model has nothing at all to say about gravity, which, on a cosmic scale, is by far the most important force out there.
Most of the efforts to redress these shortcomings – to rectify the arbitrariness of the existing Standard Model, and to smuggle gravity into it – focus on symmetry. This is a major theme in Wilczek’s book. He is fascinated by the apparently alchemical conversion of thought-stuff, such as abstract mathematical symmetries, into testable, almost touchable features of our physical world.
Symmetry is that feature of a system whereby it remains unchanged as you shake it up or twist it around, or, as physicists say, undertake a transformation. Imagine playing a Bach fugue on the piano. Without your knowing, some gremlins have shifted your keyboard up by a major third: every time you play what looks like middle C, the piano sounds an E; you hit a D but the hammer sounds an F-sharp, and so on. If each note is affected in the same way, independently of its position on the keyboard, and if the rules don’t change over time, then the gremlins have performed a ‘global transformation’. The intervals between notes have remained intact, but the piece is not unchanged: someone with perfect pitch could detect the difference. However, the fugue would remain unchanged – symmetric under this global transformation – if, equally unknown to you, helpful elves living inside the piano hooked up an elaborate contraption of pulleys and gears so that when the hammers began to fall, the elves’ wheelworks redirected them to the originally intended strings. By adding in new types of force and interaction, the elves have compensated for the gremlins’ global transformation, leaving the composition completely unchanged overall.
More complicated transformations are possible, too. The gremlins, for example, might dream up a unique transposition for each note on the keyboard: middle C moves up to E, while D moves down to B-flat, and so on. Or, the gremlins might change their minds and make up different transpositions over time, so that later the middle C hammer strikes a G while the D sounds a D-sharp. Physicists call such manoeuvres ‘local transformations’. With the right combination of gears and pulleys, however, the elves could still render your fugue unchanged from the original, if they constantly adjusted the machinery to compensate for the gremlins’ place and time-specific transformations.
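For readers who like to see such things made concrete, the parable can be sketched in a few lines of code. This is a toy illustration only – the notes, the shifts and the function names are all mine, not physics – but it captures the distinction between a global transformation (one shift applied everywhere), a local transformation (a different shift at each position and moment) and the compensating machinery that restores the original:

```python
# Toy model of the gremlin/elf parable: notes as integers (semitones).
# A "global transformation" shifts every note by the same amount;
# a "local transformation" shifts each note by its own amount.
# The "elves" restore symmetry by applying the inverse shift at each position.

fugue = [60, 62, 64, 65, 67]  # MIDI-style note numbers: C, D, E, F, G

def gremlins_global(notes, shift):
    """Shift every note by the same interval (e.g. +4 semitones = a major third)."""
    return [n + shift for n in notes]

def gremlins_local(notes, shifts):
    """Shift each note by its own, position-dependent interval."""
    return [n + s for n, s in zip(notes, shifts)]

def elves_compensate(notes, shifts):
    """Undo a position-dependent shift, restoring the original piece."""
    return [n - s for n, s in zip(notes, shifts)]

# A global transposition is just a local one with a constant shift:
shifted = gremlins_global(fugue, 4)            # gremlins raise everything a major third
restored = elves_compensate(shifted, [4] * 5)  # elves redirect each hammer back
assert restored == fugue                       # the fugue is unchanged overall: a symmetry

# Local transformation: a different transposition for each note
local_shifts = [4, -4, 1, 0, 7]
garbled = gremlins_local(fugue, local_shifts)
assert elves_compensate(garbled, local_shifts) == fugue
```

The point of the analogy is that demanding symmetry under the harder, local transformations forces you to introduce the compensating machinery; in the Standard Model, that machinery is a force.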
The forces described by the Standard Model – forces that bind nuclei together or make them fall apart – remain symmetrical under local transformations. Decades ago, physicists postulated the existence of special particles associated with compensatory nuclear forces which, just like the elves’ machinery, guarantee overall symmetry. The properties of these particles, if they existed at all, would be determined by the types of transformation they needed to overcome. And when experimenters went looking for them in the early 1980s, there they were, pretty much exactly as the theorists had expected (and desperately hoped) they would be.
That is the kind of alchemy Wilczek delights in. Small hints gleaned from various experiments suggest that some underlying symmetry might govern a physical force. Mental gymnastics – often more elaborate, far-fetched and just plain bizarre than my gremlin-elf-piano parable – predict that some new, tiny thing might be out there scurrying around, shoring up that underlying symmetry. New experiments are designed in the hope of catching a fleeting glimpse of the elves at work, or at least to gather empirical data that might plausibly be attributed to the dreamed-up interactions. Wilczek calls this process a migration from ‘c-world to p-world’, from a realm of concepts to the physical world around us. The end products, he says, are ‘embodied ideas’.
The Standard Model was pieced together that way, a frenetic zigzag between theory and experiment. But as more pieces of the puzzle were put in place, filling the next gap often meant pushing to higher and higher energies – and that usually meant whipping particles around huge particle accelerators until they were travelling at very nearly the speed of light, and then smashing them together. The energies with which the particles collided would coagulate into new varieties of matter, as indicated by Einstein’s famous relation between energy and mass, E = mc². By the mid-1980s, the energies required to push the boundaries of the Standard Model had outstripped what existing accelerators could achieve. That was the reason for the SSC; and why so many now turn, hopes renewed, to the LHC.
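The scale of that energy-into-matter conversion is easy to work out. A back-of-the-envelope sketch, using round figures (a proton’s rest-mass energy is about 938 MeV; the LHC’s design energy is 7 TeV per beam):

```python
# Rough illustration of E = mc^2 at collider scales, in accelerator
# units (electronvolts), so c^2 drops out of the bookkeeping.
# Figures are approximate, for scale only.

PROTON_REST_ENERGY_MEV = 938.3   # a proton's mass, expressed as energy
LHC_DESIGN_BEAM_TEV = 7.0        # design energy per proton beam at the LHC

beam_energy_mev = LHC_DESIGN_BEAM_TEV * 1e6
ratio = beam_energy_mev / PROTON_REST_ENERGY_MEV

# Each proton carries roughly 7,500 times its own rest-mass energy,
# available on impact to 'coagulate' into new varieties of matter.
print(f"{ratio:,.0f} proton rest masses per beam proton")
```

In other words, each colliding proton arrives carrying thousands of times its own mass in pure kinetic energy – a budget large enough to conjure particles far heavier than anything in ordinary matter.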
At the top of the wish list for most high-energy physicists is finding some direct evidence for the Higgs particle. Named after the British theoretical physicist Peter Higgs (though, as Higgs has insisted all along, about half a dozen theorists deserve the credit too), the hypothetical Higgs particles are assumed to be everywhere, filling every nook and cranny of the universe. According to the Standard Model, nearly all other types of matter bump into Higgs particles all the time. That is what (theorists suppose) gives elementary particles their mass: particles that would otherwise weigh nothing at all remain trapped within the molasses-like Higgs soup; the constant jostling by Higgs particles interrupts other particles’ motion, slowing them down and making them appear to lumber around like big, heavy objects (heavy, that is, on the atomic scale). The Higgs particle is the last remaining member of the Standard Model to have eluded detection. For the Standard Model to hang together at all, the Higgs must reveal itself at the energies attainable by the LHC. If it turns out that there is no Higgs particle to be found, high-energy physicists will be forced to go back to the drawing board for the first time in decades.
The Higgs mechanism, if it occurs at all, gives mass to basic constituents such as quarks and electrons. But what about conglomerations of quarks, like protons and neutrons? Nearly all the matter we know – you, me, just about everything we can see here or in the heavens – consists of protons and neutrons. (There seems to be quite a lot of matter in outer space that we can’t see, known as ‘dark matter’, but one cosmic mystery is enough for now.) Yet only about 5 per cent of the mass of ordinary matter can be accounted for by the mass of their in-dwelling quarks. Ninety-five per cent of a proton’s mass – and, by extension, 95 per cent of the mass of you and me – comes from raw energy. Hence Wilczek’s title: The Lightness of Being. Mass does not arise from clumping lots of heavy items together; it comes very nearly from nothing, from the feverish quantum dance of massless particles. Wilczek spends much of his book explaining how that could be.
His main heroes are the gluons. Gluons are one species of the elves in my Bach parable: they skitter around enforcing a particular symmetry. The symmetry they regulate governs the strong nuclear force: that is, the force that binds quarks together into composites like protons and neutrons. As their name implies, they are nuclear glue. Gluons do not have any mass of their own – they avoid all those jostlings with the Higgs particles – but they interact with each other and with quarks all the time. Most important, they stay true to their elven ways. If you try to disturb the symmetry they guard – for example, by placing a lone quark in isolation – gluons leap into action, dredging up other quarks with compensating nuclear charges to cancel out the first quark’s charge and restore the overall symmetry.
The cancellation would be complete if the compensating quarks could be forced to sit directly on top of the original quark; no quark-charge would then spill out to threaten the nuclear-force symmetry. But complete cancellation is hindered by a competing factor. The Heisenberg uncertainty principle, the central pillar of quantum theory, stipulates a trade-off between how precisely a quantum object’s position and momentum may be specified. In other words, nothing – not even gluons – can force quarks to sit perfectly still in a fixed location. The more gluons act to keep the new quarks fixed squarely on top of the original one, the more energetically those quarks jump around. At the natural balancing point between those two tendencies – cancelling the original quark’s charge as much as possible while minimising the new quarks’ thrashing about – some residual energy remains. We see that energy in the form of a proton’s mass. Mass, you might say, is nothing but a cosmic accounting error.
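That balancing act can be caricatured in a deliberately crude toy model (my own illustration, not real QCD, which requires heavy-duty lattice simulation): squeezing a massless quark into a region of size x costs jitter energy that grows like A/x, by the uncertainty principle, while letting it roam costs confinement energy that grows like B·x. The leftover energy at the balance point plays the role of mass:

```python
# Toy model of the trade-off described above: jitter cost ~ A/x versus
# confinement cost ~ B*x, for a region of size x (arbitrary units).
# The residual energy at the minimum is nonzero: "mass" from massless parts.

def residual_energy(x, A=1.0, B=1.0):
    return A / x + B * x  # uncertainty-principle jitter + confinement cost

# Find the minimum numerically on a grid (analytically: x* = sqrt(A/B),
# with residual energy 2*sqrt(A*B)).
xs = [i / 1000 for i in range(1, 5000)]
x_star = min(xs, key=residual_energy)

print(f"balance point x* = {x_star:.3f}, residual energy = {residual_energy(x_star):.3f}")
# With A = B = 1 the minimum sits at x* = 1 and the leftover energy is 2 -
# nonzero even though nothing in the model has any mass of its own.
```

The numbers mean nothing; the shape of the argument is the point. Neither cost can be driven to zero without the other blowing up, so some residual energy always survives – and that residue, writ large, is a proton’s mass.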
Preposterous as this account of the origin of mass might sound, the latest computer simulations reveal a remarkable match with empirical measurements. Experiments at the LHC and elsewhere promise to throw more light on the quantum choreography of quarks and gluons.
Perhaps even more important, the LHC offers the first chance – a long shot at best – to find some empirical evidence of the merging of gravity with the other forces. During the long interlude between the death of the SSC and the early stirrings of the LHC, there was bitter in-fighting in the high-energy physics community over how such a merger might occur. The leading contender, string theory, rivals the story of gremlins and elves in its challenge to common sense. If you thought our universe had just three dimensions of space – height, width and depth – think again: string theorists’ models require at least six more. String theory is chock-full of insults to our intuition. On the other hand, it does seem to accommodate both the Standard Model and Einstein’s theory of gravity as low-energy approximations. Electromagnetism, nuclear forces and gravity might be shades of a single unified force after all.
But how to tell? Critics of string theory complain about the complete lack of experimental evidence in support of its outlandish propositions. Proponents shoot back that there is a new game in town: different criteria, such as mathematical consistency or elegance, should weigh more than close fits to (non-existent) data. Anticipation of the LHC has helped to heal some wounds. The machine could conceivably catch a glimpse of extra dimensions, if they are there to be glimpsed; or it might find the first evidence of ‘supersymmetry’, a hypothetical symmetry among entire classes of particles that string theory requires and many high-energy physicists hope to find. None of these would constitute direct tests of the central tenets of string theory. But the overheated rhetoric seems to have quietened down while all sides wait, anxiously, for news from Geneva.
Wilczek largely stayed out of the mudslinging, and his book expresses a quiet agnosticism on the matter. If nothing else, the book offers a good way to pass the time until the LHC reaches its optimum energy. Just nine days after its triumphant kick-off in September last year, faulty electrical connections deep inside the LHC tunnel caused several magnets to overheat. That, in turn, caused ruptures in tanks of liquid helium, the coolant that ordinarily keeps the magnets humming at temperatures colder than those of outer space. Repairs kept the LHC offline for more than a year. At the end of last month the massive machine spun back to life. Staff are gingerly coaxing the machine up to its design energy, at which point new physics might (just) appear amid the detritus of all those proton collisions.