How worried should we be?

Steven Shapin

  • Command and Control by Eric Schlosser
    Penguin, 632 pp, £25.00, September 2013, ISBN 978 1 84614 148 5

‘Anything that can go wrong, will go wrong.’ That’s known as Murphy’s Law. It’s invoked in all sorts of settings, but its natural modern home is in engineering, where it is generally attributed to a remark made around 1950 by an aeronautical engineer called Ed Murphy, who was working on the design of rocket sleds at Edwards Air Force Base in California. In the mid-1950s, when Murphy’s Law wasn’t yet widely known under that name, Admiral Lewis L. Strauss, reflecting on the political and administrative troubles afflicting him, suggested that ‘a new law of knowledge’ be recognised and called Strauss’s Law after him: ‘If anything bad can happen, it probably will.’ At the time, Strauss was chairman of the Atomic Energy Commission, which had the responsibility for producing and maintaining America’s nuclear weapons, and the things that can go wrong with the control of such weapons are as bad as it gets.

Nuclear weapons are designed to detonate as the result of specific types of human intention. Explosions are the sharp end of elaborate, and constantly evolving, ‘command and control’ systems put in place to ensure that these weapons are used only as and when intended by legitimate political authority. Although there were concerns at one time that a high proportion of nuclear weapons would turn out to be damp squibs, or miss their targets by a wide margin, their designers, at least in the original nuclear states, are now confident that they will for the most part work as they are meant to.

But nuclear weapons can, in theory, go off accidentally. There have long been arguments about the chances of accidental explosions – failures of command and control in which weapons are detonated when no one intends they should be or when control is seized by an illegitimate party. Some people believe that the risk of accidental detonation has always been oversold. First, the novels Red Alert (1958) and Fail-Safe (1962), and then, based on Red Alert, the 1964 film Dr Strangelove, put in play the idea that all-out nuclear war could happen as a result of technical flaws or through the actions of one or a few madmen, but, despite all the Cold War cold sweats, we’re still here. Since 9 August 1945 there has been neither an accidental nuclear detonation nor, several thousand tests excepted, an intentional explosion of any sort. The US has built some 70,000 nuclear weapons since the end of the Second World War, and currently possesses about 4650, none of which has yet detonated accidentally or without authorisation. So how worried should we have been about nuclear explosions, intentional or accidental? How worried should we be now? What has been the relationship between the possibility of accidents and the command and control systems meant to prevent them and to guarantee intentional outcomes, or between nuclear risk and the political structures in which nuclear weapons are embedded?

Since 1947, the Bulletin of the Atomic Scientists has had a Doomsday Clock icon on its cover, set to indicate how close to Armageddon we’re reckoned to be. In the beginning, when the US was the world’s only nuclear power, holding only a few atomic bombs, the clock was set at seven minutes to midnight. In 1953, with both the Soviet Union and Britain joining America as nuclear states, and with the introduction of thermonuclear weapons, it was advanced several minutes. Since then, the time on the clock has varied between 11.58 and 11.43, reflecting test-ban treaties, arms races and nuclear proliferation – but, again, we’re still here. Immense stockpiles of weapons vastly more devastating than the Hiroshima and Nagasaki bombs have been accumulating for almost seventy years, guarded by fallible human beings, loaded on ready-to-go bombers and mounted on missiles primed to fly at a moment’s notice, but the world hasn’t ended, and over time it’s become more difficult to work up collective hysteria, or even serious concern, about the possibility of nuclear annihilation, intended or accidental. It’s a state of affairs sometimes offered as solid proof that the use of nuclear weapons solely as a deterrent is highly effective and that the systems for keeping them safe against accident or theft work flawlessly. If you really need a bout of apocalyptic anxiety, then worry about climate change, or pandemic influenza, or drug-resistant bacteria, or meteorites doing to us what they did long ago to the dinosaurs.

That’s one way to steady nuclear nerves: learn to stop worrying and accept the bomb, even if you can’t bring yourself to love it. It’s a position that has its advocates. A few years ago, John Mueller’s Atomic Obsession: Nuclear Alarmism from Hiroshima to al-Qaida urged a relaxed attitude: far more has been spent on nuclear weapons than can be justified by any sensible political strategy; they aren’t of much military use; their proliferation presents little danger; fears of nuclear accidents aren’t justified. Historically, wars just don’t begin by accident and there’s no reason to think that nuclear war would be an exception. A sober effort to reduce the chances of an accidental nuclear explosion might be worthwhile, but ‘the myopic hype and hysteria that have so routinely accompanied such efforts are not.’ Writing in the Daily Telegraph recently, Gerard DeGroot declared that the system for preventing nuclear accidents ‘works’, and that, even if an accidental explosion did occur, it would be unlikely to mean the end of the world. There is a market, he said, for books that frighten us and little or none for those that reassure: ‘Apparently we prefer hysteria to soothing logic in matters atomic.’
