Vol. 14 No. 16 · 20 August 1992

When big was beautiful

Nicholas Wade

Big Science: The Growth of Large-Scale Research 
edited by Peter Galison and Bruce Hevly.
Stanford, 392 pp., $45, April 1992, 0 8047 1879 2
The Code of Codes 
edited by Daniel Kevles and Leroy Hood.
Harvard, 397 pp., £23.95, June 1992, 0 674 13645 4

Under the Reagan Administration the United States embarked on a fistful of big science projects, from the space station to the superconducting supercollider and the human genome project. The usefulness of these ventures, by and large, lies in inverse proportion to their cost. The $30,000 million space station will serve little detectable purpose save making work for hungry defence contractors, whereas the $3000 million human genome project could one day allow the history of evolution to be read like a book. On the scale of moral worth, the $8000 million superconducting supercollider lies nearer to the human genome project, but that may not save it. The atom-smasher is designed to create energy conditions not seen in the universe since a trillionth of a second after the Big Bang, but before reaching 15,000 million years back in time, it must first survive until Congress retreats for the summer. In June, with tunnelling machines already boring into the chalk beneath the plains of Waxahachie, Texas, the House of Representatives abandoned its usual support for the project and voted to kill it.

Pundits at once interpreted the vote as the death knell for the pharaonic era of big science projects. In fact, the Representatives were moved not by any such lofty judgment but by one of the mundaner forces that usually animate their voting behaviour, in this case, revenge. A Texan Representative had nearly secured passage of a resolution calling for a balanced Federal budget which, had it passed, would have gutted many pork-barrel projects and ensured the defeat of many of his colleagues at the next election. Seeing the supercollider roll past, the greasiest pork-barrel project in all Texas, his loyal colleagues just couldn’t help themselves. If resurrected by a later vote in the Senate, the supercollider will have passed its year of maximum political danger.

In fact, Congress is generally supportive of big science projects, partly for pork-barrel reasons and partly from a hazy notion that science is good for the economy. The most serious opponents are usually other scientists in the same discipline, who fear that their small-science projects will be starved as funds are diverted. It is from these lively critics that big science has acquired a far more pejorative aura in academia than the can-do image it enjoys publicly would suggest.

One of the several disappointments of Big Science is that it attempts no definition of its subject. The authors traipse haphazardly from particle accelerators to military science and even to industrial research, and the editors have made no discernible attempt to impose discipline, develop an analytic framework, or tease out coherent themes. This is a pity, since big science projects, requiring money, organisation and specialised knowledge all in high degree, are an interesting species of human endeavour.

Big science means different things to different people because the core of the notion is anything that is not little science. Little science is research done the old-fashioned way, by a single scientist following his own curiosity and free of outside direction and interference. Big science is large-scale research undertaken by massive teams of scientists and engineers, and so expensive that the patron is always a government. Big science projects tend to be characterised by a hefty degree of bureaucratic controls and direction. In addition, many have a specific goal, like reaching the Moon or finding a cure for cancer.

The first question to ask about big science, one so obvious that it eludes the editors of this book, is whether it works. The broad answer is that the track record of big science projects is surprisingly good, especially in comparison with other things that governments spend money on. The Apollo project to land men on the Moon fully met its technical and political goals. Whether the goals were worthwhile can be debated; the point here is that it achieved 100 per cent success, a claim that few other government programmes can make.

The goal of a big science project must be chosen with great care, however. The War against Cancer, launched by President Nixon to find a cure within a decade, met its political goal of pre-empting a similar initiative pushed by Senator Edward Kennedy, but as a technical achievement left much to be desired. It was simply premature in the Seventies to hold any realistic hope of addressing the fundamental nature of cancers, since the necessary biological methods were only just then being developed. Nonetheless, the millions of dollars poured into the pipe-dream of finding the virus that causes human cancers were not wholly wasted. The expenditures laid the groundwork of virological expertise from which the concept of proto-oncogenes was to emerge a decade later.

There is no more poignant example of a big science project poised between fame and fiasco than the Hubble space telescope, launched in 1990 at a cost of $1500 million. Built by much the same people and methods that produce spy satellites, the Hubble’s main difference is that its eye peers upward to the heavens instead of peeking down into missile silos. The telescope sends back magnificent pictures, yet because of an appalling error that resulted in its mirror being ground to the wrong curvature, the pictures are fuzzier than they should be. Instead of seeing ten times more clearly than ground-based telescopes, the Hubble sees only slightly better. As recounted in this volume by Robert Smith, design of the space telescope was the subject of feverish politicking by rival groups of astronomers, since different light receptors best served the interests of those studying stars and those scanning the planets. The verdict on the design is not yet in, especially as it is hoped that astronauts will be able to visit and repair the Hubble’s tragically flawed vision.

A trenchant criticism of big science, highly prescient in regard to the Hubble, was made several years ago in an essay by the physicist Freeman Dyson. Big science projects take so long to build, he noted, and the advance of technology is so quick, that by the time the big instrument is finished there are often cheaper and better ways to the same end. The nemesis of the Hubble telescope is the Star Wars missile defence system, whose designers wanted to transmit ground-based laser beams through the atmosphere. To counteract the frequently shifting density of the atmosphere, which messes up the coherence of laser light, they developed an ingenious system known as adaptive optics, which measures the degree of turbulence in the atmosphere and continuously distorts the curvature of a transmitting mirror to compensate. Adaptive optics can be packaged in a fairly cheap kit for attachment to most telescopes. It will thus allow ground-based observers to eliminate almost completely the haze of the atmosphere – the very point of putting the Hubble telescope in space. Had anyone been able to predict this neat device, the Hubble telescope would surely never have been built.

Big science is generally considered to include the Manhattan project to develop nuclear weapons, since this endeavour required new scientific principles as well as substantially novel engineering. To some extent, the title could be extended to the many other military projects, from the development of generic technologies like computers, semiconductors and gyroscopes to specific advanced weapons systems, like the Trident II submarine-launched missile, which takes a mid-flight navigational star-sighting to improve its accuracy, or the B-2 and F-117A versions of radar-evading aircraft.

The US Defence Department’s strategic military projects seem to have been largely successful, although that verdict must remain a matter of conjecture, since, fortunately, they never faced a weapon’s only meaningful test, performance on the battlefield. The Air Force’s reluctance, however, to let the B-2 bomber participate in the Gulf War suggests strong doubts about the plane’s abilities. Among conventional arms, by contrast, there is intriguing evidence that some of the best weapons in the Pentagon’s arsenal were developed in defiance of the military research bureaucracies, not by them. The Sidewinder air-to-air missile, an infra-red guided weapon decisive in air combat from the Bekaa Valley to the Falklands, was developed over the sustained opposition of Navy officials who preferred radar-guided missiles, just as the splendid F-16 fighter and A-10 attack plane had to be imposed by Defence Secretary James Schlesinger on an unwilling Air Force. The Army’s research bureaucracy has been particularly accident-prone. Its attempt to build a successor tank to the undistinguished M-60 series went through two costly fiascos and almost two decades before it finally came up with the M-1. Its Aquila programme to build a military reconnaissance drone was a ludicrous exercise in adding so many fancy features that the unmanned aircraft had lost its principal virtues of being cheap and expendable before it even left the testing ground. The Defence Department, in short, though it is the major practitioner of big science, has no guaranteed recipe for success.

An interesting chapter by Lillian Hoddeson, focused on the plutonium bomb implosion programme at Los Alamos, suggests that the military methods successfully used by General Groves to prosecute the Manhattan project – a ‘combination of strict deadlines and lavish support’ – were of wide influence on post-war American research. On their return to academia, the Los Alamos physicists applied their experience of the military approach to many areas of large-scale research, so much so that it has become ‘part of the fabric of American big science’.

The human genome project, strangely enough, owes its origin to the nuclear weapons designers at Los Alamos, since it was started there in 1987 when they perceived the wisdom of diversifying out of the bomb business. The history and current preoccupations of the project are well described in The Code of Codes, although some of the more interesting chapters are biologists’ apologias addressed to other scientists rather than exegeses for the general reader.

The need for apologias arises from biologists’ traditional idea of the way to do science: by smart, clear-cut experiments designed to elicit a precise yes or no from nature. This is the antithesis of the data-gathering approach of the human genome project. In their view, determining the order of the three billion letters in the ‘code of codes’ would not be an experiment at all, just a mind-numbing exercise in applied chemistry.

Given this deep-rooted disdain for inductive science, leading biologists were quick to condemn the idea of sequencing the human genome when it was first proposed in 1985, by Robert Sinsheimer, a distinguished biologist and Chancellor of the University of California at Santa Cruz. The idea was taken up instead by Charles DeLisi, a physicist and senior official at the Department of Energy, the patron agency of Los Alamos and the other nuclear weapons design labs. Los Alamos justified its interest as being to study how the genome responds to radiation.

At that point, as Daniel Kevles relates, biologists realised they were making a mistake, since control of the project would pass out of their hands. In the course of lobbying for their own patron agency, the National Institutes of Health, to be cut in on the action, they began to persuade themselves that the project had scientific merit. The NIH’s money has been administered so as to avoid the disliked features of big science, with a minimum of central control and hierarchy.

Nature requires only five million pieces of information, the units being the base pairs strung along the chemical backbone of DNA, to specify a bacterium. She needs 100 million units for the blueprint of a simple roundworm, 180 million for that of a fruitfly, and a great leap to 3000 million units to make a mouse. Strangely enough, the genetic specifications for a human are no more complex: 3000 million base pairs suffice.

The formal goal of the project is to determine the order of the bases in the human genome, as well as those of other much-studied organisms for the sake of comparison, including the Escherichia coli bacterium, the Caenorhabditis elegans worm, the Drosophila fly and the mouse. Although several of the essayists annoyingly refer to the human genome as the Holy Grail, probably a reference to the terminus ad quem of the adventure film Raiders of the Lost Ark, it is very far from an end in itself. The sequence is a code, utterly meaningless without interpretation. It is the interpretation of the human genome’s sequence that will be biology’s greatest task.

The sequence of several viruses, from polio to Aids, has already been determined, but although the sequence fully determines the nature of the virus, biologists cannot yet translate the sequence into full knowledge of how the virus works. They can often detect where genes start and stop, and figure out the linear sequence of the protein which the gene specifies. But no one can yet predict how a linear protein chain will be folded up into its working structure, from which one might hope to guess its function.

In one of the most interesting essays in this volume the Harvard biologist Walter Gilbert predicts boldly that the protein-folding problem will be solved in five years, and the human genome will be fully sequenced in fifteen: from these two advances, he says, ‘a theoretical biology will emerge.’ Among other things, the theory will enable biologists to recognise the body’s mechanisms for having one set of genes expressed in heart cells and another in brain cells.

At some stage the new theoretical biology will start yielding significant medical dividends, though with accompanying ethical problems of varying degrees of novelty. Several essays here are devoted to such issues as health insurance, DNA fingerprinting and eugenics. But beyond the medical benefits, what really interests biologists is the comparison of the human genome with those of other forms of life. The sequences are the language of evolution, and once deciphered will make clear its central mysteries.

Perhaps the major omission of The Code of Codes is that it does not include a major critic of the human genome project. The critics contend that all the serious questions in the project’s purview would be answered anyway by individual investigators in the natural course of research. But the recent sequencing of a yeast chromosome, revealing a large number of unsuspected genes, suggests that the strategy of going for the complete picture may yield a rich harvest of surprises.

There is another big biology project which is less controversial though its annual budget is far larger – the Aids programme. Effective drugs and vaccines for Aids continue to elude researchers and occasional grumbles are heard from biologists that too much is being spent on Aids. But the programme could get a lot more wasteful before anyone denied society’s right to spend what it liked in combating the disease.

All big science projects are expressions of societal will or national purpose. In some, the Government sets the goal and directly administers the programme. In others, the Government provides money for a large instrument, like an accelerator, but lets scientists decide how they will use it. In the most hands-off form of direction, the Government simply makes available money for a specific goal, like Aids research. Scientists prefer the last because it gives them the greatest autonomy, but history rather suggests that directed ventures like the Manhattan and Apollo projects are the more successful. In other words, there could be a case for a directed programme in Aids, to develop a vaccine, say, when the underlying science becomes better understood.

Another important arena of big science is in the vexed field of industrial policy. Although American economists are brought up to believe that new high-technology industries are delivered by the invisible hand of free markets, the embarrassing truth is that in the United States most known specimens, from computers to civil aviation to semiconductors, have been silently fathered by the Pentagon. In Japan, the semiconductor industry was engendered, not by free market forces but as part of a sustained effort by the Ministry of International Trade and Industry, culminating in its spectacularly successful Very Large-Scale Integration project of the late Seventies. Some of Tokyo’s big science projects have of course failed miserably, like the Fifth Generation computer project to develop artificial intelligence. With one success on the scale of the VLSI project, however, an industrial policy planner can afford, and should expect, a handsome number of wash-outs.

To be fair to the economists, there are special reasons why deliberate industrial policy is more likely to thrive under Japanese capitalism than under the very different variety practised in the West. The nuclear breeder programme, or the laser isotope separation process for enriching uranium, are examples of technically brilliant projects that failed due to misjudgment of the market. Big science projects to help industry can work in the right conditions, and fail dismally otherwise.

Send Letters To:

The Editor
London Review of Books,
28 Little Russell Street
London, WC1A 2HN

Please include name, address, and a telephone number.


Vol. 14 No. 17 · 10 September 1992

Nicholas Wade’s review of The Code of Codes (LRB, 20 August) reveals a misconception unfortunately too common among those captivated by the power apparently offered by molecular biology to explain life. ‘Nature,’ he informs us, ‘requires only five million pieces of information, the units being the base pairs strung along the chemical backbone of DNA, to specify a bacterium … 3000 million for a mouse … the genetic specifications for a human are no more complex: 3000 million base pairs suffice.’ The mistake lies in assuming that the information content of an organism – or even its specification – can be reduced to the number of DNA base pairs in its genome. To start with, this ignores the fact that a good portion of this DNA – more than 90 per cent in humans – is so-called ‘selfish’ DNA with no known informational role at all. The remainder constitutes the 100,000-odd genes the human genome program plans to map and sequence. Lest this statement should seem to reduce the relevant ‘information’ content for humans to an even trivially lower figure, it should be emphasised that the combinatorial powers of these 100,000-odd genes, their capacity to code for and differentially express multiple proteins, vastly increases the base figure for such a ‘specification’.

Even this, however, masks the main issue, which is in what sense genes can be said to ‘specify’ an organism. This modern version of preformationism fails to take into account that an organism is not the sum of its genes, but the expression of a developmental process which results, in the human, in some thousand billion cells organised into tissues and organs in interaction with each other and their environment. The human brain alone contains more than ten billion nerve cells, each making up to 100,000 connections with its neighbours, with an ‘information content’ that currently beggars calculability. This is one of the reasons why the informational metaphor is so misleading when applied to living organisms. Anyone interested in seeing this argument developed in more detail should turn to Susan Oyama’s excellent book The Ontogeny of Information (Cambridge, 1985) or, if I may hold a small flag aloft, my own The Making of Memory, published later this year by Bantam.

Steven Rose
Open University, Milton Keynes

