As I write, the temperature in New York City is 86° F. The relative humidity is 56 per cent, the winds are south-westerly at seven mph, and visibility stands at six miles. What do those numbers really signify? The temperature doesn’t sound extreme, yet when I leave my air-conditioned house I don’t feel that I’m stepping outside so much as entering another atmosphere. My spirits sag, my lips soon taste of sweat. Is this 86°? Well, yes, because that’s what the New York Times says it is. On the other hand, as any schoolchild knows, what really matters is the humidity, and the increasing capacity of air to retain moisture as its temperature rises.
On another index, the trademarked ‘RealFeel’ reading provided by AccuWeather, the current temperature is actually a sweltering 96°. According to AccuWeather, RealFeel is ‘the first ever temperature index that takes into account multiple factors in determining what the temperature outside feels like’. Here empiricism begins to fray at the edges. If the temperature feels like 96°, then why is it 86°? What will it ‘feel like’ when it really does climb to 96°? To complicate matters, the RealFeel temperature can be the same as, or even lower than, the ‘actual’ one. We are accustomed to the relativity of numbers – the dollar is a fixed standard, yet according to the economic climate it will buy more or less – but this variation in a seemingly simple, everyday measure speaks to the problems of quantifying abstract experience.
The 20th century was the century of standardisation. There were initially any number of scales for measuring weather – Gustave Eiffel, for example, used his tower as the test site for an array of barometric and other climatic indices – but just as time was eventually standardised to meet the needs of the railroads and the telegraph, so the measurement of weather was gradually reduced to a few benchmark indices. One of the first significant refinements made to the work of Messrs Fahrenheit and Celsius was the invention of the wind chill index in 1945 by two geographers working in Antarctica, who measured how long it took for the water in a series of plastic cylinders to freeze. The original wind chill index was measured in units that corresponded to the rate of heat loss; one of the geographers, Paul Siple, later regretted this, as the use of units was often taken to imply that the index was an alternative temperature scale. This past winter a US-Canadian team unveiled a new wind chill index without the Celsius figure attached. In time, those numbers will assume a meaning of their own, but for the moment they are raw, as meaningless as g-force measurements to the average non-astronaut.
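In practice the index the US and Canadian services adopted reduces to a single regression in air temperature and wind speed. A minimal sketch, on the assumption that this is the formula the new index rests on (temperature in degrees Fahrenheit, wind in mph, meaningful only at or below 50° and for winds above 3 mph; the function name is illustrative):

```python
def wind_chill(temp_f, wind_mph):
    """Wind chill regression adopted by the US and Canadian
    weather services in 2001.

    temp_f: air temperature in degrees Fahrenheit (valid at or below 50)
    wind_mph: wind speed in miles per hour (valid above 3)
    """
    v16 = wind_mph ** 0.16  # wind speed enters through a fractional power
    return 35.74 + 0.6215 * temp_f - 35.75 * v16 + 0.4275 * temp_f * v16

# A 5 degree day with a 20 mph wind:
print(round(wind_chill(5, 20)))  # prints -15
```

The fractional exponent is the fitted stand-in for the old heat-loss physics: doubling the wind does far less than doubling the chill.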
Now, for the summer, we have the heat index, first proposed in 1979 by the Australian environmental scientist Robert Steadman in an article in the Journal of Applied Meteorology entitled ‘The Assessment of Sultriness Part I: A Temperature-Humidity Index Based on Human Physiology and Clothing Science’. Steadman’s index depends on two variables – temperature and humidity – and a number of fixed hypothetical assumptions concerning clothing, physical activity, time of day and heat loss due to radiant heat transfer and evaporation. It was never the product of a single true formula, however, but of a collection of formulae. What we now call the ‘heat index’ is the US National Weather Service’s best approximation of what those formulae actually mean – the ‘apparent temperature’. There are various other indices: the ‘summer simmer index’, proposed in 1987, produces a temperature equivalent that indicates how hot a particular condition would feel in a desert setting with relative humidity of 10 per cent. Using that index, another meteorologist produced a list of the ten American cities with the worst summer weather conditions (happily presupposing what ‘worst’ means). Most are in Texas, and the absolute worst is Corpus Christi (a place once risibly proposed to tourists as ‘the Italy of America’), which from May to September has an average summer simmer index temperature of 100.1° F.
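The Weather Service’s ‘apparent temperature’ is itself usually computed not from Steadman’s original formulae but from a polynomial fitted to his tables (the so-called Rothfusz regression, reliable only at roughly 80° F and above). A sketch using the published coefficients; the function name is illustrative:

```python
def heat_index(temp_f, rel_humidity):
    """Approximate the US National Weather Service heat index
    ('apparent temperature') via the Rothfusz regression, a
    polynomial fit to Steadman's tables.

    temp_f: air temperature in degrees Fahrenheit (roughly 80 and above)
    rel_humidity: relative humidity in per cent (0-100)
    """
    t, rh = temp_f, rel_humidity
    return (-42.379
            + 2.04901523 * t
            + 10.14333127 * rh
            - 0.22475541 * t * rh
            - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh
            + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh
            - 1.99e-6 * t * t * rh * rh)

# The conditions in the opening paragraph: 86 degrees, 56% humidity
print(round(heat_index(86, 56), 1))  # prints 89.7
```

For the conditions in the opening paragraph this yields an apparent temperature of just under 90°: a reminder that the official heat index and AccuWeather’s proprietary RealFeel (96° in that example) are different approximations of the same elusive quantity.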
Measured in terms of deaths and physical damage, there has been no more powerful natural force over the last few decades in the US than the heatwave. The National Climatic Data Center list of billion-dollar disasters is thick with heat catastrophes, from the 1988 drought in the Central and Eastern US, which resulted in damages worth $40 billion and was linked to somewhere between five and ten thousand deaths, to a similar episode in the summer of 1980 that caused damage worth $20 billion and an estimated ten thousand heat-related deaths. There are several reasons why, despite these massive economic and human consequences, heat is under-represented in the catalogue of natural disasters. First, its victims are usually very old, or in bad health, or poor and living in houses without air-conditioning. They are people without voice who are probably quite close to dying in any case. How much longer would that 88-year-old man with a heart condition have lived, after all? Second, heat is experienced by different people in different ways – the Miami resident is acclimatised to a degree the Chicago resident may not be – and one can hardly say that about hurricanes or earthquakes. Lastly, heat does not make good television. There’s no opportunity for aerial sweeps of forest fires encroaching on suburban housing, or pictures of flattened mobile homes, or men wearing windbreakers with ‘FEMA’ (Federal Emergency Management Agency) written on the back.
It’s clear that there’s more at stake than mere scientific refinement in the quest to measure the weather accurately. We already know that average global temperatures have risen one degree over the last century, and that they will rise somewhere between three and eight degrees as carbon dioxide levels double, but a more noteworthy increase may not be in temperature so much as in heat. Global warming means global moistening. Using computer models, scientists at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory have predicted that the average summer heat index for the entire US will approach 100° F over the next fifty years. We might as well all be living in Corpus Christi then.
The implications of these changes are elucidated in Eric Klinenberg’s revealing and provocative ‘social autopsy’ of the heatwave that struck Chicago in the summer of 1995, resulting in more than seven hundred deaths – the worst heat disaster in Illinois history. Chicago, exposed on the plains and susceptible both to Canadian chill and Caribbean heat, suffers from gruelling winters (I left the city of my birth shortly after the famous blizzard of 1979) and hot, humid summers. There was no summer worse than that of 1995, when the heatwave descended on the city on 12 July and lasted for a week, during which temperatures exceeded 100° and, more importantly, heat index temperatures soared into the 120s.
According to the NOAA, the ‘principal cause of the July 1995 heatwave was a slow-moving, hot and humid air mass produced by the chance occurrence at the same time of an unusually strong upper-level ridge of high pressure and unusually moist ground conditions’. The human consequence was that between 14 and 20 July, according to Klinenberg, ‘739 more Chicago residents died than in a typical week for that month.’ The city was wholly unprepared for the disaster: Mayor Richard Daley was out of town, and returned to a fully fledged health emergency; the generators of the electricity conglomerate Commonwealth Edison malfunctioned owing to the extra demand; city hospitals and morgues were overwhelmed by the crush of heat-stress victims and fatalities. Police fanned out through the city responding to the calls of concerned neighbours, and found ‘apartments with the windows closed and doors chained, no air-conditioning, no fans. The smell was foul, just death.’
Caught in the grip of the heat, the city tried to make sense of what was happening. ‘Killer Heat Wave or Media Event?’ the columnist Mike Royko wondered. The Daley Administration at first denied the death toll cited by its own Chief Medical Examiner, then quietly recanted. ‘We’re not going to talk numbers,’ one official said. Newspaper reporters on weekend shifts sat sifting through death reports, unsure of how to depict the crisis.
When it was all over, Daley released a ‘final report’ entitled ‘Mayor’s Commission on Extreme Weather Conditions’, the cover of which depicted an icon of a snowflake and a burning sun. In the report, the Commission noted the difficulties of helping victims – the often poor and elderly shut-ins – ‘who died because they neglected to take care of themselves’, and strove to present the situation as a ‘unique meteorological event’: ‘we are used to warm summers, but we are not used to what was truly unique.’ That redundant ‘truly’ reveals more than a bit of desperation on the city’s part: these were ‘heat-related’ deaths, ‘acts of God’. Klinenberg, however, wants to get away from the meteorological aberrations to investigate what have been called the ‘biological reflections of social fault lines’. Why did more men die than women? Why were most victims over 65? Why did blacks suffer the highest proportional death-rate?
That most victims died alone is a simple demographic fact, and lends itself to more demography: from 1970 to 1996, the number of people living alone in the US went from 10.9 million to 24.9 million, and the numbers are expected to rise as the elderly account for an ever more significant percentage of the population. Much sociological thought has been devoted to what Klinenberg calls the ‘social production of isolation’. In one footnote he quotes Herbert Gans saying that ‘the six bestselling sociology books in the United States are, in order, The Lonely Crowd, Tally’s Corner, Pursuit of Loneliness, Fall of Public Man, Blaming the Victim and Habits of the Heart. Of these, only Blaming the Victim does not explicitly address the issues of being alone, loneliness or the collapse of community.’ Add to the list Robert Putnam’s Bowling Alone and Alan Ehrenhalt’s The Lost City – which examines the decline of community as it once existed in 1950s Chicago – and it’s clear at least that those thinking about loneliness are not alone. Any number of reasons for social isolation have been put forward, including rising life expectancy (people outlive their social networks), social mobility, the fragmenting of families and communities, and the traditional American spirit of frontier self-sufficiency.
But why did so many people die during those days in Chicago? For one, there are many elderly people in the city who are afraid to answer the door, let alone seek outside help. There are neighbourhoods where social service case-workers try to schedule visits early in the morning. ‘I figure I can beat the gang-bangers and troublemakers if I get out early,’ one tells Klinenberg: ‘They’re still trying to sleep off whatever they did the night before.’ In the four years prior to the heatwave, crime rates had doubled in Chicago’s public housing, partly, Klinenberg suspects, because the city had ordered that people with substance abuse problems should be included in its housing programme.
On the other hand, race and economic standing were not infallible guides to heatwave mortality. In the book’s most interesting chapter, Klinenberg compares the adjoining neighbourhoods of North Lawndale and South Lawndale (known as Little Village), in which a similar number of older people lived alone and in poverty in 1995. In both areas, the proportion of families living at or below the poverty line was higher than the Chicago average. Yet during the heatwave North Lawndale had a mortality rate ten times higher than South Lawndale’s. North Lawndale is more than 96 per cent black, South Lawndale more than 85 per cent Latino. In the Latino section, ‘the empty lots and abandoned buildings so prevalent in the African-American area give way to dense concentrations of busy sidewalks, active commerce, and residential buildings packed with more inhabitants than they can hold.’ While North Lawndale lost much of its population from 1970 to 1990, the population of Little Village grew by 30 per cent. Because there was a neighbourhood in place, older residents – not all of them Latinos – were able to find refuge in storefronts and other public spaces, while North Lawndale residents lived in fear of leaving the house. It reinforces a point Jane Jacobs once made: streets matter.
In the months following the heatwave, the story – deemed a ‘summer story’ – gradually faded from the news. People began to lose track of how many people had died, and in their minds the death toll fell. In the early stages of the crisis Daley had said: ‘It’s very hot, but let’s not blow it out of proportion . . . Yes, we go to extremes in Chicago. And that’s why people like Chicago. We go to extremes.’ An offhand remark, perhaps regretted later. Yet in it one hears a small echo of the way of thinking that made Chicago what it was: the somewhat masochistic booster spirit that heaved the city out of the prairie grass. It was ‘all magnificent and wild’, the skyscraper pioneer Louis Sullivan wrote in the 1870s of this former frontier fort, a ‘crude extravaganza’. The city was wresting itself from its surroundings, ‘the primal power assuming self-expression amid nature’s impelling urge’. Others were less sanguine about this conquest of nature. The novelist Hamlin Garland, visiting the city a decade after Sullivan had been there, wrote: ‘I perceived from the car window a huge smoke-cloud which embraced the whole eastern horizon, for this, I was told, was the soaring banner of the great and gloomy inland metropolis.’ With no real geographical advantage, no natural fertility, the city was making itself out of the imagination of its citizens – it was, as one writer noted, ‘pregnant with certainty’.
Some even saw in its climate of extremes the foundations of a higher civilisation. William Cronon, in his magisterial book Nature’s Metropolis: Chicago and the Great West, refers to Alexander von Humboldt’s theory that great cities were possible only in regions with an average annual isotherm of about 50° F, and explains that ‘the white races who would build such civilisations retained their civilised superiority only in a temperate climate that challenged them with extremes of hot and cold.’ It was the advent of summer-packed pork – made possible first by the use of ice, taken from the Great Lakes and stored in giant warehouses; and then by the development of refrigerated railway cars – that allowed the city to become the country’s pre-eminent meat packer, the ‘hog butcher to the world’.
It was after a protracted spell of heat and drought that a fire broke out one night in 1871 and quickly spread through the city’s balloon-framed wooden houses. Three hundred people died, but only a decade later Harper’s was unable to ‘learn of a single life having been lost in the flames’. A guidebook stated flatly that ‘the great fire modernised the city,’ and by the turn of the century the population had exploded from 300,000 to 1.7 million. From the ruins emerged a newer and even grander city, ‘which had no traditions but was making them’. As a character in Henry Blake Fuller’s novel The Cliff Dwellers admonishes, ‘Chicago is Chicago. It is the belief of us all. It is inevitable; nothing can stop us now.’
If the death and destruction wrought by the fire could be harnessed for a founding myth, the heatwave of 1995, which resulted in more than twice as many deaths, seemed to offer no such promise. Klinenberg wants his book to serve as a ‘record of the catastrophe and a call to reconsider its meaning’; already, he writes, ‘the trauma stands as a non-event in the grand narrative of affluence and prosperity that dominates accounts of US cities in the 1990s.’ Chicago has always had a capacity for building towering skyscrapers whose glory overshadowed the less glittering reality elsewhere in the city. The shining White City of the 1893 Columbian Exposition commanded the imagination of the world, even as Jane Addams’s muck-raking Hull-House Maps and Papers drew attention to the ‘black city’ beyond the fairgrounds. H.G. Wells, too, described its ‘gauntly ugly and filthy factory buildings, monstrous mounds of refuse, desolate empty lots littered with rusty cans, old iron and indescribable rubbish’. In 1894 another hot summer produced yet another cataclysmic fire, this one destroying the great White City, and on the heels of its destruction came not boom-times and renaissance but bitter labour battles and social unrest. As Ross Miller writes in American Apocalypse, ‘it is the city’s eternal return to something primitive, its constant doubling back on itself, that finally defines Chicago, the American city that most expressively embodies the conflicting representations of modern life.’ It is, perhaps, possible to imagine that in the heatwave of 1995, the first great city of the 20th century suffered a premonition of the disasters awaiting the great cities of the 21st.