Proxies: The Cultural Work of Standing In 
by Dylan Mulvin.
MIT, 274 pp., £40, August 2021, 978 0 262 04514 8

The flood of images is inescapable. Pick up your phone or surf the web and you will be immersed in pictures of war-torn cities and grinning politicians; friends’ holidays and strangers’ parties; heavenly salads and impish kittens. Yet despite their numbing, infinite variety, these images all owe their sharpness and vibrancy to the legacy of a single photograph: a picture of a naked woman in a cluttered attic, a straw hat with feather trim worn at a rakish angle, face turned back over her shoulder to fix the camera with a bright, enigmatic eye.

This is the Lena image (sometimes ‘Lenna’), which for nearly fifty years has been the default test subject for image compression algorithms, including the now ubiquitous JPEG format. Its use for this purpose began in the summer of 1973, when a group of researchers at the University of Southern California’s Signal and Image Processing Institute were casting around for a new picture to practise their work on. Tired of the usual stock photos (pictures of glossy bell peppers and colourfully masked mandrills), they turned to a familiar source of enticing imagery: Playboy. Reports diverge as to how exactly the magazine found its way into the lab; some say the men sent out for a copy, others that a passing engineer happened to wander in while flicking through the pages of the November 1972 issue (coincidentally, the best-selling Playboy of all time). Either way, the group immediately lit on the centrefold: a picture of the Swedish-born model Lena Forsén, hat askew, clothes asunder. The copy identified her as Lenna Sjööblom (‘TURN-ONS: My husband. TURN-OFFS: Men who wear shorts with white socks and black shoes’), but to the men of SIPI she was merely a perfect test subject. They tore off the top third of the page to fit her into their scanner, cropping the shot below the shoulder, removing the nudity, and rendering her as a digital image 512 x 512 pixels in size. And so Lena was enshrined for ever as the ‘patron saint of JPEGs’, used in countless research papers and academic journals to demonstrate the fidelity of algorithms. As the president of the Society for Imaging Science and Technology later put it, ‘the use of her photo is clearly one of the most important events in the history of electronic imaging.’

Engineers have argued that the Lena image was taken up purely for its formal attributes. First, there was the paper stock itself: glossy and high-quality, as Hugh Hefner demanded, it was ideal for scanning. Then there was the picture’s composition, which provides a number of visual challenges to test the mettle of any algorithm: different textures (flesh, hair, glass, wood, fabric); extremes of light and shadow; varying depths of focus; regions of flat colour (Lena’s bare shoulder) and fussy detail (the feather on the hat). Most important, though, it centres on a human face: since we know what a face should look like, any shortcomings in digital reproduction are quickly exposed. When pushed, researchers will acknowledge that the fact the image is of a naked woman presumably played some part in its popularity. But for decades this was considered incidental, or no more than a happy accident.
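
The ‘fidelity’ such papers demonstrate is usually quantified by comparing an algorithm’s output, pixel by pixel, with the reference picture. As a rough illustration only, here is a minimal sketch (in Python, using the Pillow and NumPy libraries; ‘lena.png’ is a stand-in filename, not a file supplied by any of the sources discussed here) of how a researcher might score a JPEG encoder against a reference image using the peak signal-to-noise ratio, one common measure of how far a compressed picture has drifted from its original.

```python
# Illustrative sketch: score JPEG compression against a reference image.
# Assumes Pillow and NumPy are installed; 'lena.png' is a hypothetical
# stand-in for any 512 x 512 reference picture.
import io

import numpy as np
from PIL import Image


def psnr(reference: np.ndarray, reconstructed: np.ndarray) -> float:
    """Peak signal-to-noise ratio in decibels: higher means closer to the original."""
    mse = np.mean((reference.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # the two images are identical
    return 10 * np.log10((255.0 ** 2) / mse)


# Load the reference picture and re-encode it in memory as a JPEG.
original = Image.open("lena.png").convert("RGB")
buffer = io.BytesIO()
original.save(buffer, format="JPEG", quality=50)  # lower quality = heavier compression
buffer.seek(0)
compressed = Image.open(buffer).convert("RGB")

score = psnr(np.array(original), np.array(compressed))
print(f"PSNR after JPEG compression at quality 50: {score:.2f} dB")
```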

The ascendancy of Lena in image processing is, then, the result of the usual mix of impromptu decision-making and post-hoc justification that underlies a surprising amount of scientific work. But it is also an example of the way that specific entities used to simplify the production of technical knowledge are revealing of the world that has selected them. The use of the Lena image over decades has contributed to the casual misogyny of the male-dominated field of computer science, a world in which the surveillance and appraisal of women is so ordinary as to make erotic images interchangeable with wildlife photography. Objects such as these are what Dylan Mulvin, a historian of media and technology, calls ‘proxies’. The influence of such proxies, he shows, is sometimes subtle, sometimes obvious, but invariably underexamined. There are economic proxies like the 730 items that make up the consumer price index’s ‘basket of goods’ – a shopping list meant to represent the nation’s spending habits, which has a huge influence on government economic policy. There are legal proxies, like the fiction of the ‘reasonable person’ referred to in discussions of law and contract: the average individual who would behave in a predictable, commonsensical way. There are medical proxies, like the actors who play patients to help train doctors, and in so doing enshrine certain gestures and expressions as authentic markers of illness and pain.

And there are technological proxies in abundance, from the Lena image to measurement standards to the crash test dummies used to assess vehicle safety. In this last example, the work of the proxy is eerily literal: the dummy is a sit-in, rather than a stand-in, for the frail human body crumpled and twisted in cars and trucks. Its influence on the world has been similarly straightforward. Consider that, for many decades, crash test dummies were modelled on the dimensions of the ‘average man’. This meant that safety devices such as the three-point seat belt were optimised for male bodies. A study in 2011 found that, as a result, injury and fatality rates for women drivers were 47 per cent higher than for men in comparable crashes.

Mulvin’s concept of the proxy isn’t wholly new. Stand-ins have long been an obsession in science and technology studies, or STS: an interdisciplinary domain that began to emerge in the 1960s, applying sociological and anthropological analysis to scientific and technological developments. Some practitioners pick apart the cultural forces that shape stand-ins, revealing the assumptions and prejudices they have helped to embed. Texts in this genre include Langdon Winner’s essay ‘Do Artefacts Have Politics?’ from 1980; Lorraine Daston and Peter Galison’s book Objectivity (2007); and the essay collection Standards and Their Stories (2009), edited by Martha Lampland and Susan Leigh Star. However you try to define the proxy, though, you’re soon faced with the overwhelming plurality of the entities themselves. Mulvin tries to pin down his flexible construct by piling on adjective after adjective. Proxies, he writes, are porous (‘they absorb their surroundings’), sticky (‘they pick up pieces and leave traces of wherever they travel’), bodily (they rely on ‘finely tuned embodied and relational practices’) and amnesiac (their work is ‘invisible and easily forgotten’). He concedes that he is unable to provide a ‘global theory of proxies’, but seeks to distinguish his own conception of the proxy by focusing on ‘embodied labour and performance’.

One of his illustrations is the International Prototype Kilogram, used between 1889 and 2019 to define the base unit of mass. The actual IPK is a palm-sized cylinder of platinum-iridium alloy stored in triple-layer bell jars in an underground vault outside Paris. In order to verify this object’s mass and create copies to be used as reference measures around the world, the IPK had to be washed, cleaned and weighed, before being compared to six sibling kilograms. The process, as Mulvin recounts it, is beautifully ritualistic. The cleaner must soak a piece of chamois leather in a mixture of ether and ethanol for 48 hours to ensure full absorption, then rub down the IPK before washing it with steam and removing excess water with filter paper. This entire process is repeated several times. To ensure that no impurities are left on the kilogram’s body (or at least that the same impurities are left each time), every aspect of this sacrament is minutely quantified, from the amount of pressure to be applied with the chamois (10 kPa) to the wattage of the equipment used to generate the steam (350 W) and the distance of the steam nozzle from the kilogram’s body (5 mm).

As Mulvin notes, the ‘conspicuous performance’ of this ritual is a reaction to the sticky and porous nature of proxies. The IPK was intended to be a static data point: a way to maintain a single mass value for all time. But, as a physical artefact, it was unavoidably fallible. All matter must leak and decay, absorb and emit, and the IPK proved no different. The instructions for its care are ‘not supplemental to the meaning of mass within the metric system’, Mulvin writes, but ‘fundamental to what made it a viable standard’. In time, however, the very process of caring for the standard uncovered its weaknesses. Towards the end of the 20th century, a series of weigh-ins revealed that the IPK was losing mass compared to its six siblings. Not by much – just fifty micrograms, or fifty millionths of a gram – but enough to raise questions about its metrological sovereignty. So in 2019 it was officially replaced, like the other units of the metric system before it, with a definition derived from fundamental physical constants, sacred numbers of reality such as the speed of light.

Mulvin – always ready with the right metaphor – describes proxies as ‘canned decisions’, explaining that their history is full of ‘moments of incarnation’, where their usefulness for ‘conducting some limited, domain-specific task … leaks out into the wild’. In his introduction, he gives the example of Yodaville, an artificial city constructed by the US Marine Corps out of shipping containers, used to practise aerial bombing and urban warfare in the Arizona desert. Yodaville was built after the embarrassment of the battle of Mogadishu in 1993 – a failed attempt by US special forces to capture Somali rebels which resulted in dead American soldiers being dragged through the city’s streets – and was supposed to be an improved stand-in for future US engagements. The idea was to shift institutional thinking away from memories of the European towns and villages enshrined in existing training protocols and towards air attacks over new urban environments. As one journalist describes Yodaville, ‘just name a city in one of the world’s trouble spots, and Yodaville can be it’ – a sentence which, as Mulvin notes, invites us to ‘imagine a limitless horizon for the pursuits of the American military’.

There is a reason Mulvin finds it difficult to come up with a totalising theory of his subject: each proxy he examines is a world in itself. Towards the end, he returns to the Lena image, situating it in a history of such pictures, from the ‘China Girls’ whose faces were inserted like subliminal messages at the start of film reels to calibrate colours from the projection booth, to the ‘Shirley Cards’ distributed by Kodak to fulfil a similar role in commercial photography, and the first ever Photoshopped image, titled ‘Jennifer in Paradise’, which shows the then girlfriend of one of Photoshop’s creators topless on a beach in Bora Bora, her back to the camera. Each time, Mulvin shows the way these standards unwittingly encoded the culture of their time. In the case of the Shirley Cards, the exclusive use of white models in early calibration tests meant that the world’s most popular film stock failed to capture the detail of other skin colours. ‘Whiteness moves through these standards with ease, whereas darker skin creates friction,’ Mulvin writes. From the 1960s onwards, Kodak slowly corrected this bias, but not out of any sense of racial injustice: it was a response to complaints from furniture makers and chocolate sellers that its film couldn’t properly capture their products’ hues.

Today we can see similar patterns at work in the field of artificial intelligence, where algorithms are trained on data sets that are supposed to capture the variety of the world but too often reflect the biases of their authors. In 2018, the computer scientists Joy Buolamwini and Timnit Gebru showed that the facial recognition algorithms built by Microsoft and IBM were far less accurate when trying to identify non-white faces – a reflection of the bias of their training data. As algorithmic mechanisms are embedded ever more deeply in our lives, handling decision-making in such domains as policing, healthcare and education, the bias of these systems becomes increasingly problematic, reinforcing and amplifying existing structural inequalities. During lockdown, many students in the US were forced to install ‘proctor software’: surveillance programs that can access the camera on a laptop to check that the student doesn’t leave their desk during exams. One analysis of a widely used piece of software, called Proctorio, found that it failed to identify Black faces more than 50 per cent of the time. Non-white students were faced with the extra pressure of making themselves visible to an AI surveillance camera. As one mother complained on Twitter, ‘Daughter 1 was taking an exam today being proctored by some type of software that apparently was not tested on dark skin. She had to open her window, turn on the lights, and then shine a flashlight over her head to be detectable.’ Another Twitter user responded: ‘THIS! There’s no reason I should have to collect all the light God has to offer, just for Proctorio to pretend my face is still undetectable.’

Sometimes there are reckonings, of a sort. In the late 2010s, scientific and technical journals began to recognise the harm the Lena image had caused. Many introduced bans on the picture’s use ‘without convincing scientific justification’, in the hope of creating a more welcoming environment for women in their field. But Mulvin notes that this change wouldn’t have happened without increasing cultural awareness of the Lena image: ‘Beyond just a winking reference to insiders, the image accrued a wider reputation as an icon of misogyny and misrepresentation within the world of computer science and its allied fields.’ A documentary from 2019, Losing Lena, featured Forsén herself, calling for an end to the use of her image. ‘I retired from modelling a long time ago. It’s time I retired from tech too,’ she said in a statement to the press. ‘We can make a simple change today that creates a lasting change for tomorrow. Let’s commit to losing me.’ Still, Lena’s image lives on, appearing in such pop cultural venues as the HBO sitcom Silicon Valley, in which the hero-coder is inspired by her picture to create a world-beating compression algorithm. Search for Lena now, and one result is from Playboy’s own website. A brief editorial celebrates ‘Lenna Sjööblom’ (not Lena Forsén) as an ‘internet icon’ who ‘played a key role in the development of electronic image processing’. Scroll down to the bottom of the page past a string of non-nude images and an advert hits you: ‘Want to see Lenna’s NSFW gallery? Join Playboy Plus to access.’
