When I deleted my Twitter account in September last year, provoked not by Elon Musk’s imminent takeover but by the suffocating quantity of royal coverage gushing from every media source, I was left feeling bereft, as any addict is when their drug is taken away. How was I supposed to react to the news now? And if I had no way of reacting to the news, what did I want from the news? Am I even interested in the news, if I have no opportunity to react to it? Being in the digital public sphere without any means to react is a bit like being trapped in a shopping mall without any money.

The timing was especially awkward since, a fortnight later, a news event came along that cried out for a reaction: Kwasi Kwarteng’s infamous ‘mini-budget’, which threw 45 years of economic orthodoxy overboard, provoked a stand-off between the government and the Bank of England, and very nearly triggered a financial crisis. Twitter gives users thirty days to change their minds after deleting their accounts, to prevent impulsive exits (i.e. to re-ensnare recovering addicts). I was still inside my thirty days. Stopping myself rejoining in order to react to this exceptional political event took considerable self-restraint. The moment I came closest to cracking wasn’t in response to the events themselves, though, but when I was tasked with managing my university department’s social media profile and came across this tweet by a prominent conservative commentator:

The louder the squealing from the left, the more certain @KwasiKwarteng and @trussliz will be that they have got this right.

This is the sort of culture war logic that has become known, courtesy of the American right, as ‘owning the libs’, the primary objective of which is to enrage (‘trigger’) the opposition by fair means or foul. In other online settings, it is known simply as ‘trolling’. The tweeter appeared to see the unhappy reactions of the left as the litmus test of good economic policy: Kwarteng was a good chancellor because he was a successful troll. ‘What an absurd way to judge policy!’ I wanted to respond. ‘This is idiotic!’ Yet, of course, in feeling that impulse, I was the one being drawn back into the economy of reaction. Who’s the idiot now?

Our public sphere is frequently dominated by events you could call ‘reaction chains’, whereby reactions provoke reactions, which provoke further reactions, and so on. Last year’s Oscars ceremony is remembered for just such a reaction chain. When the host, the comedian Chris Rock, made a joke about Jada Pinkett Smith’s shaved head, her husband, Will Smith, strode up on stage and slapped Rock in the face on live television. For several days afterwards, countless commentators, celebrities and social media users sought to distinguish themselves by their reaction to ‘the slap’. Inevitably, those reactions provoked further reactions, as debate turned to the merits of the positions taken, and suspicion descended on those who hadn’t yet reacted at all. Everyone waited impatiently for the Academy’s official reaction: would Smith be banned, and for how long? The amount of global attention ‘slapgate’ sucked up in the weeks after the ceremony was considerable.

One particular detail added a layer of intrigue. As a result of the blanket television surveillance of the celebrities in the auditorium, there was footage of Will Smith’s immediate reaction to Rock’s joke, which had been laughter. This impulsive response appeared entirely at odds with the anger he displayed on stage just a few seconds later. Was he acting? Was ‘the slap’ real? Or had his wife, perhaps, demanded that he step up? Every frame of the video sequence was pored over, as if it were the Zapruder footage of the Kennedy assassination.

Thanks largely to the spread of smart scrollable devices in the last fifteen years, a certain concept of public participation – what is now known in the managerial vernacular as ‘engagement’ – is common to events of this sort, and to the way they are framed by the media. The individual is not conceived in the same way as in the liberal philosophical tradition – as an autonomous agent, possessed of reason and interests – or in the psychoanalytic tradition, as shaped perhaps unconsciously by past conflicts and injuries. Instead, each of us (celebrities included) becomes a junction box in a vast, complex network, receiving, processing and emitting information in a semi-automatic fashion, and in real time. Information and emotions bounce between these junctions, mutating as they travel, as instantiated in the memes and jokes that spread virally via social media platforms. In this model, each individual reaction is one more item of information thrown back into the network, in search of counter-reactions.

Identifying the mechanisms and patterns of human reaction was one of the founding ambitions of scientific psychology, and what first enabled it to separate itself from philosophy. Societies that pledge to respect the rights and choices of the individual have also developed an empirical fascination with the way individuals are influenced by their environment. Early developments in psychological experimentation by Gustav Fechner and Wilhelm Wundt in Leipzig in the late 19th century focused heavily on mental and neural responses to sensory stimuli, for instance the time that elapsed between a stimulus and a conscious response.

The behaviourist tradition that came to dominate American psychology in the 20th century, pioneered by John B. Watson in the 1910s and later identified with B.F. Skinner, was established with the explicit aim of rendering human responses predictable and thereby controllable. Psychology would abandon any theory of mind in favour of data on observable behaviour. Behaviourists imagined a wholly conditioned, programmable person whose reactions are perfectly compatible with their environment, a fantasy that prompted utopian visions (such as Skinner’s novel Walden Two, from 1948) as well as dystopian ones (such as Anthony Burgess’s A Clockwork Orange, from 1962).

Elements of behaviourism became enmeshed with psychoanalysis in mid-20th-century America via the work of the psychiatrist Adolf Meyer, whose theoretical approach dominated the profession there between the Second World War and the anti-psychoanalytic turn of the 1970s. Meyer’s medicalised co-option of Freud involved observing patients in controlled environments, so that their day-to-day behaviour could be assessed. As Meyer had written in 1908, ‘our comparative measure of the various disabilities is the normal complete reaction.’ A good psychiatrist was one who could distinguish ‘efficient’ from ‘inefficient’ reactions (where ‘efficiency’ meant a conventional adaptation to one’s environment, for example in the way one dresses or uses money), an assessment that could only be made in the controlled environment of the asylum or hospital.

The behaviourist tradition has revealed a lot about how humans (and other animals) respond to different stimuli, and what might count as a ‘normal’ or ‘healthy’ reaction. But it tells us nothing about the vast amounts of time and labour that societies such as ours now invest in actively trying to generate and capture reactions of various kinds, not just in science laboratories or hospitals, but across the economy, public sphere and civil society. The figure of the ‘troll’ is one example of this, the agent who goes out of their way to trigger a reaction – typically an angry reaction – from their target, and laughter from their peer group. But anyone who has visited a famous landmark or picturesque beach in recent years will be familiar with a phenomenon that’s no less strange: photographs meticulously staged for posting on Instagram. The time and effort that goes into the production of these images (including costume, hair and make-up) is unlike anything that took place in the age of the analogue photo album, and is only explicable in terms of the feverish hunt for online reaction. The status of photography in everyday life has undergone a profound transformation as a result of this reciprocal interaction between photographer and viewer. In turn, the design of public landmarks and home interiors has changed, with the aim of producing more engaging visual content.

Think also of the ceaseless emails that now follow online purchases, as companies demand to know about our ‘experience’ of having a parcel shoved through our letterbox. The more usual term for what is being sought here is not ‘reaction’, but ‘feedback’. This term was popularised by cybernetics, an interdisciplinary field that emerged from the Second World War, and brought together psychiatrists, computer scientists and biologists in a collective effort to understand the way organisms – and potentially machines – constantly adapt to a changing environment. From the cybernetic perspective, humans – like animals and all complex systems – pursue their objectives by constantly adjusting their behaviour in response to ‘feedback’. Reaction becomes a two-way street, as new information is constantly processed and behaviour adjusted in response. Every animate being reacts to the reactions of those around it. Among other things, this provided a whole new way of understanding and justifying markets: price signals could be understood as feedback to which traders must constantly react. In societies such as ours, the possible reactions of the financial markets hover over every political move, as Truss and Kwarteng were unceremoniously reminded.

The Instagrammer on holiday learns how to increase follower engagement by monitoring the reactions that their images receive; the delivery company keeps track of its logistics (and surveils its drivers) on the basis of the customer feedback it receives; the financial analyst scours their Bloomberg terminal for the all-important price movement. The most important thing about feedback isn’t whether it’s positive or negative, but that you get it in the first place, and sustaining a constant feedback loop requires constant vigilance and work.

As an academic, I know only too well the pains that universities take to get their students to give feedback via the National Student Survey. Negative feedback is a worry, of course, but the real fear is that students won’t take the survey at all: if a university department fails to meet the minimum response threshold, it will disappear from the league tables. Similarly, the online influencer’s fear isn’t negative reactions, but that ‘engagement’ drops. In a cybernetic context, the individual or organisation that receives no feedback has ceased to change or evolve, and is to all intents and purposes dead.

The term ‘cybernetics’ derives from the Greek kybernetes, the steersman of a ship. The problem that preoccupied cyberneticians was how complex systems – whether brains, organisations, swarms of insects or computer networks – are brought under control. Work that out, and the next question is how they might be steered towards some kind of goal. Feedback, for cyberneticians, is the information – from the speedometer on a dashboard, for example, or the feeling in the stomach registered by the brain as ‘hunger’ – that tells the steersman to adjust their behaviour in a particular way, the better to reach their destination.

Yet much of the anxiety provoked by today’s reaction economy consists in the possibility that, in our desperate hunt for feedback and our need to give feedback to others, we allow ourselves to be steered in directions we did not consent to, and may not wish to go. This has echoes of the mid-20th-century fears of advertising, PR and propaganda, with the difference that now, in the age of reaction chains, we are drawn towards controversy, absurd public spectacles, endlessly mutating memes, trolling etc. In these showers of feedback, much of the appeal is in the sheer quantity of reaction being circulated. Feedback mechanisms, which the cyberneticians viewed as instruments to achieve autonomy and facilitate navigation, turn out to be a trap.

In 2005, the former Arsenal and England footballer Ian Wright was filmed at the old Arsenal stadium at Highbury, as part of an ITV documentary about his life. The narrator talks about a schoolteacher, Sydney Pigden, who had been a father figure to Wright when he was growing up in South London. Wright believed that Mr Pigden had died. He is filmed gazing out over the pitch from the stands, when he turns round suddenly and looks as if he’s seen a ghost: Mr Pigden is standing in front of him. Wright double-takes, then removes his cap in deference. ‘Mr Pigden … you’re alive.’

This clip wasn’t uploaded to YouTube until 2010, where it has since had nearly six million views. There is something irresistible about watching Wright’s face switch from nostalgic contentment as he gazes over his old ground, to shock and then tears as his past appears in front of him. The footage was taken from a TV documentary, but the YouTube-enabled notoriety of the clip links it to one of the most popular genres of online video content, the ‘reaction video’. A seminal event in the emergence of this genre took place in 2006, when a YouTube user uploaded some grainy video footage of two children, Brandon and Rachel, unwrapping a games console on Christmas Day eight years earlier. ‘N Sixty Fouuuurr’, nine-year-old Brandon roars. ‘N Sixty Fouurrrrr’. The video, entitled ‘Nintendo Sixty-FOOOOOOOOOOUR’, has had 25 million views.

In the years since then, social media platforms have become awash with content that focuses on impulsive emotional responses – to an unexpected gift, a shocking piece of news, the experience of playing a video game, or listening to a famous song for the first time. YouTube features hundreds of videos of deaf children hearing for the first time thanks to new cochlear implants. Just as holidays are now choreographed for Instagram, big surprises are now carefully staged to generate video content. A former colleague of mine with young children hadn’t seen his parents, who live overseas, for more than two years because of Covid restrictions. When he was finally able to book a trip, he kept it a secret from them, so that he and his family could arrive unexpectedly and capture the grandparents’ reaction on his smartphone for posterity. There are costs to decisions like these. The grandparents were denied the pleasure of having something to look forward to and planning things to do on the visit, not to mention having enough notice to keep their diaries free. But the costs were deemed worthwhile, in order that one magical moment might be captured, then later replayed, and perhaps posted online to be seen by others.

What’s really going on here? How have we come to imbue the split-second emotional response with so much cultural and moral value, to the point where significant moments in our lives can be arranged in pursuit of it? On a purely technological level, it is a function of the vast capacity for surveillance that now exists, thanks to the ubiquity of smartphones and other recording devices. The analogue precedent for reaction videos was Candid Camera, which first aired in America in 1948 and became a massive television hit in the 1950s and 1960s. Candid Camera arranged to put unsuspecting people in absurd situations as they went about their everyday lives, and secretly filmed the results; the climax came when the target was dramatically told ‘You’re on Candid Camera,’ to much shock and hilarity. As Bradley Clissold has argued, the success of Candid Camera can be understood in the context of Cold War anxieties over surveillance: it provided a humorous release from the pressure of being, or feeling, watched, an unsettling new affect in postwar America.

Charles Darwin’s The Expression of the Emotions in Man and Animals (1872), seen by some as the origin of modern understandings of emotion, included extensive interpretation of emotional reactions on animals’ faces. Darwin was an enthusiastic photographer: the recent invention of the camera enabled him to capture and study fleeting facial expressions in a way that hadn’t previously been possible. The quest for the ‘authentic’, unstaged reaction is accelerated by the drift towards ubiquitous videography. But we also have to reckon with the fact that, in a culture producing a vast over-abundance of ‘content’, snippets showing emotional impulses seem to retain a value little else can match. Across various genres, embodied responses to cultural artefacts carry greater worth than the artefacts themselves. The streaming platform Twitch (which currently has 140 million monthly users worldwide) enables people to watch other people playing computer games. The gamer seeks to expand their audience, so as to win sponsorship and donations, not just by being very good at the games but by displaying an emotionally engaging persona as they play.

Aside from being an elite gamer, the successful Twitch user must be capable of sharing their sense of surprise and excitement as the game unfolds. Similarly, YouTube users film themselves listening to well-known pop songs for the first time. In one of the most famous examples – the video has had ten million views – twin brothers Tim and Fred Williams listen to Phil Collins’s ‘In the Air Tonight’ for the first time and express their astonishment and delight when the drum break hits four minutes in. The most successful reaction video-makers have a likeable, innocent air; they listen in a spirit of wonder, not unlike a baby hearing something for the first time.

The allure of these videos consists at least partly in the fact that, very often, the viewers will be familiar with the song, and might find it hard to respond to it themselves. The endless replaying of classic pop songs from the second half of the 20th century makes it difficult to retain a sense of a song’s specialness or remember the way it once sounded. The music reaction video provides a tunnel back to a pre-digital era, before the ‘long tail’ of old content was so instantly and freely available. Just as the successful Twitch streamer supplies a vicarious experience of elite-level gaming, the successful music reaction video-maker becomes a conduit for an appreciation of value that has been buried beneath a mountain of Radio 6 Music and Spotify playlists. The music reaction video, like vinyl listening parties and ‘anniversary live tours’ of classic albums, attempts to excavate an aesthetic impact that has been smothered by the digital archive.

There’s an ambiguity here as to where the real value lies. Is the reaction testament to the value of the song (or the computer game, the Christmas present or whatever else)? Or is the artefact just an instrument used to provoke the impulsive emotional reaction? The simultaneity of artefact and response suggests a novel synthesis of criticism with behavioural experiment. Something similar goes on in the strain of contemporary cultural documentary-making in which celebrity broadcasters such as Simon Schama or Grayson Perry are filmed gazing at an artefact they admire. The camera often lingers just as long on the presenter’s entranced face as on the object itself, as if the real clues to its value lie not in its form or colour, but in the facial expressions of the person viewing it. A kind of mirroring is enacted between human face and artefact, with each put in service of revealing the other.

The extraordinary growth of reaction content is connected to a much larger transformation in the way we have come to view ourselves and our societies over the past thirty years or so. As the sociologist Nikolas Rose has detailed, the rise and cultural influence of the neurosciences since the early 1990s has brought about subtle but profound transformations in what we consider a human being to be. The psychology and psychoanalysis that originated in the late 19th century were founded on a belief in ‘the mind’ and ‘consciousness’ (concepts that were steadily sidelined by the behaviourist revolutions that followed), but people are now encouraged to understand themselves as neural creatures, possessed of brains whose chemistry and physical make-up shape the way they feel and behave.

Two neuroscientific discoveries in particular have permeated our culture, even if we scarcely notice them. The first is ‘brain plasticity’: the idea that we are physiologically altered by experiences, especially those we have when young. While it would be a stretch to describe videos of people listening to songs for the first time as neurological experiments, the fascination with them stems from a similar philosophical principle, that experiences are like footprints in snow, leaving a physical mark. The naive listener is imprinted by a song, which gives the viewer the chance to witness the song’s impact, unpolluted by previous experience.

The second neuroscientific finding that has implanted itself in the culture is that of ‘mirror neurons’, which have been proposed by some as the physiological basis of empathy. When we witness someone else apparently experiencing an intense emotion or feeling, our own neural circuits are affected in turn. Thus to see someone in acute pain provokes activity in the parts of our own brains that are associated with pain. Reaction videos could be explained in similar terms: excitement, fear, pleasure and surprise can be experienced in small, manageable doses by witnessing them on the face of the other.

The mediation of neural empathy via touchscreen technology allows emotions and moods to travel mimetically and virally. But there is one other interface on which a great deal relies: the human face itself. The quest for authentic joy or shock – or best of all, joy and shock at the same time – which drives reaction content endows the human face with a communicative magic that words cannot match. The changing face of Ian Wright or the baby who hears for the first time is taken as a guarantee of truth in a world of fakery and scripted dialogue.

It is an infernal riddle of digital culture that ‘authenticity’ is constantly breeding its opposite: the ‘spontaneous’ event that proves to be no such thing, the ‘surprise’ that turns out to be staged, the emotional outburst that has been practised. TikTok is awash with apparently ‘authentic’ clips of humorous reactions (often based on pranks), the comments on which are preoccupied with whether or not the interaction is ‘real’. The human face, the standard for emotional truth, is also the basis for emojis and Facebook ‘reactions’, now an entire system of signification capable of conveying considerable meaning, but one from which the promise of authentic or immediate emotion has been lost. Any culture that lavishes praise on ‘authenticity’ to the extent that ours does will be beset by worries regarding ‘fakery’.

This culture has generated a distinctive type of celebrity or influencer, who hovers in the ambivalent spaces between critical judgment and behavioural impulse, authenticity and studied performance. Talent shows such as The X Factor and Strictly Come Dancing revolve around the facial reactions of celebrity judges; figures such as Simon Cowell are specialists in the manipulation of an eyebrow or the spontaneous look of surprise. Seasoned characters such as Piers Morgan are cynically aware that what will keep them in the spotlight is the force, distinctiveness and watchability of their knee-jerk responses, which are essentially designed to ignite reaction chains.

We have no term for this type of celebrity or authority, one who successfully maintains an influential public position through a capacity and willingness to react in spectacular ways. The public reactor is in part a descendant of the Greek chorus, which would share the stage with the actors in a play, responding to events as they unfolded. An exaggerated capacity to react has been a significant factor in the fortunes of many unlikely political leaders in recent years. Donald Trump’s affective state is one of seeming constantly on the edge of losing his temper. He appears braced for an angry encounter at any moment, something that has added a sense of danger and excitement to his political career. Boris Johnson, by contrast, always appears to be on the verge of bursting out laughing. As many observers have noted, both men had ample time to practise their ‘authentic’ personae, on reality TV shows, panel shows and elsewhere. The question of what these men are really like seems somehow unanswerable.

Those with a pronounced and visible capacity to be publicly enraged or publicly amused (it is Nigel Farage’s distinction to appear forever angry and amused at the same time) have been central to politics in the last decade, and to the ‘populist’ upheavals that have afflicted liberal democracies. The continually enraged or amused political leader appears to serve as a representative, or emotional prosthesis, for those whose hostility to contemporary politics otherwise has no outlet. Rage and laughter have also acquired important political and critical functions in this digital public sphere, where they animate the denunciation of political and economic systems in a context where the formal or ‘mainstream’ mechanisms of evaluation and judgment have come to seem rotten. Political rage often expresses a refusal to accept an entire regime of fairness: in the case of Trump and his followers, a regime that treats people equally regardless of skin colour and that opens a space for a confrontation with the injustices and violence of the past. Political laughter, on the other hand, is a way of devaluing value systems, trivialising them and exposing their putative absurdities. In conditions of political polarisation, both sides use satire and mockery to deflate the sincerity and righteousness of their opponents. The message of political laughter is that the wrong things are being taken seriously.

Anger and humour are parallel reactions to a world that appears to have lost the capacity to recognise genuine injustice, and has become fixated instead on phoney injustices – a world that is ostensibly diverted by petty offences at the expense of real harms. Yet while anger and humour collide publicly in the pranks and interventions of trolls, they can also reveal the deeply held conviction that justice and injustice have become confused, or even inverted. What is held up as ‘real’ is actually fake, and what is dismissed as ‘fake’ is actually real.

We may not have a precise term for the celebrity reactor, but the lexicon of political ideologies does contain one entry that speaks to my concerns: reactionary. Dating back to the French Revolution, the term has been used to describe those on the right who go beyond the pragmatic, gradualist claims of conservatism in order to reject an entire social order in the name of a lost past. Where the conservative seeks to temper the ambitions of the progressive and to highlight contingent sources of social solidarity (such as religious community or cultural identity), the reactionary seeks a more wholesale reckoning with modernity, angrily stripping away its delusions and falsehoods. For the true reactionary, the establishment (as cherished by conservatives) is too weak, too complacent, to resist the threat of progressives and revolutionaries. A more aggressive right-wing agenda is needed, which would reverse not only the gains of the left, but the long decline of the establishment that opened the door to the left in the first place. For the reactionary, the ordinary conservative has been asleep at the wheel.

Such a politics is only stirred into being thanks to the energies and actions of the left. Indeed, the reactionary shares with the revolutionary a sense of how much is at stake. There is a vitalism at work – perhaps even a proto-cybernetic imaginary – in which the energies and methods of the left are channelled into the right in the form of rage, revitalising something left dormant during long periods of stability. The trajectory is towards fascism, which has historically tended to originate in anti-communism. Part of political reaction is the psychological allure of heroic defeat, in which all the momentum and power is attributed to one’s opponents, yet one somehow clings on, still alive. Today, the reactionary right (think of Suella Braverman) thrives on the fantasy that we have suffered long decades of cultural victories for Marxists, Blairites, wokeism, feminism and critical race studies, forces which have controlled the government, multilateral institutions and above all universities, and that the active resistance has been reduced to a small band of guerrilla fighters, holed up in embattled newspapers and opaquely funded think tanks.

What are we to make of a society that invests so much in reactions of one sort or another? Part of the allure of the celebrity reactor or the reaction video derives from a wish that our impulsive reactions should be the ‘right’ ones – that we might avoid the regret of l’esprit de l’escalier. One of the difficulties with anger and humour as modes of denunciation is that these impulses don’t always seize us precisely when we’d like them to, or in response to the right thing. Will Smith seems to wish he could be someone who reacts angrily, rather than humorously, when his wife is the object of a cruel joke. (It’s odd, when one thinks about it, that we speak of people having a ‘good sense of humour’, but not a ‘good sense of anger’, despite the huge charisma we attribute to those who display such a thing.)

The cult of reaction is perhaps a symptom of what the Frankfurt School psychoanalyst Erich Fromm described as the ‘fear of freedom’, which he believed was innate in human psychology. As Fromm wrote in his 1941 Fear of Freedom:

From the beginning of his existence man is confronted with the choice between different courses of action. In the animal there is an uninterrupted chain of reactions starting with a stimulus, like hunger, and ending with a more or less strictly determined course of action, which does away with the tension created by the stimulus. In man that chain is interrupted. The stimulus is there but the kind of satisfaction is ‘open’ … It dawns upon him that his is a tragic fate: to be part of nature, and yet to transcend it.

What provoked Fromm’s analysis was the authoritarianism from which he, as a German Jew, had fled. Submission to a leader and submergence within a crowd were means of escaping the anxiety and loneliness that freedom bestows on all of us. ‘The person who gives up his individual self and becomes an automaton,’ Fromm wrote, ‘need not feel alone and anxious any more.’

Pessimism is understandable in an exile writing during the Second World War, and the world of Twitch gamers and music reaction videos might seem far removed from the one he described. Even so, the behaviourist turn and our crude embrace of a neuroscientific imaginary clearly signal an attempt to displace a modern idea of human ‘freedom’ with a naturalistic idea of impulse, that is, to insert human society back into the animal world, where feelings of responsibility, anxiety and guilt are absent. Reactionary political rhetoric has more obvious resonances with the phenomena that concerned Fromm: reactionaries’ explicit purpose is to restructure society around ‘natural’ hierarchies of race and gender, and to dehumanise those who, for example, exercise the perilous freedom to cross the English Channel in a small boat.

The political threat lurking here is that certain individuals purport to transcend automatic behavioural chains by sheer force of will. Conservative ideology since the 19th century has also found a place for certain individuals – entrepreneurs, leaders, executives, romantic artists – who have the strength to remain autonomous in a society based on obedience and conformity. Today, a variety of online political influencers, who bring together elements of the business guru, stand-up comedian and professional provocateur, position themselves as originators or leaders to be followed. The success of Jordan Peterson or Andrew Tate consists in the paradox of instructing men in how to be less obedient.

The idea here is that while everyone (animals included) is capable of reaction, only a rarefied minority is capable of genuine action. Action, from this perspective, means leadership, which in turn implies a far larger quantity of followership. Combating this mentality requires us to think of action democratically, as something made possible by the fact of human plurality. Thus all action is in fact interaction. This is broadly what Hannah Arendt was getting at in The Human Condition (1958), where she defines action as a form of initiation, a giving birth to something, a revealing of oneself to others.

In Arendt’s view, action in the political arena differs from mere social or economic behaviour in being impervious to statistical or probabilistic analysis: ‘The fact that man is capable of action means that the unexpected can be expected from him, that he is able to perform what is infinitely improbable.’ This has an air of wartime existentialism, in which ‘action’ might involve heroic resistance to tyranny and brutality, but Arendt’s intention is to locate this disruptive, world-making kind of freedom squarely in the public sphere, such that day-to-day political participation might become imbued with these forms of responsibility. The challenge of democracy is to ensure that action remains rooted in the egalitarianism of a diverse public sphere, rather than lapsing into cult behaviour, reaction or obedience (whether to the diktats of the leader or the ‘laws’ of economics). For Arendt, this required that the unpredictable, incalculable qualities of politics – as distinct from economics or psychology – be kept in view at all times.

Action implies freedom and it implies uncertainty, neither of which is always easy to live with. Fromm, uncharacteristically for an associate of the Frankfurt School, did offer one note of optimism in his analysis. Individuals could become comfortable with freedom, while avoiding the anxiety and loneliness it often entails, by cultivating their capacity for what he called ‘spontaneity’, something he believed was missing from much of the adult population in the West in the 1930s and 1940s. Spontaneous activity isn’t ‘compulsive’ activity, he noted, but an exercise of one’s whole self, both body and mind. This, he believed, had been largely eroded from Western societies by the demands of corporate capitalism in the 20th century, but it remained something children and artists were capable of, and which we all experience in snatches from time to time. Echoing Freud, Fromm suggested that it is in work and in love that humans encounter their own spontaneity, and where they rediscover a unity between mind, body and world.

Fromm was writing in the mid-20th century, a time when social criticism was preoccupied by the conformity of ‘mass society’ and ‘organisation man’. Today, post-1960s, we pride ourselves on being ‘spontaneous’ and ‘creative’. But we have turned spontaneity into something carefully staged, mass-produced and gawped at. Spontaneity that is too alert to reactions – too motivated by the quest for engagement – isn’t spontaneity at all, but a strategic ‘authenticity’, with the power dynamics of the prank. Political leaders whose charisma lies in seeming oblivious to the consequences of their own actions remind the rest of us that we are haunted by the potential reactions and feedback to our own actions.

Fromm was concerned by the way mid-20th-century educational norms sought to eradicate the ‘spontaneity’ of the child in favour of discipline. Emotional responses were trained out of the child altogether. Today, with so much talk of mental health and ‘wellbeing’, we may believe we’ve moved on from this, while at the same time being mystified as to the reason our society generates such vast anxiety among young people. Children today may not submit to Victorian moral authorities (though the behavioural agenda in many English schools may suggest otherwise), but they are undoubtedly cowed by the authority of reactions. We can blame Instagram, WhatsApp and the possibilities for cyber-bullying afforded by life online, but we should also consider the ways technologies of feedback and reward have become embedded in every moment of a child’s schoolday, thanks especially to the widespread implementation of EdTech. Spontaneity and action become impossible when they are already overtaken by the range of possible reactions that might ensue, not just from teachers but from the numerous systems that track a child – or an adult for that matter – throughout the day.

Arendt wrote that since political action is a form of radical initiation, of starting again, it is fraught with danger. It can turn out badly. It can disrupt existing routines. We turn away from politics (and towards authoritarianism or, as Arendt also noted, unworldly stoicism) because there is something intimidating about the sheer novelty of which politics is capable. To overcome this obstacle, two things are required. First, people must be capable of making and keeping promises to one another, such that there isn’t a constant threat that everything will start all over again. If organisations and associations are to be sustained over time, the power of political action must be held in check by past commitments. Second, if we are to escape the shadow of the past and genuinely start anew, forgiveness must be possible. Forgiveness, for Arendt, holds a very important role in enabling us to break free of perpetual reaction and counter-reaction.

In contrast to revenge, which is the natural, automatic reaction to transgression and which because of the irreversibility of the action process can be expected and even calculated, the act of forgiving can never be predicted; it is the only reaction that acts in an unexpected way and thus retains, though being a reaction, something of the original character of action.

Forgiveness is unique in being a reaction which breaks the chain, making a fresh start possible. It has been observed, of late, that there is a deficit of forgiveness in ‘cancel culture’ and the reputational attacks that thrive in the reaction economy. But digital platforms are anti-forgiveness machines by design. The broader question is how any of us – but especially children and young people – can become comfortable with our own freedom, our own spontaneity, against the backdrop of surveillance capitalism, which is the real condition of the reaction economy.

Letters

Vol. 45 No. 6 · 16 March 2023

William Davies mentions Charles Darwin’s The Expression of the Emotions in Man and Animals, the first English-language science book to be illustrated with photographs (LRB, 2 March). But Darwin did not take the photographs used in the book; nor did the photographic technology of the time allow the capture of fleeting facial expressions.

Darwin acquired the photographs for the book from three sources. Some were already on sale to the general public; he got permission to reproduce others from an earlier book by the French physiologist Guillaume-Benjamin-Amand Duchenne de Boulogne; and he commissioned a number of original photographs, mainly from the London-based Swedish photographer Oscar Gustave Rejlander. Duchenne’s images, captured with the help of the pioneer photographer Adrien Tournachon, include several of a toothless old man whose contorted facial expressions were obtained – and frozen long enough to photograph – by applying electrodes to different combinations of muscles in his face. Duchenne claimed the old man had a medical condition that rendered him impervious to pain. Rejlander, having struggled to coax his models into providing and holding realistic expressions, trimmed his magnificent moustache to make his own face easier to read, and provided Darwin with several selfies in hammy, melodramatic poses.

Richard Carter
Hebden Bridge, West Yorkshire

Vol. 45 No. 10 · 18 May 2023

Richard Carter describes Darwin’s The Expression of the Emotions in Man and Animals of 1872 as the ‘first English-language science book to be illustrated with photographs’ (Letters, 16 March). Could I plead the case for Anna Atkins’s Photographs of British Algae: Cyanotype Impressions, published nearly thirty years earlier, in 1843?

An obvious difference is that Darwin’s photographs were black and white, while Atkins’s (which, unlike Darwin, she took herself) were blue and white. The 19th century saw an explosion (sometimes literally) of photographers experimenting with various light-sensitive chemicals. Atkins used cyanotypes, pressing her seaweeds onto paper coated with photosensitive iron salts, which turned blue when exposed to light. (She moved in scientific circles, and learned of cyanotypes from their inventor, her friend John Herschel.) The photographs in British Algae are unique images that were time-consuming to produce: an estimated six thousand cyanotypes were needed for the dozen or so copies of the book, which Atkins published herself.

Thirty years later, printing had progressed. The illustrations in Darwin’s book are heliotypes: cameras were used to produce negatives on papers treated with silver-based salts, and those negatives were then exposed onto printing plates. Cyanotypes fell out of favour, but their beautiful blue colour lingers in the popular imagination: this is where we get the term ‘blueprints’ for exact copies of technical drawings.

John Bothwell
Durham University
