Like the 1950s

Thomas Jones · 'Physics of the Future'

In Understanding Media (1964), Marshall McLuhan distinguished between 'hot' and 'cool' media: hot media, like the radio, are 'high definition' but 'low in participation'; 'cool media', like the telephone, are 'low definition' but 'high in participation'. (In the early 1960s, TV was 'cool', compared to the 'hot' movies. Obviously that was long before the arrival of hi-def.) Predictions about the way technology is heading, whether made by SF writers or tech companies, tend to assume the future will be hot. Characters in Brave New World go to the Feelies. Thirty years ago, everyone (well, maybe not everyone) imagined that by now we'd be watching holographic movies and wandering around with virtual reality helmets on. But no one foresaw the rise of text-messaging or Twitter. Michio Kaku's whiggish Physics of the Future, published last month, follows the trend, confident that the future will be lived in high definition.

The weirdest thing about Kaku's book isn't the 'startling and provocative vision of the future' promised by the publicity material, but how incredibly old-fashioned it all seems. Kaku, a professor of physics at the CUNY Graduate Center, begins the book with a confession of how much he loved watching Flash Gordon on TV as a child. It shows. Laser-propelled spaceships, robot dogs, self-driving magnetic cars, colonies on Mars, surfing the internet telepathically through our contact lenses – weren't we, according to the SF of the 1950s and 1960s, already meant to have all this stuff by now?

Kaku's final chapter, 'A Day in the Life in 2100', reads like Asimov on a very bad day:

January 1, 2100, 6:15 a.m.
After a night of heavy partying on New Year's Eve, you are sound asleep.
Suddenly, your wall screen lights up. A friendly, familiar face appears on the screen. It's Molly, the software program you bought recently. Molly announces cheerily, 'John, wake up. You are needed at the office.'

In many of the important ways, you'll notice, 2100 will be remarkably like the 1950s.

On the other hand, the entire species may be wiped out at some point in the next 89 years, and a day in the life in 2100 will begin more like this:

After a night of heavy sulphuric acid consumption, the colony of extremophile bacteria had doubled in size.

Or maybe not, since apocalypse, like techno-utopia, has a tendency to be permanently postponed.


  • 10 June 2011 at 6:26pm
    BrendanCByrne says:
    While I appreciate the point Mr. Jones is making, perhaps it would be best if he amended his characterization of SF specifically to midcentury or earlier. Any cursory reading of the cyberpunks, as well as their progeny, would show that there were/are SF writers interested in low def/high participation, many in incredibly prescient ways.

    • 10 June 2011 at 10:19pm
      Good point. I think many post-Gibson SF authors write about 'participating' primarily with media in extremely solipsistic ways because they saw cyberspace as a logical extension of the late 20th Century's erosion of human connection. Gibson's early characters are certainly as cold and hard as their tech. It's also worth asking whether the Facebook, the dominant paradigm, is really all that participatory.

    • 10 June 2011 at 10:21pm
      BrendanCByrne says: @ BrendanCByrne
      I also just referred to it as 'the Facebook'. Which leaves me kind of speechless.

    • 13 June 2011 at 12:37pm
      But is the cyberspace of Neuromancer equivalent to the web? I don't think so.

      Cyberspace was the underlying structure of data that Case experienced: a virtual audio-visual space, capable of being explored, and where one's own presence also registered (unless it could be masked). The electrodes on the head were facilitators of exchange in both directions, i.e. security programs can 'flat-line' the user, potentially killing them. It remains closer to the representation/conceptualisation engaged in by hackers, rather than users, so the social dimension did not feature. (Anybody who is accustomed to quantitative viewing of data, as 'packets' rather than as content, is familiar with this view of the internet.) Gibson's move was clever, because it turned the hacker's position, potentially, into that of somebody sat in the electric chair, a place where Julian Assange, that 'thief of cyberspace', may yet end up.

  • 10 June 2011 at 6:56pm
    alex says:
    People only invent what they can imagine, which is often conditioned by their tendency to orient themselves historically. Italo Calvino has an interesting essay on the cinema where he shows it not to be innovative except insofar as the technology enables the extension of older forms, inherited from the circus, carnival &c.

    • 13 June 2011 at 12:49pm
      pinhut says: @ alex
      This fits with McLuhan's argument that each new medium raids previous media forms for its content. However, there is also something essential within each media form, something not capable of being translated from one to the other, and so cinema is different inasmuch as there will be particular effects/potentialities that are peculiar to it. With cinema, the editing process is materially different to that of circus, carnival, photography, etc., so in this aspect of cinema there emerged a whole range of strategies/experiments to exploit this potential, the shower scene in Psycho being a famous example, because it is not just about what cinema can be, but also about how it is not theatre, photography, literature, etc.

    • 13 June 2011 at 1:33pm
      alex says: @ pinhut
      i can agree with a lot of this. and media are self-referential, a film will be about itself and its formal/technical qualities. But I'm still amazed, reading sections of description especially in fiction, e.g. Balzac, Zola or Conrad, by the extent to which they are 'cinematic'. Film tried to supply the plenitude perceived to be lacking in literary description, but the latter, functioning as it did on synecdoche and suggestion, often did a better job of making the reader 'see' scenes. (I'm thinking especially of the rendering of cityscapes, interiors, aspects of material culture.)

    • 13 June 2011 at 2:40pm
      pinhut says: @ alex
      I think that, rather than describe literary works that precede cinema as cinematic, I'd just call them 'psychological': both forms rest upon the psyche, and cinema is adept at reproducing aspects of the psyche already witnessed elsewhere. One question might be this: cinema/television is adept at presenting horror, far more so than other media, so it might be argued that the world has become a more horrifying place because of this fact.

      I find the novels of Thomas Bernhard to be the closest thing in art to experiencing how the mind functions; that's why I enjoy them so much.

    • 13 June 2011 at 7:37pm
      alex says: @ pinhut
      psychological is a good term, although a narrative, like a film sequence, needn't be a representation of a single mind or point of view (cf. impressionism, as a reaction to photography).
      I've never read Thos. Bernhard. Maybe I will now. Thanks for the tip.

    • 14 June 2011 at 12:10am
      pinhut says: @ alex
      Does 'psychological' limit something to a single mind? My single mind (small as it is) doesn't quite grasp the point.

    • 14 June 2011 at 9:32am
      alex says: @ pinhut
      probably not. the idea of a psyche transcending individuals has been around since ancient times. Maybe, to come back to the original topic, that's what we want technology to do for us, whether it's literary (logotechne, as the Greeks call literature), photographic, digital or fibre-optic.

  • 10 June 2011 at 10:05pm
    loxhore says:
    Of course, apocalypses are not collapses, but few would deny that collapses, in the Diamond sense, can be 'apocalyptic', in the bastardised sense, and that they really just aren't permanently postponed.

    C'mon, guys, chant it. Chant
    There is nothing ordinary about this at all
    there is nothing ordinary about this at all
