Steve Jobs 
by Walter Isaacson.
Little, Brown, 630 pp., £25, October 2011, 978 1 4087 0374 8

If you want to be loved in America, get rich and make it seem that you got rich doing exactly what you wanted to do and being exactly who you wanted to be. Invent a machine – or better, a series of machines – useful, affordable and attractive enough to be received as a fetish object by your customers. Happy customers will yield happy shareholders, jobs (in retail, if not manufacturing) and economic growth, and your life will be held up as proof that business can be spiritual. This is something America would very much like to believe.

Steve Jobs was born in 1955, on the cusp of the wealthiest years of the wealthiest nation in history. For many years Stanford University and the US military had been pouring financial and intellectual resources into the industrial corridor south of San Francisco Bay, where Jobs grew up, the area that would become Silicon Valley. He was brought up by a car mechanic and a bookkeeper – neither had a college degree – who adopted him shortly after birth. ‘Does that mean your real parents didn’t want you?’ a young neighbour asked him when he was about six. ‘Lightning bolts’ went off in his head, and he nursed a grudge against his unknown birth parents for giving him up.

His anger produced an explosive hypomania evident in his insistence that his parents send him to the extremely expensive Reed College in Portland. He refused to say goodbye to them, or thank you, or to allow them to go with him to the campus they had scrimped to send him to. ‘I wanted to be like an orphan … just arrived out of nowhere,’ he told his biographer. After one term, he realised that he could attend classes and hang out on campus without paying tuition fees, so he dropped out. Two terms later, he moved back to his parents’ house in California. The region’s economy was booming, and despite his long hair and general unwashedness he soon found a job as a $5-an-hour technician at Atari.

Jobs’s rudeness to his parents set a pattern. For the rest of his life he would make extreme and often arbitrary demands of everyone around him, and most would quickly yield to his will. One of the few who did not is his authorised biographer, Walter Isaacson, a former managing editor of Time. Isaacson’s book is packaged as a eulogy, with lots of family photos and a plain white cover whose Helvetican simplicity is characteristic of Apple’s own designers in Cupertino. The black and white photo on the cover is the one posted on Apple’s global website a few hours after Jobs’s death. But there is little sycophancy in the text. Isaacson ruthlessly catalogues the shortcomings of a monomaniac whose success allowed him to get away with all kinds of more or less sociopathic behaviour. Jobs had so much capital and technological firepower at his disposal that reality itself sometimes seemed to bend to his will. Isaacson calls this phenomenon Jobs’s ‘reality distortion field’, an inability to accept or even acknowledge facts contrary to those he wanted to see.

He belonged to a postwar generation that sought enlightenment and revolution – hoping, in his words, to ‘put a dent in the universe’ – and satisfied both ambitions within the bounds of capitalism. Isaacson credits him with having ‘revolutionised’ six or seven industries: ‘personal computers, animated movies, music, phones, tablet computing’, along with digital publishing and perhaps retail stores as well. The company that Jobs and a classmate started in his garage 35 years ago is now worth more than $350 billion. You can’t walk into a café or subway carriage or airport lounge in the developed world without seeing Jobs’s impact, in the form of hundreds of hours of human attention flowing through tens of thousands of dollars’ worth of devices bearing the Apple logo. This techno-liberation comes at a price: £1500 for an average MacBook with five or six years of useful life; roughly £1000 for two years of iPhone service. One measure of Jobs’s achievement is the money Apple earns from millions of individual consumers, revenue that has continued to increase while the global economy stagnates. Isaacson lists some of the ways in which our lives are shaped by minor design decisions Jobs made two or three decades ago: rectangles with rounded corners, draggable windows, folders within folders, files that open with a double click.

Apple’s early competitors (IBM, Xerox, Microsoft, Hewlett-Packard) were mostly run by hard-science types, executives with backgrounds in programming and engineering, joined by salesmen as the companies grew. This group was able to interact well enough with the corporate bureaucracies responsible for the bulk of computer sales in the 1980s and early 1990s. Today Apple is one of an emerging Big Four that controls the technology business. Two of them, Google and Facebook, remain under the control of the programmers who created their original products. (The third, Amazon, is headed by a computer scientist who spent his early career on Wall Street.) The founders of Google and Facebook have little skill at handling people, a weakness that shows in their products. Google packages its indispensable services in forgettable generic designs, some of which are tweaked on the basis of user data collected with eyeball-trackers. Facebook, which is based on the sharing of information, is continually apologising for undermining its users’ trust. Like their founders, Google and Facebook have trouble grasping the differences between people and machines.

Apple is the only major technology company for which technology is a slave to taste. Over and over again, Jobs made large bets on his own ability to tell the market what it ought to want, and the market obeyed. This is part of what made Apple different: the wild innovations of companies like Lockheed or Bell Labs married to the mass market design sensibility of Ikea or Topshop. Standing in between was Jobs. Often the only thing he brought to the table was his own perfectionism and vanity. Isaacson records a representative exchange between Jobs and James Vincent of the advertising firm TBWA, which helped come up with Apple’s ‘Think Different’ campaign. Years later, Vincent was Jobs’s contact in the team hired to produce ads for the iPad’s launch:

JOBS: Your commercials suck … you’ve given me small shit.
VINCENT: Well, what do you want? You’ve not been able to tell me what you want.
JOBS: I don’t know. You have to bring me something new.
VINCENT: You’ve got to tell me what you want.
JOBS: You’ve got to show me some stuff, and I’ll know it when I see it.

Exchanges like this appear to have been the essence of Jobs’s genius: he told talented people that their work was shit until they came up with something good enough for him to take credit for. The conversation with Vincent soon degenerated into a screaming match, but in the end TBWA came up with something Jobs liked, resulting in another successful Apple product launch. Isaacson records countless instances of Jobs’s ‘binary’ thinking: people were ‘gods’ or ‘bozos’ and their ideas were ‘amazing’ or ‘shit’. These weren’t just things he said to his employees: he said them to waitresses, hotel clerks and shop assistants. Despite his sophisticated deal-making and relentless focus on the quality of his products, Isaacson’s biography suggests that he spent most of his life behaving like a three-year-old.

Jobs made his start with the help of his friend Steve Wozniak, a brilliant naïf who at the age of 13 had built a machine that could add and subtract binary numbers. They met at an electronics class taught by a former navy pilot at Homestead High School. Their first joint venture was selling ‘blue boxes’, homemade consoles that emitted a series of tones to deceive the telephone network into allowing free calls – the sort of illicit tampering with products that Jobs would later do everything he could to prevent. Isaacson positions the pair between the 1970s counterculture and the do-it-yourself movements, as the search for new forms of consciousness shifted from drugs to technology. As a boy, Jobs had apprenticed himself to Larry Lang, a neighbour who worked as an electronics engineer. Lang introduced him to ham radio kits and the Hewlett-Packard Explorers Club, which gave him access to HP’s engineers and labs. A talented hustler, he marked up junked components to sell and impersonated a manufacturer over the phone to get free parts.

By the time he finished high school, Jobs had considerable expertise in electronics, especially in the rapidly evolving hobbyist market. But Wozniak was Apple’s Alexander Graham Bell. In 1975, after Jobs came home from Reed, Wozniak built what some people think of as the first personal computer: the Apple I. He initially wanted to give it away to fellow hobbyists, but Jobs persuaded him to make the venture a business, with Wozniak handling the engineering and Jobs in charge of administration, design and sales. Circuit boards for the first 50 Apple I units were put together by hand in Jobs’s garage with the help of his ex-girlfriend and pregnant sister. After the Apple I was released, Wozniak’s father told Jobs: ‘You don’t deserve shit. You haven’t produced anything.’ Jobs burst into tears, but Apple kept going. The pair set to work designing the Apple II. Unlike the Apple I, which was aimed at geeks with a knack for home assembly, the Apple II was a single object, with built-in software, keyboard and power supply. Commodore, Tandy and other firms were also trying to produce computers for a broad consumer market, but this was the first time anyone had done it right. Apple received three hundred orders for the Apple II at its launch in San Francisco in 1977. By the end of 1980, the Apple II was selling more than 200,000 units a year, Apple had gone public, and Jobs, at 25, was worth $256 million. Wozniak, whose role in the company had already diminished considerably, pulled back even further after a plane crash in 1981. Before leaving, he gave away portions of his Apple stock to founding employees whom Jobs had left out of the company’s IPO. These included Daniel Kottke – Jobs’s ‘soulmate’, Isaacson writes – who had followed him from Reed to the Valley and had helped assemble the first 50 Apple I circuit boards in Jobs’s garage. As an hourly employee, he wasn’t entitled to stock options.

For a few years Jobs and Apple continued to rise in tandem, and in 1984 the Macintosh was launched, with an iconic ad in which a female athlete throws a sledgehammer to shatter a huge screen showing a yammering Big Brother (it was first screened during the Super Bowl). Isaacson repeatedly refers to Jobs’s ‘unblinking stare’ or ‘mesmerising stare’ – one former employee compares it to Rasputin’s. Small meetings were dominated by his stares and his silences: a skill he had honed in college, modelling himself on his classmate Robert Friedland, who transferred to Reed from Bowdoin in Maine after being caught with 24,000 tabs of LSD and sentenced to two years in prison. For a time Jobs, who was already meditating and taking hallucinogens, ‘treated him almost like a guru’. Friedland introduced him to Eastern spirituality, and claims he briefly employed Jobs as foreman for ‘a crew of freaks’ at his New Age apple-farming commune – which may have been the inspiration for Apple’s name. Friedland went on to make billions in mining overseas. Jobs dismissed him to Isaacson as a ‘con man’, but their careers – from self-discovery to charismatic leadership to great wealth – bear out the similarities between them that friends noticed at Reed.

Before the string of product hits that began in 1998 – iMac, iPod, iTunes, iPhone, iPad – Jobs had a few lost years. In 1983, he recruited John Sculley, president of Pepsi-Cola, as Apple’s CEO. ‘Do you want to spend the rest of your life selling sugared water,’ he asked Sculley, ‘or do you want a chance to change the world?’ Isaacson dismisses Sculley, claiming he perceived ‘in Jobs qualities that he fancied in himself’. Their relationship fell apart in 1985, when Jobs launched an unsuccessful putsch, found himself relegated to a lame-duck post as Apple’s chairman and left for a new venture called NeXT.

Jobs’s plan for NeXT was to market powerful research computers for universities. The company foundered partly because of production delays, some of them brought about by Jobs’s perfectionism. Jobs demanded that the chassis of the first NeXT workstation be a perfect, seamless cube and spent a then unprecedented $100,000 on the design of the company’s logo. He had more immediate success with his purchase of Pixar from George Lucas. Pixar, which had begun as just another hardware and software company, evolved into a producer of blockbuster animated films. It was acquired in 2006 by Disney for $7.4 billion in stock, giving Jobs a return of more than a thousand times his initial investment. In 1996, Apple acquired NeXT and brought Jobs back. Jobs’s sobering experience, it seems, had tamed the impulsiveness that had, for example, led him suddenly to propose doing away with the name Macintosh in favour of ‘Bicycle’.

From the Apple II through to the iPad, what made Apple’s business model distinct from its competitors’ was the coupling of hardware and software, both of them proprietary and designed in house. As Isaacson explains, this meant that Jobs sacrificed the market share that comes with an open platform – such as Microsoft’s Windows, which he accused Bill Gates of stealing from the Mac’s interface. But Apple had unrivalled control over the user’s experience, and that translated directly into money. The iPhone, for instance, accounts for only 4 per cent of the mobile industry’s sales by volume but brings in more than half its profits. Having hooked customers on its superior designs, Apple can charge them monopoly prices. Jobs’s high standards aren’t the only reason Apple’s design is so much better than its rivals’. The company’s ability to suss out the best new materials and components from East Asian suppliers and then lock them up with exclusivity agreements before they can reach the rest of the market is just as important.

For devoted fans of Apple, the climax of Isaacson’s book will be his brief visit to the design studio in Cupertino. He is vague about what products Apple might have in development but gives a close account of Jobs’s collaboration with Apple’s chief designer, Jony Ive. Like most of Isaacson’s quoted sources, Ive goes to great lengths to frame his boss’s behaviour as part of his genius, but he too admits to feeling wounded when Jobs criticises an idea only to take credit for it a few days later.

Jobs’s personality might be best explained by the Apple slogan Isaacson uses as the book’s epigraph: ‘The people who are crazy enough to think they can change the world are the ones who do.’ These words might have sounded earth-shattering to computer buyers in 1997, when they were spoken in TV ads over photos of Einstein and Gandhi, but they repeat dogmas that fill American get-rich texts from Think and Grow Rich to The Power of Positive Thinking to The Secret. At their heart is a dime-store Hegelianism, the notion that objective reality will conform to our desires if we want what we want with sufficient force and get rid of anyone who gets in our way.

In business, the intensity of Jobs’s reality distortion was usually an asset. It allowed him to force those around him to internalise his impossibly high standards. Apple did not come up with the idea for a pocket digital music player or a tablet PC, but it was able to bring these concepts to market faster than any of its competitors, with superior levels of quality. Jobs’s attempts to censor reality were less successful in the rest of his life. He tried to deny paternity of the daughter he fathered at the age of 23, and was careful to settle with her mother before Apple’s IPO. More significantly for Apple’s shareholders, he spent nine months after being diagnosed with cancer in 2003 attempting to cure himself with faddish diets before coming round to his doctors’ view that he needed surgery.

His illness gave Jobs the perspective he needed to write his stirring 2005 Stanford commencement address. The speech is that of a man grappling with the contradiction between his apparent omnipotence and the fact that he is soon going to die. ‘Your time is limited,’ Jobs said, ‘so don’t waste it living someone else’s life. Don’t be trapped by dogma – which is living with the results of other people’s thinking.’ He demonstrated the importance of experimentation, failure and the pursuit of whatever seems interesting with stories from his own life. Anyone who reads the speech will understand why so many look at Jobs and see something transcendent. Even Isaacson, who spends so much of this biography as a detached observer, falls into eulogy in the book’s final pages, where he praises the ‘poetry’ and ‘artistry’ of Apple’s products, and says that using them ‘could be as sublime as walking in one of the Zen gardens of Kyoto that Jobs loved’. The comparison between an Apple design and a Zen garden laid out hundreds of years ago might be plausible today, at least for the diehards who post videos of themselves ‘unboxing’ Apple’s latest gadget. But give it a few years. In 2020, making a video call on an iPad will feel about as sublime as booting up an Apple II does now, while a walk through the gardens of Kyoto will feel much as it did in 1920, 1820 and 1720. Jobs’s achievement was to take ephemeral machines and make them seem permanent.
