See you in court, pal
- The Nudist on the Late Shift by Po Bronson
Secker, 248 pp, £10.00, August 1999, ISBN 0 436 20477 0
- Infinite Loop: How Apple, the World’s Most Insanely Great Computer Company, Went Insane by Michael Malone
Aurum, 598 pp, £18.99, April 1999, ISBN 1 85410 638 4
- Burn Rate: How I Survived the Gold Rush Years on the Internet by Michael Wolff
Orion, 364 pp, £7.99, June 1999, ISBN 0 7528 2606 9
- The Cathedral and the Bazaar: revised edition by Eric S. Raymond
O'Reilly, 256 pp, £11.95, February 2001, ISBN 0 596 00108 8
There are people who use computers. That, in the context of LRB readers and contributors, is most of us. Above them on the informational equivalent of the Great Chain of Being are the people who know about computers: the people who can tell us how to stick an unbent paper-clip into the hole above a wonky disc drive to make the floppy pop out etc. All of these people are now on the Internet. Above them are the bona fide geeks, who either work with computers professionally or are far gone in hobbydom. These people can write code (which is geekspeak for ‘write computer programs’), mark up HTML to create web pages, and know not only what’s going on, but also what’s about to go on. Above them are the übergeeks, the illuminati of the digital revolution: the kind of people, to use one example from Po Bronson’s entertaining Silicon Valley collage The Nudist on the Late Shift, who reprogram their BMW’s chips to make the car 40 per cent more powerful, the kind of people who, in computer terms, can routinely achieve the impossible.
Bear in mind that even in this group there are sharp differences in ability. As Robert X. Cringely, a (pseudonymous) commentator on the computer business who in 1991 published Accidental Empires, the first and still the best book on the growth of the industry, explains:
at the extreme edge of the normal distribution, there are programmers who are 100 times more productive than the average programmer simply on the basis of the number of lines of computer code they can write in a given period of time. Going a bit further, since some programmers are so accomplished that their programming feats are beyond the ability of their peers, we might say that they are infinitely more productive for really creative, leading edge projects.
Infinitely more productive. It’s hard to think of another field of human endeavour in which you can say that about the difference between the extraordinarily gifted, and the next level up.
And then, above the übergeeks, in the pure empyrean of ultimate geekdom, is the cynosure, the observed of all observers, owner of arguably the best-known entirely plain face in the world, the richest man on the planet: Bill Gates III. According to Cringely, top-level computer jocks are always, without exception, either hippies or nerds; the lank-haired, anal-retentive, oversized-glasses-wearing, non-fantastically-good-at-teeth-brushing Gates is the apotheosis of the nerd type. In the country which gave us the rejoinder ‘if you’re so smart, why ain’t you rich,’ he has revolutionised the status of the nerd. Estimates of his wealth vary. His share of the company he co-founded, Microsoft, was in the middle of this summer worth $72 billion; the Internet’s ‘Bill Gates Personal Wealth Clock’ puts his net worth at $108 billion. It’s worth looking at that written out: $108,000,000,000. Even taking the lower of those figures as the base, if Microsoft continues to grow at the rate it has hitherto, Gates will in 2004 become the world’s first trillionaire. That means he will be worth as much as a thousand billionaires combined. As John Allen Paulos demonstrated in his book Innumeracy, most of us have a poor grasp of what numbers on this scale mean; so take a second to guess, intuitively, what you think the difference in time is between a million seconds and a billion seconds. Ready? A million seconds is 11 days; a billion is 32 years. A trillion is 32,000 years.
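Paulos’s figures are easy to verify; a back-of-envelope sketch in Python (the constants are plain calendar arithmetic, not anything from the books under review):

```python
# Converting a million, a billion and a trillion seconds into
# human-scale units, as per Paulos's Innumeracy exercise.
SECONDS_PER_DAY = 60 * 60 * 24             # 86,400
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25

print(f"a million seconds:  {1e6 / SECONDS_PER_DAY:.1f} days")    # ~11.6 days
print(f"a billion seconds:  {1e9 / SECONDS_PER_YEAR:.1f} years")  # ~31.7 years
print(f"a trillion seconds: {1e12 / SECONDS_PER_YEAR:,.0f} years")
```

Three orders of magnitude apiece, yet the mind registers ‘days’, ‘a lifetime’ and ‘since the last Ice Age’ as if they were neighbours.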
Microsoft is so important to the American economy, and therefore to the whole world, that any news about it tends to appear on the front pages. In August we learned that Gates had just given away $15 billion in Microsoft stock to his charitable foundation, thus instantly making it the second biggest in the world. (The biggest is the Wellcome Trust, at $19 billion.) This is a tempting way into the Microsoft story, since it takes us to the subject of the anti-trust trial which has been launched against the company by the American Government. The Department of Justice is contending that Microsoft is a monopolist, using its domination of the operating systems market – that’s the ubiquitous Windows – to bully companies into using other of its products, at the expense of its competitors. The trial could in theory lead to the break-up of Microsoft, just as an anti-trust trial led to the break-up of Standard Oil, the progenitor of all the modern American oil firms. During that earlier trial the head honcho of Standard Oil, John D. Rockefeller, made a regular point of giving enormous amounts of money to his personal charitable foundation, by way of ameliorating his reputation as a robber baron. Gates is a keen student of history, especially American industrial history. Hmmm. Still, deeds should be separated from motives, in the field of philanthropy maybe more than anywhere else.
So perhaps the most promising front-page lead in the last few weeks has been that provided by the news of a hackers’ attack on the Microsoft e-mail service, Hotmail. The early story of this company is told by Po Bronson. Two young would-be entrepreneurs, Jack Smith and Sabeer Bhatia, ‘had been brainstorming possible business ideas for a few months’. One of their problems was not being able to exchange information over e-mail while at their respective workplaces, because they didn’t want their bosses to find out they had been moonlighting. Both had work-based e-mail, which as John Sutherland pointed out in the LRB recently, is notoriously prone to employer surveillance; both also had AOL (America Online) accounts, but these weren’t accessible via their respective office computers. Then one day, on the way home, Smith called Bhatia with an idea so compelling that Bhatia’s first words were: ‘Oh my! Hang up that cellular and call me back on a secure line when you get to your house!’
The idea was for a free anonymous web-based e-mail service. In other words, a service that would allow e-mails to be stored and sent over the World Wide Web rather than via a specific Internet access provider, so that customers would be able to send and receive e-mails in complete confidence from anywhere in the world they could find a computer terminal. It would be like having a PO box number that travelled everywhere you went, which could be accessed instantaneously. This service’s revenue would come from advertising, and it would be self-evidently useful to businessmen, travellers, adulterers, cheapskates – pretty much everybody, in short.
The first emblematic thing about this notion is that it was based on an idea pure and simple, a piece of intellectual property which popped into existence ex nihilo. As Bronson says, ‘any disgruntled employee worried about an employer reading his e-mail could have had the idea.’ The idea was so powerful that the new company grew its customer base faster than any media enterprise in history: within 30 months Hotmail – for this is they – had 25 million users, signing them up at a rate of 25,000 per day. Secondly, the idea was self-referential, even Post-Modern, in that it was based on the difficulty the two hotshots were having in developing an idea. Third, Hotmail invented the contemporary phenomenon of ‘viral marketing’, whereby the product is its own advertisement – so that an e-mail sent from a Hotmail account is in itself an ad for Hotmail. (The first few users found Hotmail by themselves on the day of its launch, 4 July 1996. By the end of one hour 100 were using it; 200 more joined in the next hour; 250 in the third. It was two years before Hotmail needed to spend money on marketing.) Fourth, the idea, like so many hot new notions in cyberspace, ended with the people who had it being bought out by Microsoft. The price was a wallet-thickening $400 million.
When Hotmail came under attack last month, with the news that a group of computer hackers had found a way to break into any and all of its customers’ accounts, most non-geeks – at least, non-geeks who don’t use Hotmail – would have seen the story as yet another niggling item about Internet security. (A non-issue, many geeks argue, since the Net is much more secure than the post or the telephone.) But the story has more sting to it than that, since Hotmail isn’t any old Internet service, but one whose existence is predicated on the need for absolute security. In other words, this was a damaging act with an ad hominem feel; and the people it was meant to damage were the owners of Hotmail, Microsoft. This is the last and main reason why the Hotmail story is emblematic: because it shows the burningly intense, personal hatred in which many people – especially geeks – hold the world’s most successful company. A tempting place to begin … But the best place to begin the story of Microsoft is probably at the beginning.
Most people, including most people who know a bit about computers – though not most geeks, the next level up – think of the personal computer as a big machine which gradually grew smaller and smaller until we could all fit one on our desks. Thus, thirty years ago, when men flew to the Moon, computers took up whole rooms, and had flashing lights and whirling tape thingies on the outside. Clever people then somehow shrank everything, and the PC industry was born. Not so. Big, ‘mainframe’ computing was and remains a different type of business. The PC came from the opposite direction, as a small thing that gradually got bigger and bigger, more and more powerful: it began life in the semiconductor industry, which in turn grew out of the transistor industry. In 1971, a scientist called Ted Hoff, working at a now world-famous company called Intel, invented the first microprocessor, a single silicon chip – a piece of silicon, on which fine lines of silicon oxide were printed in a photographic process. The microprocessor was effectively all the working bits you needed to make a computer on a single chip, and despite, or more accurately, because of that, the big computer companies didn’t want to know. They couldn’t or wouldn’t know what Gordon Moore, one of the co-founders of Intel, had already shown, which is that the density of circuits on the chips would double every 18 months. That axiom, enshrined as ‘Moore’s Law’, has held good throughout the three decades of the microprocessor’s existence. The resulting growth in computational power is hard to grasp. In the course of my writing this piece my computer has performed more calculations than have been done by hand in the whole of human history.
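The compounding behind Moore’s Law is worth pausing on: a doubling every 18 months sounds modest, but run it forward and the numbers turn absurd. A minimal sketch (the 18-month figure is the one cited above; Moore’s own formulations varied between one and two years):

```python
# Moore's Law as compound growth: circuit density doubling
# every 18 months is roughly a hundredfold gain per decade.
def moore_factor(years, doubling_months=18):
    """Growth multiple after `years` of steady doubling."""
    return 2 ** (years * 12 / doubling_months)

for decade in (10, 20, 30):
    print(f"after {decade} years: x{moore_factor(decade):,.0f}")
```

Thirty years at that rate is 2^20, about a millionfold: roughly the distance between Hoff’s 1971 chip and the processors now on every desk.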
It took nearly four years for someone to base a machine on the new invention. The January 1975 issue of the magazine Popular Electronics published a piece about a new microcomputer called the Altair 8800. Walking across Harvard Yard in December 1974, a young geek called Paul Allen – a quarter of a century before his newsmaking dates with the freshly de-Jaggered Jerry Hall – waved a copy of the magazine in the face of his good buddy, 19-year-old sophomore Bill Gates. Gates was the son of a big-shot Seattle media lawyer, Bill Gates II. (His father, who now runs Bill Gates III’s charitable foundation, has said that if he lived his life over again he would call his son something else. He has tired of telling people his name and being told ‘yeah, right.’ In the family, the Microsoft tycoon is known as ‘Trey’.) The young Gates wrote a scheduling program for his school at the age of 12, and a traffic-logging system for Bellevue, Washington at the age of 16. He was a 100 per cent, bottled-at-the-place-of-origin geek, and he saw the implications of the new technology so quickly that his first thought was he and Allen might be too late. ‘We realised that the revolution might happen without us,’ he has said. ‘After we saw that article, there was no question of where our life would focus.’ Gates and Allen soon started a company whose stated objective was to put ‘a computer on every desk and in every home, running Microsoft software’. At the time, that seemed like a joke.
The fox knows many things; the hedgehog knows one big thing; the 800-pound gorilla doesn’t give a shit what anybody knows. Gates’s great strength was in combining all these attributes. He was fox-like in his omniscience about the details of computing, and his ability to write code, and to supervise other people who wrote it. (Since the Altair 8800 didn’t exist when he and Allen saw the Popular Electronics piece, the first thing they did was write a program for a larger computer duplicating the workings of the non-existent machine, so they could then write software for it.) He was hedgehog-like in his grasp of the single biggest fact about the – to make the point again – as yet non-existent personal computer industry. This was, and is, that it is dominated not by hardware but by software: it is the stuff you put into your computer to make it do things which matters, not the computer itself. This, in hindsight, is such a glaringly obvious no-brainer of a self-evident truth that we need to remind ourselves about the fact that no one saw it that way. At the time, it was perfectly clear to everybody that the real money was in the machines. Software was just stuff run up by smelly hobbyists. Finally, Gates was the 800-pound gorilla in his determination to dominate the industry, irrespective of any opposition. He saw all interactions in terms of winning and losing, and he was determined always to win. The demonic competitiveness was a symptom of a disconcerting form of megalomania: Gates didn’t want to own the world, he assumed that he already did.
The central principle in Gates’s business career has been the insight that the real money in computing comes by owning de facto standards. That’s how you get to be the gorilla – by owning the software which everybody uses, or at the very least refers to as a benchmark, whether they want to or not. His big early break was in doing a deal with IBM, the giant company which dominated the mainframe and business computer market and which finally, against many of its own instincts and institutional pressures, went into the personal computer market in late 1981. The IBM personal computer came to set the standard for the PC industry; hereafter, PC meant IBM-compatible PC. Up until this point, Microsoft wasn’t in operating systems. Its business had been in writing programming languages, such as Basic, Cobol, Fortran and C. Nonetheless, Gates saw the importance of what the IBM PC could become, and did a deal to provide the operating system for the machine. As for the operating system itself, that is, er, a hotly contested point. Basically, Gates bought it from someone else, a local firm called Seattle Computer Products who had already rustled up a program called QDOS, short for Quick and Dirty Operating System. A quick wash and brush-up later, and QDOS became MS-DOS, the operating system which is still running on every single computer on the planet which uses Microsoft software – that’s to say, over 90 per cent of them. It is DOS, lying underneath Windows, which helps to make the ubiquitous Windows so unloved. (One geek name for the stuff Microsoft makes is ‘crapware’. The next release of Windows, Windows 2000, will finally break with the underlying DOS architecture. It should be a much less bad operating system, but because it will not be ‘backwards compatible’ with the old software, it will also royally piss off a great number of reluctant Windows loyalists.)
The IBM PC did well, but not freakishly so, at first. More or less overnight, it set a standard in the fledgling industry – and therefore so did Microsoft – but the machines did not fly out of shops on their own. All of the things you could do on it were versions of things which could be done on a bigger computer – complex maths, word-processing, organising your files. For the PC to take off, it took the invention of an irresistibly compelling application, a ‘killer app’, which would make everyone who saw the program foam at the mouth with envy until they had it set up on their own machines. That killer app arrived in January 1983, and it was called Lotus 1-2-3.
The killer app was a spreadsheet. This is a kind of software which allows the user to enter a range of numbers, connect them into a series of calculations, and then fiddle about with them. In other words, it lets you crunch data in new ways; calculations which would have taken days can now be done in seconds – in particular, calculations of a type which involve finely tweaking figures. ‘What if we put in 6 per cent for inflation, 20 per cent for annual growth and 10 per cent for growth in costs; how does that look?’ Those sorts of fiddle-and-jiggle calculation were suddenly made easy by the spreadsheet, which only existed on the PC. The combination of a new type of software running on a new type of hardware made for explosive growth: PC sales began to rocket, and haven’t yet started to slow down.
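The fiddle-and-jiggle itself is trivial to reproduce in a few lines today; what was revolutionary in 1983 was doing it interactively, on your own desk. Here is a sketch of the hypothetical above (all the figures – the starting revenue and costs, the 6/20/10 per cent rates – are illustrative, not drawn from any real company):

```python
# The what-if above: 6% inflation, 20% annual revenue growth,
# 10% growth in costs. How does that look over five years?
def project(revenue, costs, growth, cost_growth, inflation, years=5):
    """Inflation-adjusted profit for each of the next `years` years."""
    profits = []
    for year in range(1, years + 1):
        revenue *= 1 + growth
        costs *= 1 + cost_growth
        profits.append((revenue - costs) / (1 + inflation) ** year)
    return profits

for year, p in enumerate(project(1_000_000, 800_000, 0.20, 0.10, 0.06), 1):
    print(f"year {year}: real profit ${p:,.0f}")
```

Change any one argument and run it again: that single gesture, repeated instantly and endlessly, was the whole of the killer app.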
So what did Gates do? Having gone into the applications language business to create software for the Altair 8800, and the operating systems business to create software for the IBM PC, he now went into the applications business to create software for, well, everyone. Microsoft followed what has become its standard procedure: lag behind as a new market comes into existence; throw all its weight behind catching up and duplicating the first-comer; use the enormous leverage of the operating system monopoly to have its new software adopted by everyone; and bingo, celebrate the creation of a new de facto standard. Microsoft did it with its programs Excel (the spreadsheet), Word, File, and then with Office, the suite of software, including all the above, which now utterly dominates the PC industry. Above all, it did it with Windows, which copied the Apple Macintosh’s desktop-and-files metaphor for the working environment of the PC. Then it won the inevitable huge lawsuit with Apple. When Steve Jobs, the semi-sociopathic visionary who drove Apple, accused Gates of stealing the idea of a GUI (Graphical User Interface, pronounced ‘gooey’), Gates replied that Apple had in turn stolen the idea from Xerox’s research institute, PARC. There was no little truth to this, but Gates’s exact words are still quite something. ‘No, Steve,’ he told Jobs, ‘I think it’s more like we both have this rich neighbour called Xerox, and you broke in to steal the TV set, found I’d been there first, and said: “No fair, I wanted to steal the TV set.”’ The striking thing about this – as Michael Malone points out in his ultra-detailed history of Apple, Infinite Loop – is that there isn’t a shred of truth in the idea that Gates had the GUI first. It was pure gorilla-think. 
‘In his peculiar and dangerous manner, Gates didn’t look upon the Mac OS as competition, but as an intruder into a world that was rightfully his.’ Clunky, buggy, crash-prone, counter-intuitive to use, creakily resting on top of its antiquated DOS shell, Windows became the most successfully revenue-generating piece of software in the history of the world.
By the mid-Nineties, with his vision of a Microsoft-using computer on every desk almost a reality, Gates was in a position of near-total domination of his industry. From this perch he made his first big mistake, one which may yet prove, in business terms, fatal. The mistake was one which almost everybody in the business made: they missed the significance of the Internet.
The story is told in Michael Wolff’s entertaining exposé of ‘the Gold Rush years on the Internet’, Burn Rate. (Wolff is an entrepreneur who had a set of hairy experiences trying to go into business as a content provider on the Net.) Nowadays, when everybody and his mum knows that the Internet is the Next Big Thing, when Internet stock flotations routinely make millionaires of all concerned – nowadays, hindsight makes it perfectly clear what was going to happen. At the time, it was a great shock even to the most au courant geek.
Travelling in our time-machines back to the almost inconceivable distance of 1990, we arrive at the crucial moment in the birth of the Net. The National Science Foundation wanted to get out of the expensive business of subsidising the increasingly complex and rapidly growing infrastructure of the academic computing network. The network had grown out of Arpanet, a matrix of computers designed to link university researchers, government defence labs and the military. (The growth of the Net is one of those phenomena equally describable in opposite terms, as a triumph of lavish government subsidy or a great victory for the free market.) The NSF did a deal to wind down its subsidy over two years in return for allowing commercial use of the infrastructure. The domain names .org, .gov and .edu already carried most of the traffic; the deal marked the moment .com came into its own. Nobody, but nobody, foresaw the explosion of network growth that this minor piece of ‘policymaker’s administrative tinkering’ would cause.
‘How many people knew about the Internet in late 1991?’ Wolff wonders. ‘More than five thousand but possibly less than 25,000.’ Through 1992 and 1993, a few online services grew up, companies such as GEnie and CompuServe. Some of these companies offered access to the Internet, but more of them did not; they were there to allow customers to talk to each other over their services, not to throw themselves over the Niagara of unregulated sleaze which to their minds was the Internet. At this point, wags were describing the whole phenomenon as ‘CB radio for the Nineties’. As late as 1994 and 1995, most people in the business thought that the Internet was a sleazy distraction from the main business of online services, which was allowing customers of the service to look up information on the proprietary network and to chat to each other (mainly about sex). This is the business that Microsoft was trying to get into, via its MSN network.
One übergeek in particular saw things differently, however. Marc Andreessen was a freakishly gifted programmer – a ‘code god’ – whose interest was in laying out visually the data on Internet pages. He wrote Mosaic, the first program which enabled users to see graphics on a web page, as opposed to just reading text, and then set up a company called Netscape with the intention of bringing such a product – now dubbed a ‘web browser’ – to market. In October 1994, Netscape Navigator was launched. Everyone in the computer industry studies Bill Gates closely, and by now his dictum about setting de facto standards was not a well-kept secret. In startling accordance with this axiom, Netscape didn’t try to sell their browser: they gave it away. Anyone could download it for free. At a stroke Netscape achieved instant domination of the Net; quite simply, every single person on it was using Netscape Navigator. Internet use took off exponentially, and even the online services which had defined themselves precisely by not being the Internet started to repackage themselves as Internet access outfits – the most spectacular success in this area being that of AOL, who in the process gave away literally a billion free software CDs.
Netscape’s idea was to focus on what Wolff calls ‘razor blade marketing’ – give away the razor, sell the blades. The blades would be the inevitable software patches and upgrades to come; the all-important thing was market share. But Netscape, having made their own luck, also got lucky. When you turned their program on you found yourself taken to their home web page, which immediately became the most visited page on the Net. This was serious advertising power; before long, the going rate was $0.02 per click (i.e. per pair of eyes to see the page).
In 1995, Gates, hitherto preoccupied by the MSN network and the launch of the insanely overhyped Windows 95, finally got it. He wrote a now-famous companywide memo called ‘The Internet Tidal Wave’. ‘I have gone through several stages of increasing my views of its importance. Now, I assign the Internet the highest level.’ (Notice how grudgingly he defers to the existence of an external reality.) Chairman Bill threw the entire weight of Microsoft at the Internet; from now on, Microsoft was to be an Internet company, indeed the Internet company. ‘I just want them to get that we’re hard-core about the Internet,’ Gates was saying, with a note of desperation, by the end of 1995. Gates does such a good job of regarding himself as a sage and genius that one is always reluctant to join in, but it must be said that there is something heroic about the way in which he decided he had been wrong and forced a giant multi-multi-billion dollar company into an overnight reversal of direction. If Microsoft had been a normal company, rather than an absolute monarchy, they could never have done it.
Gates hurled Microsoft at the problem, and Microsoft came up with its own web browser, Internet Explorer. This was the birth of the ‘browser wars’, in which Netscape and Microsoft slugged it out for market share. At the time of writing, a scant four years after starting from nowhere with ‘The Internet Tidal Wave’, Microsoft has just over 50 per cent of the web-browser market. This is no small feat, but they had to give away many millions of copies of software in the process (‘We don’t need to make any revenue from Internet software,’ decreed Chairman Bill), and step very hard on very many toes. In practice, that meant issuing threats – always part of the Microsoft way of doing business.
This is what has landed Microsoft in court, as the subject of an anti-trust suit from the Department of Justice. The company is now in the greatest peril it has ever been in. The gist of the DoJ case is that Microsoft told people in the business that if they didn’t make Internet Explorer their web browser, at the expense of Netscape Navigator, they couldn’t have Windows. Furthermore, by making Explorer the core of the forthcoming launch of Windows 98, they were further extending and exploiting their monopoly power. What 800-pound gorilla would even contemplate doing anything differently?
Federal Judge Thomas Penfield Jackson is showing clear signs of believing the case against Microsoft, helped by their courtroom combination of ultra-aggression and shiftiness, and also by their mountains of self-incriminating e-mails. (The overfrank internal e-mail being one corporate artefact which this trial will make extinct.) Members of the public, according to a Gallup poll, take the other view, only one in four supporting the Government’s case. I have never met a geek who does not believe the charges against Microsoft, but there is a range of opinion as to whether or not they matter, and what should be done about them.
For one thing, monopolies in the software industry are not necessarily such a terrible idea. Look at it another way: how would you create a monopoly if you wanted to? (All cant aside, any rational businessman wants to own a monopoly. What’s the alternative – selling something that somebody else sells too? Which of those seems to you a better idea?) In the software business, you would do it by writing a program which is so useful that everybody who tried it would want other people to use it too; and then the more people who used it the more useful it would be, since files and information would be more easily exchangeable, and the larger the installed base of users the more upgrades, support systems, add-ons and other goodies; until, finally, everybody would be using your product, which would be all the more useful because everybody was using it. Whoops, you’ve become a monopolist – see you in court, pal. The paradox is that software is an industry where a monopoly can, to some extent, be beneficial. Or, as a Microsoft spokesman puts it, gorillaishly but with some truth: ‘The laws exist to protect consumers, not competitors.’
There is also some truth to the central plank of Microsoft’s defence, which is that the computer business is so competitive that the company is at full stretch trying to keep ahead of its rivals – hardly the position that John D. Rockefeller, say, was in. There certainly are a lot of very, very intelligent people out there trying as hard as they can to destroy Microsoft; a state of affairs which is of the company’s own making. One of the most potent threats – in the opinion of many geeks, the single biggest danger in the longish run – is that offered by the very unlikely story of a Finnish übergeek called Linus Torvalds.
In 1991, the 21-year-old Torvalds (whose mother is the six-times winner of the Finnish national karate championships) used his free time to write an operating system, which he dubbed Linux. The system was based on an OS called Unix, a geek favourite since Arpanet days. So far so fairly remarkable; but it’s what Torvalds did next that was really new. He published all the code for his operating system, and invited people to amend and extend it as they wished. He made it into a collaborative project in which everyone could take part, and which would be free to all, thus instantly creating a worldwide community of Linux geeks.
At the moment, about seven million people use Linux. This might sound a smallish number, but the philosophy of ‘open source’ computing, as it’s called, is reaching critical mass in the computer world. An essay on the subject by Eric Raymond, ‘The Cathedral and the Bazaar’, has had enormous influence in disseminating the ideas and values of open source; it persuaded Netscape to release the source code of the latest version of its browser. Torvalds and others argue that free software is almost always better than the stuff you pay for: for one thing, it’s never released before it’s ready, since geek pride is at stake rather than any commercial considerations. For another, it’s constantly being improved.
The threat to people like Microsoft is that Linux will destroy the cash value of the operating system; why pay several hundred dollars for something when you can get an equivalent product for free? This idea doesn’t have to catch on with all that many people before it will do horrible damage to Microsoft’s revenue stream. And because Microsoft has an extraordinarily high price/earnings ratio, any marked downturn in its income will have a catastrophic effect on its stock. As a result, Microsoft’s attitude to Linux is hilariously two-faced. Most of the time they act as if it’s strictly chickenfeed: who wants to use some cobbled-together piece of software bodged into existence by bug-eyed know-nothing hobbyists? For the purpose of the trial, however, they cite Linux as an example of the scary ultra-competitiveness of the computer business: the kind of industry where a Finnish geek barely out of his rompers can in a few spare hours write a program which threatens the very existence of a trillion-dollar corporation. The joke or irony is that it’s probably this second view which is closer to the truth.
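The stock-market mechanics can be made concrete. A share price is earnings times the multiple the market will pay, so a high price/earnings ratio amplifies any disappointment – doubly so if investors also cut the multiple itself. A sketch with invented numbers (the $1.50 of earnings per share and the multiples are hypothetical, not Microsoft’s actual accounts):

```python
# Why a high-P/E stock is fragile: price = earnings x multiple,
# and bad news tends to hit both terms at once.
def price(earnings_per_share, pe_ratio):
    return earnings_per_share * pe_ratio

before = price(1.50, 60)   # a richly rated stock: $90 a share
# Earnings slip 10%, and the market de-rates the stock to a P/E of 30.
after = price(1.50 * 0.9, 30)
print(f"price falls {(before - after) / before:.0%}")  # price falls 55%
```

A 10 per cent dent in income, in other words, can take half the market value with it once the growth story is in doubt.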
As for the trial, the geek consensus is that nothing will change. This judge will rule against Microsoft, who will appeal, and do much better in the more right-wing appellate court. The Department of Justice will appeal again, and so on. Fantasy scenarios, such as a Federally mandated break-up of the company (to rival those of the oil and phone monopolies, Standard Oil and AT&T), are just that, fantasies. The real deals will happen behind the scenes. Microsoft offered an out-of-court settlement before the trial began, but the DoJ rejected it. At some point they will offer another; then – perhaps when Wall Street has hit its long-overdue speed-bump, and the idea of the Federal Government going after the country’s most successful company is even less popular – they will offer a deal which Justice, as opposed to justice, accepts. The deal will allow the Government to claim victory while leaving all essential points about Microsoft’s business unchanged. The trial(s) will make it harder for Microsoft to threaten people, and they’ll start encrypting their internal e-mails, but that’s about it. The real, long-term threat to Gates and his company comes not from the Department of Justice but from that funny shape on the horizon, a cloud no larger than a Finnish übergeek’s head.