With Silicon Valley going into supernova, eclipsing even Hollywood, New York and Washington, D.C., the media--always acutely aware of where its next ad revenue is coming from--has embraced high tech and Silicon Valley with all the calculating passion of Bill Clinton sizing up a new intern. Fortune did it first and best, when it reportedly informed its advertisers that it intended to have either Bill Gates or Andy Grove in every damn one of its issues in 1997 (God bless editorial integrity!) and then did exactly that, adding legions of tech readers and advertisers.
Everybody else got the message. Forbes devoted several hundred pages of its 80th-anniversary issue to the technology revolution--then blew its credibility by misidentifying David Packard as Bill Hewlett on the cover. Not to be outdone, Business Week not only dedicated a special issue to Silicon Valley, it also gathered every Valley leader it could think of (well, no Jerry Sanders, Wilf Corrigan or T.J. Rodgers--but, hey, ASICs are complicated) and turned them into BW Playmates of the Month with their own foldout cover.
Everywhere you look, tech coverage is ascendant. And it's not just business magazines, either. Every local newspaper seems to have a special computing and technology section, and tech coverage is taking over whole sections of Newsweek and Time--the latter even naming Grove Man of the Year about 10 years too late and for the wrong reason ("Intel is the world's largest producer of computer chips," its editor knowingly intoned).
And this is just the beginning. Entire networks, such as Ziff-Davis' ZDTV, are springing up to give us that dawn-to-dusk technology coverage we all crave. As if that weren't enough, there's always the Web, with its infinite supply of press releases, analysis, gossip masquerading as fact and the endless ravings of the increasingly paranoid vox populi.
And now, in the final accreditation, literary carpetbaggers have descended upon the Valley. You know you own the zeitgeist when New York publishers peer across the Hudson, decide there's something going on out here among us Indians, then send Manhattan writers out to do field studies. Thus, Silicon Valley is overrun with the likes of novelist Po Bronson, Liar's Poker (W.W. Norton & Co., 1989) author Michael Lewis and John Heilemann (he of the dreary New Yorker profiles). These days, a daily part of any Valley CEO's job is to be interviewed by yet another book author. ("So I keep hearing about this company called Fairchild, but it's not in my American Electronics Association directory. Do you know anything about it?")
In the works now are a feature-length documentary, at least one coffee table book, a couple of multipart TV documentaries and, for all we know, a sitcom. The digital revolution is the story of the millennium (or at least millennium's turn). And the revolution's long-term impact will likely be as great as, if not greater than, the most wild-eyed claims being made for it. It will undoubtedly change every human institution, tear down almost all our political models, annihilate the traditional nation-state and transform money, commerce, art, literature, entertainment, sports and even organized religion. The digital revolution will have an irrevocable impact on the neighborhood, the family, the self.
Nothing you haven't heard before, right? And somewhere on cable, someone is saying it again right now. This message has become our fin de siècle mantra, a phrase repeated so often we can mutter it without hearing ourselves speak or without pausing for even a millisecond to ponder its monstrous implications.
But wait a moment. If the electronics revolution is, and will be, as sweeping as we're all predicting, shouldn't we be scared out of our minds? And, just as important, shouldn't we be terrified by anyone who isn't frightened?
Why worry? Look at history. We pundits, myself included, like to blather about the many parallels between our current society and the first industrial revolution. We point to the rise of cities, modern science and medicine, mass production, high-speed transportation and communications, longer life expectancies and high literacy rates. All true. There's no question that the industrial revolution, in its first wave from 1795 to 1830 as well as in its second from 1876 to 1900, was one of the most beneficial events in human history.
But what human revolutions give, they also take away. It's polite to ignore the fact that the industrial revolution also killed the Enlightenment, set off the destructive and narcissistic counterforce called romanticism, buried us in soul-killing bureaucracies and, worst of all, gave us machine-age "total war." It's a single developmental thread that runs from Cold Harbor to Verdun to Stalingrad to Hue.
Here at the end of the most homicidal century in human history, the memory of millions of murdered innocents ought to be more than enough to make us wary of all the talk about the New Digital Man, Homo computatis.
People had their own absolutist dreams about the new and perfect human during the industrial revolution, too: Jean-Jacques Rousseau with his divine primitive, Karl Marx with economic man and Friedrich Nietzsche with his Superman. The model was perfected most horribly in this century with the Aryan warrior and the New Soviet Man. Great revolutions seem to provoke absolutist fantasies and delusions of human perfectibility. And hidden behind all the talk of perfecting mankind are the screams of those deemed unworthy.
Here at the end of the millennium we're all so proud of our enlightenment. We walk out of "Amistad" and "Titanic" comforted in the knowledge that, unlike our lesser mortal predecessors--say, George Washington, Thomas Jefferson and Mrs. John Jacob Astor--we would never consider owning slaves or treating the lower classes like subhumans.
But we have our own dirty little prejudice that, like slavery to a South Carolina tobacco grower in 1830, seems so much a part of the natural order that we scarcely notice it, much less feel the need to defend it. You hear it when Bill Gates, to general approbation, says Microsoft Corp. only hires the brightest people. You hear it in the words of the WestTech job fair recruiters, read it between the lines in every book about the new "learning" organization, watch it in the personnel policies of every hot tech company. It's a message that can be distilled into a single warning: Don't be stupid.
That message is lost in all the brouhaha over Richard Herrnstein and Charles Murray's book The Bell Curve: Intelligence and Class Structure in American Life (Free Press, 1994). Everyone started shouting about the brief section on racial intelligence and somehow overlooked the book's larger theme: that our society is dividing along IQ lines. A hundred years ago when cognitive skills weren't as important in everyday small-town life, the local doctor might indeed have married the coffee shop waitress, and the schoolmarm might have wed the honest and hard-working--though slightly dim--automobile mechanic. Not anymore.
Today, electrical engineers marry electrical engineers, stock analysts shack up with venture capitalists and the demarcation between the worlds of the bright and the merely average is as rigid and impenetrable as the "white" and "colored" sections of a Memphis, Tenn., movie theater were in 1935. The only difference is that in 1935, blacks at least had the hope of one day achieving equality. In an age when every organization's catchphrase is "smarter, faster, better," who's going to stand up for the millions--indeed, by definition, the majority--with average intelligence? Wouldn't any such leader automatically be outside the category?
And our IQ bigotry has yet to reach its most virulent form. Today we can only exclude the modestly intelligent from our companies, our neighborhoods and our private schools. But a few years down the road when we have the right diagnostic tools--thank you, Human Genome Project--we'll be able to eliminate this burden altogether by liquidating the sub-brilliant before they're born. And the sooner we get started, the better. After all, the demands of technology move the intelligence bar higher every year.
Heard this kind of absolutist talk before? Of course you have. It's the nightmare obsession of modern life. It pops up, captures our imaginations and our souls, then produces unimaginable horrors. Two million Kulaks, 6 million Jews, 30 million Chinese, half the population of Cambodia. Social Darwinism, Leninism, fascism, Stalinism, Maoism, the Khmer Rouge. Once you establish the perfect New Man, you can't help but stuff imperfect Real Man into him--even if you have to kill him in the process--or, in our more enlightened view at the end of the century, merely redesign him. Restructure his DNA, pop a few slivers of silicon into his cerebral cortex or just mainline his central nervous system right into the worldwide grid. Who needs neuromancy when you can hook right into the Net, become a human browser and act as your own software agent?
If this sounds outrageous, let me remind you of the kind of high-stakes game we're playing. You can't announce a complete, technology-driven revolution in human culture--as we regularly do in this magazine--and then duck the implications of such a profound historical discontinuity. If the industrial revolution gave us longer life expectancies and unprecedented material wealth while at the same time creating a global graveyard, are we so naïve as to believe that the digital revolution won't deliver a similar yin and yang?
Within a generation, there will likely be 5 billion people on the World Wide Web. There will also be perhaps 100 billion embedded controllers tucked away in every corner of the planet--all talking to one another. We'll be using 10-gigahertz PCs with a terabit of memory, 3D displays and 10Mbps modems. We'll consult many times each day with our personal software avatars, which will then race around the Net doing our bidding. We'll witness the arrival of the first biological interfaces to solid-state electronics. And we'll hurtle toward the first microprocessors that include as many transistors as there are neurons in the human brain. Does anybody still believe we will confine these awesome inventions to the office or the den? That we will merely add them to our lives, like cell phones and Walkmans, without profoundly changing everything about who we are and how we live? Most of all, does anyone believe that all these commercial, cultural and personal changes will be strictly salutary?
In fact, a number of people (many of them my friends and colleagues in Silicon Valley) believe precisely that. To them, the electronics revolution is not only inevitable, it is the destiny of the race. Moore's Law is our new Invisible Hand--a market-driven theory of history leading us toward the Valhalla of cultural equilibrium, perpetual innovation, and general enlightenment and prosperity.
For this crowd, the great visionary is George Gilder and his defining work--his Wealth of Nations, Road to Serfdom and Das Kapital all rolled into one (no small irony for a legendary conservative)--the book Microcosm (Simon & Schuster, 1989). Gilder is brilliant and passionate, and Microcosm is no different. Most of it is devoted to a superb history of the integrated circuit and the microprocessor, and how these devices changed institutions and the economy. But the last chapter is different. There, Gilder drops all pretense of narrative balance or subtlety and goes for it with everything he's got: Now the chip is not just a landmark invention but a transcendent vehicle for reordering human nature. This is no longer admiration but worship. And coming from a devout Christian, it approaches heresy.
At the time the book was first published, Valley leaders jokingly said, "Poor George stared so long at an IC that he saw the face of God." They don't joke about it anymore. In the intervening years, they, too, have been on the road to Damascus and been blinded by the light reflecting off a 12-inch wafer. Like George, they have found redemption in Moore's Law--and they aren't alone. Nobody is immune. Consider the following:
Know who penned that passage? Me. Like I said: No one escapes. I wrote the preceding for the new photography book One Digital Day (Times Books/Random House, 1998), which Fortune (in its zeal to cover more high tech than anybody else) recently made into its cover story--as if it were a 31-page piece of independent reporting rather than a project underwritten by Intel Corp. One Digital Day celebrates a day in the life of the microprocessor the same way its predecessors celebrated the United States, Japan and China. The book's theme is appropriate because the microprocessor is a different country--and only a foolish tourist believes it will be anything like home. Every era has its Big Idea--and no idea has been bigger than that of the Digital World. If you get too close (and who can resist?), you will inevitably be drawn into its vortex. Like Gilder, the longer you look at the integrated circuit or the Net or the PC, the more transcendental you become, the more hyperbolic your musings. And these days, we're all looking closely. Technology is the siren's call that just may dash us all on the rocks.
Gilder's Microcosm gave the first public voice to the absolutism that has always been the dark shadow of high tech. But the idea of perfectibility through high tech is as old as the vacuum tube. Seventy-five years ago, Lee De Forest composed goofy manifestos claiming that messy mankind had sullied his invention by using it to broadcast baseball games and "Fibber McGee and Molly," when it was supposed to spread enlightenment and usher in a golden age.
More sinister was William Shockley's involvement in racial politics. Shockley, co-inventor of the transistor, was one of the smartest men who ever lived, but his brilliance only drove him deeper into his obsession with eugenics, most famously with the genius sperm bank. If only, Shockley believed, man could be made as pure and perfect as his technology.
But it was not from the top right but rather the bottom left that the vision of technological absolutism reached full flower. What's rarely mentioned about the Homebrew Computer Club--that mid-1970s phenomenon that gave birth to the PC and the personal computer nerd--is its messianic streak. Steve Wozniak may only have been trying to build a cheap minicomputer, but almost everybody else at the meetings was trying to change the world, not the least of them Steve Jobs. The University of California, Berkeley, contingent, in particular, was forever looking at ways to deliver free hardware and software to the masses, to tear down the old order and bring about the New Age. And when Woz and Jobs weren't Homebrewing, they were hanging out with Captain Crunch, the phone hacker who believed the first step to utopia lay in undermining Ma Bell.
The whole history of Apple Computer Inc., in fact, is one of undying belief--in the face of all kinds of evidence to the contrary (including Apple itself)--in the perfectibility of man through computers. Hence Macolytes' hatred for Gates for cynically destroying that dream. But in his own cold-blooded way, Gates is an absolutist, too. After all, what is his book The Road Ahead (Viking, 1995) other than a paean to the edifying promise of technology? The only difference is that Gates believes paradise will have a Microsoft logo on the door.
But Gates scares us in ways that more frightening personalities like Jobs and Larry Ellison do not. Gates offers us a glimpse of something we all secretly know but are afraid to admit: If a giant global commune of digital men and women is what the absolutists want, Microsoft is an early warning of what they will likely get--technototalitarianism. Not the Eloi but the Morlocks, not the Federation but the Borg. When the Big Brother of the famous 1984 Macintosh ad morphed into Gates on the big screen at Macworld in 1997, a cold wind blew through the computer industry. It was an early warning of the storm to come.
At the 1996 Progress & Freedom Foundation summit in Aspen, Colo., technology pundits--from wild-eyed radicals to sci-fi dreamers, self-proclaimed futurists and cool-eyed capitalists--gathered to discuss the Digital World. In addition to the obligatory preening (and partly because of it), a number of debates ricocheted around the room regarding government's role in the new digital world, personal freedom vs. community needs, profits vs. freeware, etc.--the usual debates among left, right and libertarian that have gone on for generations.
But astute observers would have noticed something more, something amazing. Beneath the sectarian differences, everybody fundamentally agreed. From conservative free marketers to liberal social activists, everyone in that Aspen hall agreed that the technology revolution was inevitable, irresistible and--once we got past our pesky sectarian differences--promised to be the greatest transformation mankind had ever witnessed. Having accepted that position, the group found it easy to go one step further. And although it was Esther Dyson who made the actual proclamation, nearly everyone in attendance shared her attitude. When asked what should be done about the millions of people who refused to join this Brave New Digital World--those silly souls who refuse to buy PCs or surf the Net--Dyson simply replied that they must be made to join us, the enlightened.
Although Dyson may have been half-joking (with Esther, it can be hard to tell), her remark was ghastly nevertheless. Among that crowd, however, the enormity of her utterance went largely unremarked. After all, why would anyone object? If tech is indeed the greatest thing ever, won't it then carry us across the river to the Promised Land? Surely anyone who refuses such a trip would have to be considered confused or delusional--and not to be left to his or her own devices. For their own good, the unbelievers must be forced onto the boat; resistance must be made futile. That, at least, was the message Gates delivered to the federal judge and the Senate committee and, more recently, directly to the Department of Justice: How dare you challenge me?! I'm on the side of the angels, on the train of history. You in government are merely an impediment, an anachronism that doesn't know enough to go off and die.
Dyson's comment has given us a preview of what the future may hold. In recent years, there has been much talk about the fact that traditional political alignments of Republican and Democratic, left and right, are no longer tenable--that some new bipolar alignment will emerge that will more accurately reflect the fears and desires of people living in the new Digital World. You can see the disintegration of traditional boundaries as common cause transforms old enemies into allies.
Thus, under the sheet of technological absolutism we are seeing some strange bedfellows. On the right, among what might be called the technoreactionaries, are Gilder and Forbes publisher Rich Karlgaard, whose Wall Street Journal editorials epitomize the belief that technology will set us free. On the left, among the technoutopians, are Vice President Al Gore, with his obsession to drive every school kid down the information superhighway; Fast Company, the fantasy magazine for middle managers waiting to man the ramparts of the tech revolution and overrun executive row; and Slate, which brings the moral arrogance (and good writing) of the old left to the new media.
To leftists, the tech revolution is the great equalizer, tearing down institutions and giving voice to the dispossessed. Among libertarians there is Virginia Postrel, the estimable editor of Reason magazine, who is working on a book with the emblematic absolutist title The Future and Its Enemies. The book, to be published by Free Press this year, divides the world into the technologically allegiant and everybody else. Libertarians are perfect candidates for technoabsolutism because mass customization and PC proliferation play to their love of anarchy, and the billions of hiding places on the Net fulfill their dream of a playground without grown-up authority.
These different camps may squabble amongst themselves, like Mensheviks and Bolsheviks, but in the end they always find common cause against anyone--politicians, thinkers, religious leaders, publishers--who doesn't share their digital dream. They are more alike than different, and their dislike of one another pales next to their contempt for anyone who would suggest that the coming paradise may instead turn out to be perdition.com.
Of all the participants in the technoabsolutist movement, none is more emblematic than Wired magazine. Although detractors dismiss it, no publication can be this successful and influential without having tapped into something essential in our culture.
In fact, the closer you study Wired the more you realize it espouses a new synthesis. On its pages you can find a buttoned-down blue-blood conservative like Gilder bumping up against a hippie rancher agrarian-anarchist like John Perry Barlow. There's also the mix: radical posturing about personal freedom crossed with an old-left celebration of communalism, distilled through hypercapitalistic entrepreneurialism and then poured into the flask of Generation X alienation and anomie. It's a heady--perhaps lethal--mix that at any other time would simply explode. However, these aren't other times.
And on top of it all is attitude: an insider's argot of code words, a sense of not just moral but also genetic superiority. You're supposed to feel as if you've entered a conversation midway, one in which the participants are speaking with arched eyebrows and ellipses, code words and euphemisms--all the while secretly laughing at you. Only the true believers--the illuminati--can really understand.
A defining moment for Wired, though perversely not in the way the magazine had hoped, came in its fifth-anniversary issue. On the cover was the manifesto of the movement in its purest form: "Change Is Good." Inside, the usual players--Barlow, Bronson, Gilder, Postrel and Nicholas Negroponte--all weighed in. Julian Simon of the libertarian think tank the Cato Institute modestly declared that the past five years had been the greatest humanity has known; London-based technology writer John Browning once more declared the death of government; Wired contributing editor Oliver Morton extolled the advantages of genetic engineering; and Barlow happily reported from Africa that people in underdeveloped countries may offer less resistance to "becoming digital" (Negroponte's alluring and terrifying phrase) because they don't have to forget all that worthless stuff about the industrial revolution. Apparently, unlike us rich Westerners who will likely thrash about under the process, the world's poor will merely lie back and submit to their digital transformation--and no doubt their leaders will be happy to help.
Although Wired wraps these predictions in bright ribbons of optimistic layout graphics and talk of the "long boom," you can't help but get a cold feeling in the pit of your stomach when reading this stuff. There's something monstrous about such happy certainty over that which is unknowable--namely, the future. Perhaps Moore's Law is the path to paradise, but what historical precedent do we have for believing so? Does any reader of this article intuitively sense that everything is getting better thanks to the technology revolution? More likely the most you can claim is ambivalence: Thanks for the fetal monitors and anti-lock brakes, but can you please take back the alienation and the porn spam mail? And thanks for letting me work at home and adding a couple of years to my life expectancy, but could you also let me keep a few scraps of privacy? Impenetrable optimism in the face of something as profound as social revolution is its own type of immorality.
Yet it may not be enough. Mainstream publications such as Wired carry with them the seeds of their own inconsequence. To reach out to the hundreds of thousands of readers you need to land those big IBM Corp. ads (not to mention to snag a big-media patron such as S.I. Newhouse's Condé Nast Publications Ltd.), you must compromise. And absolutism, by its very nature, abhors compromise. You can't lead the revolution when you're busy handing out ad-rate cards. And thus having drawn the masses together for revolution, Wired is becoming one of its first victims.
For true believers, the only path is deeper--out onto the Internet, into obscure alternative sites that bear an uncanny resemblance to the old Algerian revolutionary cells. There, under cover of anonymity, you can utter your dark thoughts to the like-minded, achieve intellectual climax in the affirmation of your wildest conspiracies, circle-jerk your dreams of infinite bandwidth and infinitesimal human interference, and imagine yourself in intimate congress with the Web itself. In cyberspace, you can be immortal, infinite and infinitesimal, and move at the speed of light--you can be perfect.
One can laugh at the so obviously imperfect--the perverse, the genderless, the nose-ring brigades, the feral and the agoraphobic--trying to lead us to perfection. But such dismissals have been aimed at closet totalitarians for the past 200 years, and they have consistently proved wrong. People dismissed the bloodless Robespierre, too, as they did a strange Austrian corporal and a wild-haired guy in the British Museum Reading Room. Yet it is precisely these people who lead totalitarian revolutions: Only fringe folks have the time, the intensity and so little to lose.
Somewhere out there, right now, sitting at the next table at Starbucks or repairing the server at a porn site or pounding a PC keyboard with sweaty fingers, is the man or woman who will one day lead the real digital revolution. And that person won't arrive on a sealed train from Berlin or at a failed beer hall putsch but on e-mail, in Salon or on CNBC--or he might just mail bombs from a Montana shack.
But wait a minute: What, you ask, does the Unabomber--Ted Kaczynski--have to do with all this? Doesn't he oppose technology? The answer to this question takes us to the most demonic corner of technology absolutism: Mass movements become murderous when they absorb their own contradictions. The descendants of Bavarian tree-huggers, after all, built the slick, mechanistic killing machine of the Third Reich. The atheistic Soviet state called upon the deity of Mother Russia to justify the liquidation of internal enemies. And it was the coffee-drinking intellectuals of the Sorbonne who ended up creating the peasant-run charnel houses of Vietnam and Cambodia.
Kaczynski, then, is only the latest incarnation, a foul-smelling prophet. And he's not anti-technology. On the contrary, like a good engineer, he was intent on making each of his bombs a technological improvement over the one that preceded it ... version 1.0, 1.1, 2.0 ... Each one a more lethal killing toy than the last. He even used the latest media to disseminate his psychotic manifesto--eerily reminiscent in tone of the futurists' manifestos.
No, in the end, Kaczynski is a technoabsolutist, too. Like everyone else, he accepts the inevitability of Moore's Law and the triumph of technology. He just doesn't like it--not, as he claimed, because it's inherently evil but, as he showed with his lifestyle, because he wasn't running it.
Kaczynski offers us the first glimpse of yet another faction beginning to emerge: the technofailures. Their numbers are legion, but their voices are small. They are the millions, even billions, that the technology revolution is leaving behind. In their paler form, they mutter about conspiracies and vote for Ross Perot. In their darker iterations, they join militias or hide in Montana cabins. In the Third World, they sign up with the Shining Path or Hezbollah and slit throats in the name of anti-progress. And they all await the leader who will take them back into the future and restore the power they believe was once theirs. In the meantime, they e-mail each other angry notes about their hatred for technology and use word processors to compose fliers demanding the destruction of machines.
In his now-famous essay of two years ago for the big issue of Forbes ASAP, Tom Wolfe harkened back to the prescient writings of Nietzsche, who predicted a century ago that after we killed God, we would search for him everywhere, especially in science and technology. It was Wolfe's guess that we would eventually destroy everything we believed in making that search, tearing down one institution at a time until we were left with nothing--and only then would we again feel God's presence.
Perhaps Wolfe is right. But in the meantime, the absolutists are hardly ready to admit defeat. Having had one god that failed (Marxism), they have now found a new one on which to pin their hopes and energies: the digital revolution.
And these true believers in the digital deity will be accompanied on their long march to perfection by the hypercapitalists, chasing their own will to power up the sweeping curve of Moore's Law, tossing out the weak and wounded along the way. They will be joined by a third group, this one of outriders--the digital anarchists, who will burn and loot and hack away at every institution along the march's path. And at the front of this vast combined column will be a figure we as yet don't even know, who will be the most famous (and notorious) face of the age--the mandatory screen saver on every display. This warlord will be the smug voice of technofailures everywhere, calling on his armies of the dispossessed and disenfranchised to purge the state of the sinful, to ignite a new cultural revolution. The real Big Brother will have finally arrived, a few decades late, in Oceania.
Two years ago--to enormous controversy in Europe--six historians published The Black Book of Communism. Its thesis was simple but devastating: that the two great totalitarian systems of the 20th century, Leninism/Stalinism/Maoism and fascism/Nazism, were essentially the same. They arose from the same absolutist impulses; they were underpinned by the same pseudoscientific models; they both quickly collapsed into oligarchies; they used the same techniques to brainwash their subjects; and, ultimately, they implemented the same modes of state terror to remain in power.
It has all been said before by political philosopher Hannah Arendt and, most powerfully, in Vasily Grossman's novel Life and Fate, where the commissars and the death camp guards are all but indistinguishable. But The Black Book was the first to tackle the subject in nonfiction after the collapse of the Soviet Union and the brief opening of its state files. As the book showed in devastating detail, beneath those black shirts and SS insignias, the Order of Lenin medals and the Mao jackets, beat the same cold heart.
French communists and intellectuals howled: After all, their totalitarianism had been of the enlightened kind, bent on improving mankind rather than succumbing to the debased racial hatred that underpinned Hitler's national socialism. But The Black Book of Communism effectively destroyed that argument by showing how Stalin and his successors spent 40 years--literally from the day German panzers crossed the Vistula--convincing the world that Nazism and Stalinism were not only different but antithetical to each other. Henceforth, Nazism would be known as the reactionary culmination of evil capitalism (though it had declared itself anti-capitalist), while communism would be the ultimate (if, for now, too pure) achievement of good liberal socialism.
It was an exercise in re-education that has proved wildly effective. We still teach this Stalinist rationalization to our schoolchildren as fact. And thus, while former Nazi leaders took turns dangling from ropes in a Nuremberg gymnasium, unrepentant old Soviet functionaries today drive their Volgas past Lubyanka prison on the way to lunch at the Moscow Mickey D's.
So history isn't fair. Big surprise. Those good Germans marching in the 1871 Hermann Festival in their Visigoth outfits had no idea their parade would end at the gates of Auschwitz. So, too, those Victorian intellectuals sitting around someone's parlor singing "The Internationale" could hardly have foreseen the Gulag Archipelago. And those Italian futurist painters in the first years of this century, with their worship of machinery, couldn't have guessed that the first use of that machinery would be to kill defenseless Ethiopian tribesmen.
None of them could have predicted the horrors they were unleashing. But that doesn't spare them the blame. Their fantasies became our nightmares. The sea of blood washes back in time to splash their hands. They refused to let any practical understanding of human nature intrude on their perfect dreams; they refused to consider that, to paraphrase British newspaperman and novelist Malcolm Muggeridge, the only true human perfection--equality and peace--is found in the graveyard. In that respect, their dreams came true.
What then of today's technoabsolutists? Good intentions may one day prove their greatest crime. After all we've been through in the 20th century, can there be any excuse for yet another quest for human perfection in the 21st? Sure, the spit and sperm and sweat of real human existence is messy and troublesome, especially when compared with the clean, orderly ranks of integrated circuits on a motherboard. And, compared with the sweep of Moore's Law, human "progress" seems like a bad joke, an oxymoron. Yet in the end, it's all we've got. We're doomed to be the toolmaker and never the tool. And the further we stray from a healthy appreciation of our contradictory selves, the more we stretch toward the latest grail of perfection, the more likely we are to leave the back door open to the darker, Dionysian part of our nature.
They're out there waiting: the stepchildren of technoreactionaries such as Gilder, of technoimperialists such as Barlow and Dyson, of technoanarchists such as Postrel, of legions of technofailures, and of technoutopians like Gore and, God help me, myself. All those happy children are now good technofascists, genetically pure technojugen in their chip-embedded brown shirts, marching in lockstep on the Sudetenland of the computer illiterate, the unbrilliant and the imperfect. Singing songs of freedom through technology. Joyfully building the 1,000-year Digital Reich.