Living Through the Age of Denial in America

Send me a postcard, drop me a line,
Stating point of view.
Indicate precisely what you mean to say
Yours sincerely, Wasting Away.
– the Beatles, “When I’m 64”

I set foot, so to speak, on this planet on July 20, 1944, not perhaps the best day of the century. It was, in fact, the day of the failed German officers’ plot to assassinate Adolf Hitler.

My mother was a cartoonist. She was known in those years as “New York’s girl caricaturist,” or so she’s called in a newspaper ad I still have, part of a war-bond drive in which your sizeable bond purchase was to buy her sketch of you. She had, sometime in the months before my birth, traveled by train, alone, the breadth of a mobilized but still peaceable American continent to visit Hollywood on assignment for some magazine to sketch the stars. I still have, on my wall, a photo of her in that year on the “deck” of a “pirate ship” on a Hollywood lot drawing one of those gloriously handsome matinee idols. Since I was then inside her, this is not exactly part of my memory bank. But that photo does tell me that, like him, she, too, was worth a sketch.

Certainly, it was appropriate that she drew the card announcing my birth. There I am in that announcement, barely born and already caricatured, a boy baby in nothing but diapers – except that, on my head, I’m wearing my father’s dress military hat, the one I still have in the back of my closet, and, of course, I’m saluting. “A Big Hello – From Thomas Moore Engelhardt,” the card says. And thus was I officially recorded entering a world at war.

By then, my father, a major in the U.S. Army Air Corps and operations officer for the 1st Air Commando Group in Burma, had, I believe, been reassigned to the Pentagon. Normally a voluble man, for the rest of his life he remained remarkably silent on his wartime experiences.

I was, in other words, the late child of a late marriage. My father, who, just after Pearl Harbor, at age 35, volunteered for the military, was the sort of figure that the – on average – 26-year-old American soldiers of World War II would have referred to as “pops.”

He, like my mother, departed this planet decades ago, and I’m still here. So think of this as… what? No longer, obviously, a big hello from Thomas Moore Engelhardt, nor – quite yet – a modest farewell, but perhaps a moderately late report from the one-man commission of me on the world of peace and war I’ve passed through since that first salute.

On Imagining Myself as Burnt Toast

Precisely what do I mean to say now that I’m just a couple of weeks into my 65th year on this planet?

Let me start this way: If, on the evening of October 22, 1962, you had told me that, in 2008, America’s most formidable enemy would be Iran, I would have danced a jig. Well, maybe not a jig, but I’ll tell you this: I would have been flabbergasted.

On that October evening, President John F. Kennedy went before the nation – I heard him on radio – to tell us all that Soviet missile sites were just then being prepared on the island of Cuba with “a nuclear strike capability against the Western Hemisphere.” It was, he said, a “secret, swift and extraordinary buildup of communist missiles – in an area well known to have a special and historical relationship to the United States and the nations of the Western Hemisphere.” When fully operational, those nuclear-tipped weapons would reach “as far north as Hudson Bay, Canada, and as far south as Lima, Peru.” I certainly knew what Hudson Bay, far to the north, meant for me.

“It shall be the policy of this nation,” Kennedy added ominously, “to regard any nuclear missile launched from Cuba against any nation in the Western Hemisphere as an attack on the United States, requiring a full retaliatory response upon the Soviet Union.” And he ended, in part, this way: “My fellow citizens: let no one doubt that this is a difficult and dangerous effort on which we have set out. No one can foresee precisely what course it will take or what costs or casualties will be incurred.”

No one could mistake the looming threat: Global nuclear war. Few of us listeners had seen the highly classified 1960 SIOP (Single Integrated Operational Plan) in which the U.S. military had made its preparations for a massive first strike of 3,200 nuclear weapons against the communist world. It was supposed to take out at least 130 cities, with estimated casualties approaching 300 million, but, even without access to that SIOP, we – I – knew well enough what might be coming. After all, I had seen versions of it, perfectly unclassified, in the movies, even if the power to destroy on a planetary scale was transposed to alien worlds, as in that science fiction blockbuster of 1955 “This Island Earth,” or imputed to strange alien rays, or rampaging radioactive monsters. Now, here it was in real life, my life, without an obvious director, and the special effects were likely to be me, dead.

It was the single moment in my life – which tells you much about the life of an American who didn’t go to war in some distant land – when I truly imagined myself as prospective burnt toast. I really believed that I might not make it out of the week, and keep in mind, I was then a freshman in college, just 18 years old and still wondering when life was slated to begin. Between 1939 and 2008, across much of the world, few people could claim to have escaped quite so lightly, not in that near three-quarters of a century in which significant portions of the world were laid low.

Had you, a seer that terrifying night, whispered in my ear the news about our enemies still distant decades away, the Iranians, the… are you kidding?… Iraqis, or a bunch of fanatics in the backlands of Afghanistan and a tribal borderland of Pakistan… well, it’s a sentence that would, at the time, have been hard to finish. Death from Waziristan? I don’t think so.

Truly, that night, if I had been convinced that this was “my” future – that, in fact, I would have a future – I might have dropped to my knees in front of that radio from which Kennedy’s distinctive voice was emerging and thanked my lucky stars; or perhaps – and this probably better fits the public stance of an awkward, self-conscious 18-year-old – I would have laughed out loud at the obvious absurdity of it all. (“The absurd” was then a major category in my life.) Fanatics from Afghanistan? Please?

That we’re here now, that the world wasn’t burnt to a crisp in the long superpower standoff of the Cold War, well, that still seems little short of a miracle to me, a surprise of history that offers hope… of a sort. The question, of course, is: Why, with this in mind, don’t I feel better, more hopeful, now?

After all, if offered as a plot to sci-fi movie directors of that long-gone era – perfectly willing to populate Los Angeles with giant, mutated, screeching ants (Them!), the Arctic with “The Thing From Another World,” and Washington D.C. with an alien and his mighty robot, capable of melting tanks or destroying the planet (“Klaatu barada nikto!”) – our present would surely have been judged too improbable for the screen. They wouldn’t have touched it with a ten-foot pole, and yet that’s what actually came about – and the planet, a prospective cinder (along with us prospective cinderettes) is, remarkably enough, still here.

Or to put this in a smaller, grimmer way, consider the fate of the American military base at Guantanamo – an extra-special symbol of that “special and historical relationship” mentioned by Kennedy between the small island of Cuba and its giant “neighbor” to the northwest. In that address to the nation in 1962, the president announced that he was reinforcing the base, even as he was evacuating dependents from it. And yet, like me in my 65th year, it, too, survived the Cuban Missile Crisis unscathed. Some four decades later, in fact, it was still in such a special and historical relationship with Cuba that the Bush administration was able to use it to publicly establish all its new categories of off-shore injustice – its global mini-gulag of secret prisons, its public policies of torture, detention without charges, disappearance, you name it. None of which, by the way, would the same set of directors have touched with the same pole. Back in the 1950s, only Nazis, members of the Japanese imperial Army, and KGB agents could publicly relish torture on screen. The FOX TV show “24” is distinctly an artifact of our moment.

A Paroxysm of Destruction Only a Few Miles Wide

Of course, back in 1962, even before Kennedy spoke, I could no more have imagined myself 64 than I could have imagined living through “World War IV” – as one set of neocons loved to call the President’s Global War on Terror – a “war” to be fought mainly against thousands of Islamist fanatics scattered around the planet and an “axis of evil” consisting of three relatively weak regional powers. I certainly expected bigger, far worse things. And little wonder: When it came to war, the full weight of the history of most of the last century pointed exponentially in the direction of a cataclysm with few or no survivors.

From my teen years, I was, you might say, of the Tom Lehrer school of life (as in the lyrics from his 1959 song, “We Will All Go Together When We Go”) – and I was hardly alone:

We will all fry together when we fry.
We’ll be french fried potatoes by and by.
There will be no more misery
When the world is our rotisserie.
Yes, we will all fry together when we fry…

And we’ll all bake together when we bake,
There’ll be nobody present at the wake.
With complete participation
In that grand incineration,
Nearly three billion hunks of well-done steak.

I was born, after all, just a year and a few weeks before the United States atomically incinerated Hiroshima and then followed up by atomically obliterating the city of Nagasaki, and World War II ended. Victory arrived, but amid scenes of planetary carnage, genocide, and devastation on a scale and over an expanse previously unimaginable.

In these last years, the Bush administration has regularly invoked the glories of the American role in World War II and of the occupations of Germany and Japan that followed. Even before then, Americans had been experiencing something like a “greatest generation” fest (complete with bestselling books, a blockbuster movie, and two multi-part greatest-gen TV mini-series). From the point of view of the United States, however, World War II was mainly a “world” war in the world that it mobilized, not in the swath of the planet it turned into a charnel house of destruction. After all, the United States (along with the rest of the “New World”) was left essentially untouched by both “world” wars. North Africa, the Middle East, and New Guinea all suffered incomparably more damage. Other than a single attack on the American fleet at Hawaii, thousands of miles from the U.S. mainland, on December 7, 1941, the brief Japanese occupation of a couple of tiny Aleutian islands off Alaska, a U-boat war off its coasts, and small numbers of balloon fire bombs that drifted from Japan over the American west, this continent remained peaceable and quite traversable by a 35-year-old theatrical caricaturist in the midst of wartime.

For Americans, I doubt that the real import of that phrase World War – of the way the industrial machinery of complete devastation enveloped much of the planet in the course of the last century – ever quite came home. There had, of course, been world, or near-world, or “known world” wars in the past, even if not thought of that way. The Mongols, after all, had left the steppes of northeastern Asia and conquered China, turned back from Japan only by the original kamikaze (“divine wind”), the typhoons that repelled the Mongol fleets in 1274 and again in 1281. Mongol horsemen, however, made their way west across the Eurasian continent, conquering lands and wreaking havoc, reaching the very edge of Europe while, in 1258, sacking and burning Baghdad. (It wouldn’t happen again until 2003.) In the eighteenth and early nineteenth centuries, the British and French fought something closer to a “world war,” serial wars, actually, in and around Europe, in North Africa, in their New World colonies, and even as far away as India, as well as at sea wherever their ships ran across one another.

Still, while war may have been globalizing, it remained, essentially, a locally or regionally focused affair. And, of course, in the decades before World War I, it was largely fought on the global peripheries by European powers testing out, piecemeal, the rudimentary industrial technology of mass slaughter – the machine gun, the airplane, poison gas, the concentration camp – on no one more significant than benighted “natives” in places like Iraq, the Sudan, or German Southwest Africa. Those locals – and the means by which they died – were hardly worthy of notice until, in 1914, Europeans suddenly, unbelievably, began killing other Europeans by similar means and in staggering numbers, while bringing war into a new era of destruction. It was indeed a global moment.

While the American Civil War had offered a preview of war, industrial-style, including trench warfare and the use of massed firepower, World War I offered the first full-scale demonstration of what industrial warfare meant in the heartlands of advanced civilization. The machine gun, the airplane, and poison gas arrived from their testing grounds in the colonies to decimate a generation of European youth, while the tank, wheeled into action in 1916, signaled a new world of rapid arms advances to come. Nonetheless, that war – even as it touched the Middle East, Africa, and Asia – wasn’t quite imagined as a “world war” while still ongoing. At the time, it was known as the Great War.

Though parts of Tsarist Russia were devastated, the most essential, signature style of destruction was anything but worldwide. It was focused – like a lens on kindling – on a strip of land that stretched from the Swiss border to the North Sea, running largely through France, and most of the time not more than a few miles wide. There, on “the Western Front,” for four unbelievable years, opposing armies fought – to appropriate an American term from the Vietnam War – a “meat grinder” of a war of a kind never seen before. “Fighting,” though, hardly covered the event. It was a paroxysm of death and destruction.

That modest expanse of land was bombarded by many millions of shells, torn up, and thoroughly devastated. Everything built on it, or growing upon it, was leveled, and, in the process, millions of young men – many tens of thousands on single days of “trench warfare” – were mercilessly slaughtered. After those four unbearably long years, the Great War ended in 1918 with a whimper and in a bitter peace in the West, while, in the East, amid civil war, the Bolsheviks came to power. The semi-peace that followed turned out to be little more than a two-decade armistice between bloodlettings.

We’re talking here, of course, about “the war to end all wars.” If only.

World War II (or the ever-stronger suspicion that it would come) retrospectively put that “I” on the Great War and turned it into the First World War. Twenty years later, when “II” arrived, the world was industrially and scientifically prepared for new levels of destruction. That war might, in a sense, be imagined as the extended paroxysm of violence on the Western Front scientifically intensified – after all, air power had, by then, begun to come into its own – so that the sort of scorched-earth destruction on that strip of trench-land on the Western Front could now be imposed on whole countries (Japan), whole continents (Europe), almost inconceivable expanses of space (all of Russia from Moscow to the Polish border where, by 1945, next to nothing would remain standing). Where there had once been “civilization,” after the second global spasm of sustained violence little would be left but bodies, rubble, and human scarecrows striving to survive in the wreckage. With the Nazi organization of the Holocaust, even genocide would be industrialized, and the poison gas of the previous World War would be put to far more efficient use.

This was, of course, a form of “globalization,” though its true nature is seldom much considered when Americans highlight the experiences of that greatest generation. And no wonder. Except for those soldiers fighting and dying abroad, it simply wasn’t experienced by Americans. It’s hard to believe now that, in 1945, the European civilization that had experienced a proud peace from 1871 to 1914 while dominating two-thirds of the planet lay in utter ruins; that it had become a site of genocide, its cities reduced to rubble, its fields laid waste, its lands littered with civilian dead, its streets flooded by refugees: a description that, in recent times, would fit only a place like Chechnya or perhaps Sierra Leone.

Of course, it wasn’t the First or Second, but the Third “World War” that took up almost the first half-century of my own life, and that, early on, seemed to be coming to culmination in the Cuban Missile Crisis. Had the logic of the previous wars been followed, a mere two decades after the “global,” but still somewhat limited, devastation of World War II, war’s destruction would have been exponentially upped once again. In that brief span, the technology – in the form of A- and H-bombs, and the air fleets to go with them, and of nuclear-tipped intercontinental ballistic missiles – was already in place to transform the whole planet into a version of those few miles of the Western front, 1914-1918. After a nuclear exchange between the superpowers, much of the world could well have been burnt to a crisp, many hundreds of millions or even billions of people destroyed, and – we now know – a global winter induced that might conceivably have sent us in the direction of the dinosaurs.

The logic of war’s developing machinery seemed to be leading inexorably in just that direction. Otherwise, how do you explain the way the United States and the Soviet Union, long after both superpowers had the ability to destroy all human life on Planet Earth, simply could not stop upgrading and adding to their nuclear arsenals until the U.S. had about 30,000 weapons sometime in the mid-1960s, and the Soviets about 40,000 in the 1980s? It was as if the two powers were preparing for the destruction of many planets. Such a war would have given the fullest meaning to “world,” and no ocean, no line of defenses, would have left any continent, any place, out of the mix. This is what World War III, whose name would have had to be given prospectively, might have meant (and, of course, could still mean).

Or think of the development of “world war” over the twentieth century another way. It was but a generation, no more, from the first flight of the Wright brothers at Kitty Hawk to the 1,000-bomber raid. In 1903, one fragile plane flies 120 feet. In 1911, an Italian lieutenant in another only slightly less fragile plane, still seeming to defy some primordial law, drops a bomb on an oasis in North Africa. In 1944 and 1945, those 1,000-plane air armadas take off to devastate German and Japanese cities.

On August 6, 1945, all the power of those armadas was compacted into the belly of a lone B-29, the Enola Gay, which dropped its single bomb on Hiroshima, destroying the city and many of its inhabitants. All this, again, took place in little more than a single generation. In fact, Paul Tibbets, who piloted the Enola Gay, was born only 12 years after the first rudimentary plane took to the air. And only seven years after Japan surrendered, the first H-bomb was tested, a weapon whose raw destructive power made the bomb that destroyed Hiroshima look like a mere bagatelle.

Admittedly, traces of humanity remained everywhere amid the carnage. After all, the plane that carried that first bomb was named after Tibbets’s mother, and the bomb itself dubbed “Little Boy,” as if this were a birthing experience. The name of the second plane, Bockscar, was nothing but a joke, a play on the name of its usual pilot, Frederick Bock, who didn’t even fly it that day, and on a railroad “boxcar.” But events seemed to be pushing humanity toward the inhuman, toward the transformation of the planet into a vast Death Camp, toward developments which no words, not even “world war,” seemed to capture.

Entering the Age of Denial

It was, of course, this world of war from which, in 1945, the United States emerged triumphant. The Great Depression of the 1930s would, despite wartime fears to the contrary, not reappear. On a planet many of whose great cities were now largely rubble, a world of refugee camps and privation, a world destroyed (to steal the title of a book on the dropping of the atomic bomb), the U.S. was untouched.

The world war had, in fact, leveled all its rivals and made the U.S. a powerhouse of economic expansion. That war and the atomic bomb had somehow ushered in a golden age of abundance and consumerism. All the deferred dreams and desires of depression and wartime America – the washing machine, the TV set, the toaster, the automobile, the suburban house, you name it – were suddenly available to significant numbers of Americans. The U.S. military began to demobilize and the former troops returned not to rubble, but to new tract homes and G.I. Bill educations.

The taste of ashes may have been in global mouths, but the taste of nectar (or, at least, Coca Cola) was in American ones. And yet all of this was shadowed by our own “victory weapon,” by the dark train of thought that led quickly to scenarios of our own destruction in newspapers and magazines, on the radio, in movies, and on TV (think, “The Twilight Zone”), as well as in a spate of novels that took readers beyond the end of the world and into landscapes involving irradiated, hiroshimated futures filled with “mutants” and survivalists. The young, with their own pocket money to spend just as they pleased for the first time in history – teens on the verge of becoming “trend setters” – found themselves plunged into a mordant, yet strangely thrilling world, as I’ve written elsewhere, of “triumphalist despair.”

At the economic and governmental level, the 24/7 world of sunny consumerism increasingly merged with the 24/7 world of dark atomic alerts, of ever vigilant armadas of nuclear-armed planes ready to take off on a moment’s notice to obliterate the Soviets. After all, the peaceable giants of consumer production now doubled as the militarized giants of weapons production. A military Keynesianism drove the U.S. economy toward a form of consumerism in which desire for the ever larger car and missile, electric range and tank, television console and atomic submarine was wedded in single corporate entities. The companies – General Electric, General Motors, and Westinghouse, among others – producing the icons of the American home were also major contractors developing the weapons systems ushering the Pentagon into its own age of abundance.

In the 1950s, then, it seemed perfectly natural for Charles Wilson, president of General Motors, to become secretary of defense in the Eisenhower administration, just as retiring generals and admirals found it natural to move into the employ of corporations they had only recently employed on the government’s behalf. Washington, headquarters of global abundance, was also transformed into a planetary military headquarters. By 1957, 200 generals and admirals as well as 1,300 colonels or naval officers of similar rank, retired or on leave, worked for civilian agencies, and military funding spilled over into a Congress that redirected its largesse to districts nationwide.

Think of all this as the beginning not so much of the American (half) Century, but of an American Age of Denial that lasted until… well, I think we can actually date it… until September 11, 2001, the day that “changed everything.” Okay, perhaps not “everything,” but, by now, it’s far clearer just what the attacks of that day, the collapse of those towers, the murder of thousands, did change – and just how terrible, how craven, but, given our previous history, how unsurprising the response to it actually was.

Those dates – 1945-2001 – bracket 56 years in which life was organized, to a significant degree, to safeguard Americans from an “atomic Pearl Harbor,” from the thought that two great oceans were no longer protection enough for this continent, that the United States was now part of a world capable of being laid low. In those years, the sun of good fortune shone steadily on the U.S. of A., even as American newspapers, just weeks after Hiroshima, began drawing concentric circles of destruction around American cities and imagining their future in ruins. Think of this as the shadow story of that era, the gnawing anxiety at the edge of abundance, like those memento mori skulls carefully placed amid cornucopias in seventeenth-century Dutch still-life paintings.

In those decades, the “arms race” never abated, not even long after both superpowers had a superabundant ability to take each other out. World-ending weaponry was being constantly “perfected” – MIRVed, put on rails, divided among the legs of a land, sea, and air “triad,” and, of course, made ever more powerful and accurate. Nonetheless, Americans, to borrow Herman Kahn’s famous phrase, preferred most of the time not to think too much about “the unthinkable” – and what it meant for them.

As the 1980s began, however, in a surge of revulsion at decades of denial, a vast anti-nuclear movement briefly arose – in 1982, three-quarters of a million people marched against such weaponry in New York City – and President Ronald Reagan responded with his lucrative (for the weapons industry) fantasy scheme of lofting an “impermeable shield” against nuclear weapons into space, his “Star Wars” program. And then, in an almost-moment as startling as it was unexpected, in 1986, in Reykjavik, Iceland, Reagan and Soviet leader Mikhail Gorbachev almost made such a fantasy come true, not in space, but right here on planet Earth. They came to the very “brink” – to use a nuclear-crisis term of the time – of a genuine program to move decisively down the path to the abolition of such weapons. It was, in some ways, the most hopeful almost-moment of a terrible century and, of course, it failed.

Thanks largely, however, to one man, Gorbachev, who consciously chose a path of non-violence, after four decades of nuclear standoff in a fully garrisoned MAD (mutually assured destruction) world – and to the amazement, even disbelief, of official Washington – the USSR simply disappeared, and almost totally peaceably at that.

You could measure the era of denial up to that moment both by the level of official resistance to recognizing this obvious fact and by the audible sigh of relief in this country. Finally, it was all over. It was, of course, called “victory,” though it would prove anything but.

And only then did the MADness really begin. Though there was, in the U.S., modest muttering about a “peace dividend,” the idea of “peace” never really caught hold. The thousands of weapons in the U.S. nuclear arsenal, which had seemingly lost their purpose and whose existence should have been an embarrassing reminder of the Age of Denial, were simply pushed further into the shadows and largely ignored or forgotten. Initially assigned no other tasks, and without the slightest hiccup of protest against them, they were placed in a kind of strategic limbo and, like the madwoman in the attic, went unmentioned for years.

In the meantime, it was clear by century’s end that the “peace dividend” would go largely to the Pentagon. At the very moment when, without the Soviet Union, the U.S. might have accepted its own long-term vulnerability and begun working toward a world in which destruction was less obviously on the agenda, the U.S. government instead embarked, like the Greatest of Great Powers (the “new Rome,” the “new Britain”), on a series of neocolonial wars on the peripheries. It began building up a constellation of new military bases in and around the oil heartlands of the planet, while reinforcing a military and technological might meant to brook no future opponents. Orwell’s famous phrase from his novel 1984, “war is peace,” was operative well before the second Bush administration entered office.

Call this a Mr. Spock moment, one where you just wanted to say “illogical.” With only one superpower left, the American Age of Denial didn’t dissipate. It only deepened, and any serious assessment of the real planet we were all living on was carefully avoided.

In these years, the world was essentially declared to be “flat” and, on that “level playing field,” it was, we were told, gloriously globalizing. This official Age of Globalization – you couldn’t look anywhere, it seemed, and not see that word – was proclaimed another fabulously sunny era of wonder and abundance. Everyone on the planet would now wear Air Jordan sneakers and Mickey Mouse T-shirts, eat under the Golden Arches, and be bombarded with “information”… Hurrah!

News was circling the planet almost instantaneously in this self-proclaimed new Age of Information. (Oh yes, there were many new and glorious “ages” in that brief historical span of self-celebration.) But with the Soviet Union in the trash bin of history – forget that Russia, about to become a major energy power, still held onto its nuclear forces – and the planet, including the former Soviet territories in Eastern Europe and Central Asia, open to “globalizing” penetration, few bothered to mention that other nexus of forces which had globalized in the previous century: the forces of planetary destruction.

And Americans? Don’t think that George W. Bush was the first to urge us to “sacrifice” by spending our money and visiting Disney World. That was the story of the 1990s, and it represented the deepest of all denials, a complete shading of the eyes from any reasonably possible future. If the world was flat, then why shouldn’t we drive blissfully right off its edge? The SUV, the subprime mortgage, the McMansion in the distant suburb, the 100-mile commute to work… you name it, we did it. We paid the price, so to speak.

And while we were burning oil and spending money we often didn’t have, and at prodigious rates, “globalization” was slowly making its way to the impoverished backlands of Afghanistan.

A Fierce Rearguard Action for Denial

This, of course, brings us almost to our own moment. To the neocons, putting on their pith helmets and planning their Project for a New American Century (meant to be just like the old nineteenth century, only larger, better, and all-American), the only force that really mattered in the world was the American military, which would rule the day, and the Bush administration, initially made up of so many of them, unsurprisingly agreed. This would prove to be one of the great misreadings of the nature of power in our world.

Since what’s gone before in this account has been long, let me make this – our own dim and dismal moment – relatively short and sweet. On September 11, 2001, the Age of Denial ended in the “mushroom cloud” of the World Trade Center. It was no mistake that, within 24 hours, the site where the towers had gone down was declared to be “Ground Zero,” a term previously reserved for an atomic explosion. Of course, no such explosion had happened, nor had an apocalypse of destruction actually occurred. No city, continent, or planet had been vaporized, but for Americans, secretly waiting all those decades for their “victory weapon” to come home, it briefly looked that way.

The shock of discovering for the first time and in a gut way that the continental United States, too, could be at some planetary epicenter of destruction was indeed immense. In the media, apocalyptic moments – anthrax, plagues, dirty bombs – only multiplied and most Americans, still safe in their homes, hunkered down in fear to await various doom-laden scenarios that would never happen. In the meantime, other encroaching but unpalatable globalizing realities, ranging from America’s “oil addiction” to climate change, would continue to be assiduously ignored. In the U.S., this was, you might say, the real “inconvenient truth” of these years.

The response to 9/11 was, to say the least, striking – and craven in the extreme. Although the Bush administration’s Global War on Terror (aka World War IV) has been pictured many ways, it has never, I suspect, been seen for what it most truly may have been: a desperate and fierce rearguard action to extend the American Age of Denial. We would, as the President urged right after 9/11, show our confidence in the American system by acting as though nothing had happened and, of course, paying that visit to Disney World. In the meantime, as “commander-in-chief” he would wall us in and fight a “global war” to stave off the forces threatening us. Better yet, that war would once again be on their soil, not ours, forever and ever, amen.

The motto of the Bush administration might have been: Pay any price. Others, that is, would pay any price – disappearance, torture, false imprisonment, death by air and land – for us to remain in denial. A pugnacious and disastrous “war” on terrorism, along with sub-wars, dubbed “fronts” (central or otherwise), would be pursued to impose our continuing Age of Denial by force on the rest of the planet (and soften the costs of our addiction to oil). This was to be the new Pax Americana, a shock-and-awe “crusade” (to use a word that slipped out of the President’s mouth soon after 9/11) launched in the name of American “safety” and “national security.” Almost eight years later, as in the present presidential campaign of 2008, these remain the idols to which American politicians, the mainstream media, and assumedly many citizens continue to do frightened obeisance.

The message of 9/11 was, in truth, clear enough – quite outside the issue of who was delivering it for what purpose. It was: Here is the future of the United States; try as you might, like it or not, you are about to become part of the painful, modern history of this planet.

And the irony that went with it was this: The fiercer the response, the more we tried to force the cost of denial of this central reality on others, the faster history – that grim shadow story of the Cold War era – seemed to approach.

Postcard from the Edge

What I’ve written thus far hasn’t exactly been a postcard. But if I were to boil all this down to postcard size, I might write:

Here’s our hope: History surprised us and we got through. Somehow. In that worst of all centuries, the last one, the worst didn’t happen, not by a long shot.

Here’s the problem: It still could happen – and, 64 years later, in more ways than anyone once imagined.

Here’s a provisional conclusion: And it will happen, somehow or other, unless history surprises us again, unless, somehow or other, we surprise ourselves and the United States ends its age of denial.

And a little p.s.: It’s not too late. We – we Americans – could still do something that mattered when it comes to the fate of the Earth.

[Note for Readers: Those of you interested in more on these topics might check out The End of Victory Culture, my history of the Cold War Age of Denial, in its latest updated edition. I certainly stole from it for this piece and it’s guaranteed to take you on a mad gallop through the various strangenesses of American life, emphasizing popular culture, from 1945 to late last night. It’s a book that Juan Cole has labelled a “must read” and that Studs Terkel called “as powerful as a Joe Louis jab to the solar plexus.”

On another “front,” back in 1982, Jonathan Schell first took up the (nuclear) fate of the Earth in his bestselling book of the same name. He’s never put the subject down. He returned to it most recently and tellingly in The Seventh Decade: The New Shape of Nuclear Danger, the paperback of which is due to be published this September. I am deeply indebted to him for the development of my own thinking on the subject.

On this piece, my special thanks as well to Christopher Holmes for help above and beyond the call of duty.]

Copyright 2008 Tom Engelhardt

Author: Tom Engelhardt

An editor in publishing for the last 25 years, Tom Engelhardt is the author of The End of Victory Culture, a history of American triumphalism in the Cold War era, now out in a revised edition with a new preface and afterword, and Mission Unaccomplished, TomDispatch Interviews With American Iconoclasts and Dissenters. He is at present consulting editor for Metropolitan Books, a fellow of the Nation Institute, and a teaching fellow at the journalism school of the University of California, Berkeley. This article originally appeared at TomDispatch.com.