Saturday, September 25, 2021

Sept. 11 and the Future of American History


 

Article by Niall Ferguson in Bloomberg Opinion


Twenty years after the horrific attacks on New York and Washington, it’s clear that the biggest changes of our time were not ideological or geopolitical, but technological. They were also the hardest to foresee.

 

The public wants prophets. The historian writes stories about the past, but what the public wants is the history of the future. This leads to a paradox. Since the time of Cassandra, the prophet has largely gone unheeded. Yet only the unheeded prophet sees her prophecies fulfilled. If the prophet is heeded, disaster may be averted — and the prophecy negated.

These reflections are prompted by the 20th anniversary of the Sept. 11 terrorist attacks. Before the attacks, there were prophets who foresaw such a disaster, not least Richard A. Clarke, the National Security Council’s counterterrorism adviser. But it was precisely his inability to persuade George W. Bush’s administration of the imminence of al-Qaeda’s attack on America that ensured it happened. Similarly, if more people had been persuaded by Harvard professor Samuel Huntington’s warning in 1993 of a new “clash” between Western and Islamic civilizations after the Cold War, perhaps that clash might have been averted and the prediction proved false. Instead, we had believed an earlier prophet: Francis Fukuyama, who in 1989 had proclaimed “the end of history.”

The disaster of 9/11 was deeply shocking: the ruthless fanaticism of the suicidal hijackers, the suddenness of the World Trade Center’s collapse, the helplessness of most of the victims. On top of the trauma came the uncertainty: What would happen next? We needed new prophecies. If you are in the business of prognostication, it is a good practice — indeed, it is one of the first principles of Philip Tetlock’s “superforecasting” — to look back and see how you did. I had mixed success.

On Sept. 20, 2001, I observed that the U.S. lacked experience of terrorism within its own borders and would likely overreact to the attacks, ignoring the lessons learned by European governments over many decades, and lashing out in ways that might well backfire.

In an essay for the New York Times Magazine, published on Dec. 2, I hazarded four predictions. First, the need for higher levels of domestic security would transform daily life, introducing a range of intrusions and inconveniences that would be persistent. Second, 9/11 would not alter the country’s heavy dependence on imported oil. A new oil shock was coming, I predicted, especially if Osama bin Laden could achieve his goal of toppling the Saudi monarchy. Third, I suggested that the U.S. would likely retaliate for the attacks by expanding its already considerable military presence abroad. I anticipated a “transition of American global power from informal to formal imperialism.” Finally, I foresaw less a clash of civilizations than a continued process of political fragmentation (“From Yugoslavia to Iraq to Afghanistan, what the United States keeps having to confront is not a united Islam but a succession of fractured polities, racked by internecine war”).

Some of this was right and some of it was wrong. I was right that the imperatives of “homeland security” would change our lives in many ways, from endless security lines at airports to a more visible police and paramilitary presence in vulnerable locations. I was right about the way the U.S. would expand its quasi-imperial activities, first in Afghanistan and then in Iraq. And I was right that these interventions would end up being divisive (“the divisions between ethnic and religious groups in the United States — indeed throughout the world — will be even more pronounced”).

The striking defect of my analysis was its failure to take account of technological change. I underestimated the extent to which law enforcement’s antiterrorism measures would be unobtrusive, focusing largely on electronic communications on the internet and cellular networks to track groups like al-Qaeda. I also underestimated the ways in which new technologies such as fracking and improvements in renewable energy would reduce U.S. reliance on imported oil. In other words, I was thinking in 20th-century terms about 21st-century phenomena. That which had to do with the perennial nature of power I got roughly right; that which depended on the disruptive power of technology I got wrong.

Looking back 20 years, we are bound to ask how far the prophecies of that time helped avoid disasters. It has often been observed that 9/11 remains significant precisely for its singularity. It did not happen again. Indeed, successful Islamist terrorist activity in the U.S. declined and more or less ended after 2016. These days, it is often argued that white supremacist and neo-Nazi groups pose a bigger threat than Islamic fanatics.

Why was there only one 9/11? Was it, as many soldiers I have spoken to over the years asserted, that by fighting terrorism in Iraq and Afghanistan they succeeded in protecting their families at home from it? Or, more plausibly, did America’s so-called war on terror lead to higher levels of terrorism in the countries it invaded, magnetically attracting the world’s jihadists? It is tempting to assert, though one cannot prove it, that the other 9/11s were prevented mainly by effective intelligence gathering and counterterrorism activity at home, rather than by expensive military efforts abroad that rapidly shifted from regime change to nation-building to counterinsurgency, at the cost of thousands of American lives and hundreds of thousands of others.

The trends that can kill you, however, are the trends you altogether miss. I failed to see in 2001 that the rise of China would ultimately pose a bigger strategic challenge to the U.S. than Islamic fundamentalism. This was something a shrewder prophet might already have anticipated 10 or even 20 years earlier.


Historical turning points can only be understood with the use of counterfactuals. What if 9/11 had not happened — if Richard Clarke and the other Cassandras had been heeded, and the attackers thwarted? We can be fairly sure Afghanistan would never have been invaded. But Iraq? So cynically was 9/11 used as a pretext for toppling Saddam Hussein that I can only believe another pretext would have been found. In that sense, a world without 9/11 might still have been a world with Abu Ghraib prison. To imagine a world without the Iraq War you need a different counterfactual — the one in which Al Gore wins the 2000 election.

In his autobiography, the Oxford philosopher of history R. G. Collingwood compared historians to “woodsmen.” “The past lives on in the present,” he wrote. “Though incapsulated in it, and at first sight hidden beneath the present’s contradictory and more prominent features, it is still alive and active.” The historian is thus “related to the nonhistorian as the trained woodsman is to the ignorant traveler. ‘Nothing here but trees and grass,’ thinks the traveler, and marches on. ‘Look,’ says the woodsman, ‘there is a tiger in that grass.’ ”

The problem in September 2001 was a different one, however. The political class in Washington was not oblivious to something in the grass, but it thought it saw a huge fire-breathing dragon. Sept. 11 was Pearl Harbor. Al-Qaeda were “Islamo-fascists.” Later, Saddam Hussein was Hitler, and Baghdad in 2003 would be Paris in 1944. In reality, it was just a tiger — the threat of jihadism a mere rounding error compared with the fascist regimes of World War II.

A better analogy than Pearl Harbor might have been the assassination of John F. Kennedy. We cannot be sure that the U.S. wouldn’t have escalated its involvement in Vietnam had Lyndon Johnson remained vice president, of course, just as we cannot be sure about the invasion of Iraq, absent 9/11. But there is more than a faint resemblance: A spectacular, almost sacrilegious domestic act of terror, followed by a great bloodletting abroad.

The Vietnam War led indirectly to the inflation that began in the late 1960s and hit double digits in the 1970s. The war on terror had quite different economic consequences. Recurrent deficits, a monetary policy that disregarded asset bubbles, and a lax regulatory environment led not to inflation but to the danger of deflation. In 2006 and 2007, I foresaw with reasonable precision the cascade of financial disaster that led from subprime mortgages via collateralized debt obligations to a global banking crisis. But I got the post-2008 path of interest rates quite wrong, and I failed to anticipate the “secular stagnation” that seemed to set in during the Barack Obama presidency.  


Looking back on that period, I see a familiar pattern: underestimating the impact of technological change on all our daily lives. When I began writing journalistic prophecies in the late 1980s, I bashed them out on a typewriter. By 1989, I owned an Amstrad word processor and a simple portable device (it was called a Tandy) that could send articles electronically. For the next few years, I grew accustomed to the hideous noises emitted by modems using dial-up connections. Imperceptibly yet rapidly, email took over my correspondence, then my life. Google stealthily replaced the university library. Then came the BlackBerry, then the iPhone. Microsoft programs — Word, Excel and PowerPoint — became the (occasionally broken) windows through which I wrote, calculated and presented.

My articles were published on clunky newspaper websites, while my books were sold on Amazon. Then my writing began to be “smacked down” on unfiltered blogs. In June 2009, encouraged by my teenage children and my publishers (but against my better judgment), I joined Twitter and Facebook. I entered a new world of ad hominem slurs by trolls and bots.

All of this went on literally under my nose. And yet it was not until 2016 that I grasped the simple truth that all the structural changes in the public sphere brought about by the internet were more historically significant than anything else that had happened since the 1980s. The rise of Silicon Valley was more consequential than the fall of the Soviet Union, the fall of the World Trade Center, the fall of Lehman Brothers. How I had written had in fact changed much more profoundly than what I had written about.

We can learn from history because there will always be Collingwood’s tigers in the grass. But we must remember that we see the tiger more easily with binoculars than with the naked eye: technological discontinuity matters at least as much as the eternal historical verities (power corrupts, democracy turns into tyranny via demagogy, and so on). If the global war on terrorism was Vietnam reprised — if the fall of Kabul was the fall of Saigon re-enacted — it matters that the earlier event was broadcast on television and the later one on social media.

As in the 1970s, a localized failure may not matter much in the context of a cold war. In the 1970s, it was the Soviet Union that really overreached — not only by invading Afghanistan, but also by backing communists from Angola to Nicaragua, even as the Soviet economy stagnated. But what was crucial was that the U.S. not only reined in inflation but also directed its strategic resources to the technological frontier. The Soviets failed completely to computerize their economy and their defenses. Their equivalent of Arpanet was stillborn.

Just as World War II had things in common with World War I, Cold War II has things in common with Cold War I: growing ideological antagonism, technological competition, espionage, diplomatic friction, geopolitical confrontation. As Henry Kissinger has said, partly as a consequence of a pandemic that originated in China under circumstances that remain murky, we have already passed from the “foothills” to the “mountain passes” of Cold War II.

Yet there are differences. China’s economy and population are much larger than the Soviet Union’s. China’s economy is much more intertwined with ours than the autarkic Soviet economy ever was. There are vastly more Chinese studying, working and settling in the U.S. today than there were Russians in the 1980s. But the crucial difference is that there are new domains of technological competition. In addition to conventional and nuclear weapons, in addition to medicine and space, today’s superpowers are vying for leadership in green technology, artificial intelligence and quantum computing.

In the wake of the Donald Trump administration’s mishandling of Covid-19 and the Joe Biden administration’s mishandling of the exit from Afghanistan, there is a strong inclination (especially in Europe) to write off the U.S. My suspicion, however, is that China has the deeper structural problems. Its population, steadily aging, is poised to contract — steeply. Its economy is over-reliant on shaky piles of corporate debt that can grow no higher, not to mention coal-burning power stations that are increasingly in the firing line of the climate debate.

China’s one-party system is over-dependent on strident propaganda and surveillance. President Xi Jinping appears to believe that the legitimacy of the Communist Party’s monopoly on power depends on a combination of economic and social engineering at home — more semiconductors, less software; more redistribution, less computer-gaming — and international assertiveness (epitomized by “wolf-warrior” diplomacy). Yet a full-blown confrontation with the U.S. over Taiwan would be immensely risky, not least to China’s financial stability.

The fiasco in Kabul may have dismayed America’s European allies. But how crucial are they in Cold War II, a transpacific not transatlantic conflict? Less, surely, than the other members of the emerging alliance known as the Quad — Australia, India and Japan. Europe plods onward to “ever-closer union,” its latest fiscal program (NextGenerationEU) perhaps mattering less than the departure of Britain, which always resisted such measures. 

And how attainable is the French President Emmanuel Macron’s goal of “strategic autonomy”? How many Europeans understand that their armies would last about as long as the Afghan government’s army did if, like the Afghans, they were suddenly deprived of America’s technologically superior military support? Meanwhile, Europeans face a southern border crisis that already dwarfs America’s. In the next two decades, unlike the rest of the world, Africa must grapple simultaneously with a rapidly rising population and the adverse effects of climate change, even as improving infrastructure (from roads to mobile telephony) makes long-distance migration easier.

In all that lies ahead, the decisive variables will be technological. Those in the know maintain that China is rapidly catching up with the U.S. in AI. There are fears, too, that Chinese scientists could beat their American counterparts to “quantum supremacy.” In financial technology, it is undeniable that China leads the U.S. in electronic payments and central bank digital currency. Yet an important report published last year by Macro Polo revealed the dominance of the U.S. when it comes to attracting AI expertise from the rest of the world, including China. Moreover, China lags behind U.S. allies such as South Korea and Taiwan in high-end semiconductor manufacturing, while the U.S. is ahead of China in building blockchains, cryptocurrencies and decentralized finance. Finally, those who believed Xi’s claim last year that Chinese vaccines would save the world have been disappointed.


A prophecy implies an ending, usually though not always an unhappy one. The right prophecy 20 years ago would thus have been: “America’s invasion of Afghanistan will eventually end in ignominious failure.” And the most satisfying narrative today would seem to be: “America fought terror for 20 years. Terror won.” Now roll the credits. We always want a story to have an end, unhappy or happy, because that is what cinema and literature lead us to expect.

But history is not a movie. History, as one of Alan Bennett’s characters in his play “The History Boys” says, is “just one f---ing thing after another.” True, historical events do sometimes have neat beginnings: Sarajevo 1914 comes to mind, a terrorist attack with far greater consequences than bin Laden ever achieved. And most wars have ends — even cold ones: think Moscow 1991. But most history is open-ended, quite unlike a book or a film — or a game of football or baseball (though those at least have the history-like quality of being unpredictable).

When will the Covid-19 pandemic end? No one can say. Perhaps, as happened with influenza, we shall pass gradually, almost imperceptibly, from pandemic to endemic. The vaccines will be more widely distributed; their efficacy should improve. The waves of hospitalization will be smaller, the excess mortality less. Perhaps decades from now, those who last year insisted that Covid was “just like the flu” will finally be right. They had just forgotten, or never knew, how our encounter with influenza began — with vast and deadly pandemics.

When will Islamic terrorism end? It goes on unabated in several African countries. When will war in Afghanistan end? Perhaps never. In many respects, the search for “closure,” which has become such a central motivation in American private life, is a futile one in the realm of history. As Kissinger memorably observed, each diplomatic success is just an admission ticket to the next crisis.

More encouragingly, each scientific breakthrough has the potential to unlock others. The ends that historians identify (if only to finish their books) are thus somewhat arbitrary, like punctuation marks inserted by a clueless editor into Molly Bloom’s stream of consciousness in James Joyce’s “Ulysses.” There is no end of history; only the history of ends.

One day we may be able to add 2021 to the history of ends, as The End of the Twenty Years’ War on Terror. But not yet. The history of ends cannot be written prematurely, for to say that this war is over is to imply some unspecified future period of peace, starting now. In other words, it’s just another prophecy, albeit an optimistic one. And, even allowing for the surprises sprung by technology, 40 years of prophesying have taught me not to make too many of those.

 

https://www.bloomberg.com/opinion/articles/2021-09-12/niall-ferguson-bad-9-11-predictions-overlooked-the-power-of-tech 

 

