Coronavirus Reveals the Downsides of Urbanization
Crises have a way of shocking us out of complacency to consider how fragile and vulnerable civilization still is, and always will be. In the short run, that means retreating to humanity’s basic survival impulses and taking a triage approach to our priorities. In the long run, however, COVID-19 should prompt some reflection on our vulnerabilities and how to limit them in the future. One in particular bears rethinking: our ever-growing urban concentration and dependence on high-density, centrally managed mass transit.
The relentless march of urbanization, in the United States and around the world, has been underway for a long time. Using the Census Bureau’s expansive definition of an “urban area” as 2,500 or more people, America went from 8.8 percent urban in 1830 to 25.7 percent in 1870, then to a majority in 1920, and up to about two-thirds by the mid-1950s. We were 80 percent urban by 2010. North America is the most urbanized region in the world, but it is not alone in seeing an accelerating trend. The U.N. estimated that, in 2009, half the world’s population lived in urban areas for the first time in human history. Over 4 billion people live in cities today, roughly six times as many as in 1950. In 2000, there were 371 cities of a million or more people in the world; by 2018, that number was 548. The global and American trends go beyond what you would expect simply as the natural outcome of population growth.
There are undoubted advantages to urban life. Concentrating large numbers of people in small areas means larger workforces with more diverse skills, easier access to mass transit, and economies of scale in everything from public services to cultural institutions, such as museums and sports teams. Even the things that let us stay at home — from internet service to grocery and take-out delivery — are easier to get in cities. Those dynamics explain much of why this is a longstanding global phenomenon.
But the dark side of urbanization has always included infectious disease. Humans did not evolve to live in such close proximity. Close physical contact spreads germs, which is why medieval and early-modern cities were so pestilential. London became the first modern city to surpass two million people in the first half of the 1800s, and it suffered terrible outbreaks of cholera (then a disease brand-new to Europe) in the following decades. While sanitation has solved many of the old problems of disease, apartment buildings and mass transit still force people together in much closer quarters than houses and cars do. And today, the most densely packed Western cities face the greatest risk, with Paris and San Francisco taking the extreme step of “shelter-in-place” orders, and New York’s mayor openly pondering the same thing.
Disease is far from the only risk of concentrating people and critical institutions in crowded spaces. Terrorist attacks are disproportionately aimed at cities, where easy targets range from landmark buildings (the World Trade Center) to packed trains (Madrid). Natural disasters and blackouts are more costly and dangerous when they hit the biggest cities. Hurricane Katrina was vastly more damaging to New Orleans than to the rest of the Gulf Coast, in part because many of the city’s residents did not own cars and could not evacuate under their own power.
Historically, nations have also been more vulnerable to internal unrest and external invasion if they relied on one or two large metropolitan areas as a nerve center. In France, for example, taking Paris has always brought the rest of the nation to its knees. From 1789 to 1848, the Parisian mob could bring down governments and impose its will on the countryside. Napoleon, schooled in decisive battles for capital cities, invaded Russia expecting that he could do the same and was startled to find that capturing Moscow earned him nothing while his opponents retreated into Russia’s vast expanse. Baron Haussmann, the great redesigner of Paris in the 1850s and 1860s, rebuilt the city around broad boulevards whose beauty for strolling tourists camouflaged their true purpose: affording long fields of fire for French artillery to clear the streets of unruly Parisians. That reduced the internal threat of mob rule but, ironically, helped German armies end national resistance by capturing Paris in 1871 and 1940.
The American Revolution, in contrast to the French, dragged on for seven years. At one point or another, the then-principal cities of most of the colonies — Massachusetts (Boston), Pennsylvania (Philadelphia), New York (New York), Virginia (Williamsburg, Richmond, and Charlottesville), Rhode Island (Newport), South Carolina (Charleston), Georgia (Savannah), Delaware (Wilmington) and New Jersey (Trenton, Princeton, Perth Amboy, and New Brunswick) — were captured and occupied by the British. But in a country where no one city was essential, none of those captures ended the fight.
A little over four score years later, that was no longer true of America’s rebels. The Confederacy made Richmond its capital, and both combatants bent much of their military strategy around its capture or defense, in large part because the Confederacy’s industrial capacity was so heavily concentrated in one city. Lose the Tredegar Iron Works, lose the war. Robert E. Lee’s army surrendered a week after Richmond fell.
Subway and train systems, for all their benefits, can be incapacitated by attacks on their control systems and stations in ways that cars cannot. Satellite-guided self-driving cars would inherit the same weakness, lacking the human autonomy that frees drivers from the vulnerabilities of centralization. We should likewise reexamine our reliance on over-concentrated satellite and Internet networks. The more distributed our sources of authority, control, communication, and transportation, the less vulnerable we are to any sort of natural or man-made decapitation strike.
The accumulation of jobs in big cities has also undermined the environmental and lifestyle promise of urbanization by producing “suburban sprawl”: large stretches of outlying bedroom communities whose residents must endure long commutes to reach their jobs. Time spent in transit is a deadweight economic and quality-of-life loss, and driving cars to avoid mass transit only adds pollution and traffic accidents. Urban life’s boosters blame this on suburbia, but the underlying cause is that too many jobs are concentrated in small spaces that are prohibitively expensive to live in, especially for families who want the living space suburbia offers. Most suburbanites would gladly keep their houses and ditch the commute to work closer to home. The quarantine life may teach them how much of their jobs could be done without the commute.
The lesson is not that we should do away with cities. Nor could we. Government policy and social attitudes operate only around the margins of large trends such as urbanization, anyway. The majority-agrarian society of Thomas Jefferson’s ideal is not coming back. But after the experience of moving large segments of our workforce to home-based work on short notice, we should give more thought to the benefits of walkable towns and smaller cities where more people can work from or near home among smaller crowds of their fellow man. We should consider reducing our dependence on megacities that concentrate business and life in dense, high-cost, low-social-distance areas. Smaller urban areas can promote not only more individual independence and security, but also greater community of the sort our society has been losing as it increasingly atomizes in larger cities.
The day we need these steps may come sooner than we think. It may already be here.