Never in our lifetimes, it seems, has there been greater uncertainty about the future — and greater ignorance of the past.
At the beginning of 2020, very few people grasped the significance of the news coming out of Wuhan about a new coronavirus. When I first spoke and wrote publicly about the rising probability of a global pandemic, in the week ending Jan. 26, 2020, I was regarded as eccentric (certainly by the majority of the delegates at the World Economic Forum in Davos, who seemed oblivious to the danger). The conventional wisdom at that time, from Fox News to The Washington Post, was that the coronavirus posed a lesser threat to Americans than the usual winter wave of influenza.
On Feb. 2, I wrote, “We are now dealing with an epidemic in the world’s most populous country, which has a significant chance of becoming a global pandemic. ... The challenge is ... to resist that strange fatalism that leads most of us not to cancel our travel plans and not to wear uncomfortable masks, even when a dangerous virus is spreading exponentially.”
Looking back, I read those sentences as a veiled confession.
I was traveling manically in January and February, as I had done for most of the previous 20 years. In January, I flew from London to Dallas, from Dallas to San Francisco, and from there to Hong Kong (Jan. 8), Taipei (Jan. 10), Singapore (Jan. 13), Zurich (Jan. 19), back to San Francisco (Jan. 24), and then to Fort Lauderdale (Jan. 27). I wore a mask once or twice but found it intolerable after an hour and took it off.
In the course of February, I flew almost as frequently, though not so far: to New York, Sun Valley, Bozeman, Washington and Lyford Cay in the Bahamas. You may wonder what kind of life that was. I used to joke that the lecture circuit had turned me into an “international man of history.” I realized only later that I might have been one of the “superspreaders” whose hyperactive travel schedules were spreading the virus from Asia to the rest of the world.
My weekly newspaper column in the first half of 2020 became a kind of plague diary, though I never mentioned the fact that I was ill for most of February, with a painful cough I could not shake off. (To get through lectures, I relied heavily on the alternative medicine that some call Scotch.)
“Worry about grandparents,” I wrote on Feb. 29; “the mortality rate for people in their eighties is above 14%, whereas it’s close to zero for those under 40.” I omitted the less comforting data on asthmatic men in their mid-50s. I also left out the fact that I went to see a doctor twice only to be told that — as more or less everywhere in the United States at that time — there were no tests available for COVID-19. All I knew was that it was serious, and not only for me and my family.
As I wrote on March 1: “Those who blithely say, ‘This is no worse than the flu’ ... are missing the point. ... Uncertainty surrounds it because it is so difficult to detect in its early stages, when many carriers are both infectious and asymptomatic. We don’t know for sure how many people have it, so we don’t exactly know its reproduction number and its mortality rate. There’s no vaccine and there’s no cure.”
In another article, published in The Wall Street Journal on March 8, I wrote, “If the U.S. turns out to have proportionately as many cases as South Korea, it will soon have some 46,000 cases and more than 300 deaths — or 1,200 deaths if the U.S. mortality rate is as high as Italy’s.”
At that point, total confirmed cases in the U.S. stood at just 541; deaths at 22. We passed 46,000 cases on March 24 and 1,200 deaths on March 25, just over two weeks later. On March 15, I noted, “John F. Kennedy airport was thronged yesterday with people doing what, since time immemorial, they have done in times of plague: fleeing the big city (and spreading the virus). ... We are entering the panic phase of the pandemic.”
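The March 8 projection was a simple per-capita rescaling of South Korea's outbreak onto the American population, with two candidate case-fatality rates applied to the result. A minimal sketch of that back-of-envelope arithmetic follows; the population figures and the approximate South Korean and Italian numbers as of early March 2020 are my assumptions, not figures taken from the column itself.

```python
# Rough reconstruction of the March 8 back-of-envelope projection:
# scale South Korea's confirmed cases to the U.S. population, then
# apply two candidate case-fatality rates (CFRs).
# All inputs below are approximate assumptions, not from the column.

SK_POP = 51_600_000    # South Korea population, 2020 (approx.)
US_POP = 329_500_000   # United States population, 2020 (approx.)
SK_CASES = 7_300       # confirmed cases in South Korea, ~March 8, 2020

scale = US_POP / SK_POP                 # roughly 6.4x
projected_us_cases = SK_CASES * scale   # roughly "some 46,000 cases"

CFR_SK = 0.007      # CFR then observed in South Korea (~0.7%), assumed
CFR_ITALY = 0.026   # CFR then observed in Italy (~2.6%), assumed

deaths_low = projected_us_cases * CFR_SK      # "more than 300 deaths"
deaths_high = projected_us_cases * CFR_ITALY  # "1,200 deaths"

print(round(projected_us_cases), round(deaths_low), round(deaths_high))
```

With these assumed inputs the arithmetic lands close to the column's figures, which is the point: the projection required nothing more than a population ratio and two observed fatality rates.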
That was the same day I myself flew, with my wife and my two youngest children, from California to Montana. We stayed there for a year.
I wrote and thought about little else in the first half of 2020. Why this intense preoccupation? The answer is that, although my core competency is financial history, I have been keenly interested in the role of disease in history ever since studying the Hamburg cholera epidemic of 1892 as a graduate student more than 30 years ago. Richard Evans’ meticulously detailed study of that episode introduced me to the idea that the mortality caused by a deadly pathogen is partly a reflection of the social and political order it attacks. It was the class structure as much as the bacterium Vibrio cholerae that killed people in Hamburg, Germany, Evans argued, because the entrenched power of the city’s property owners had been an insuperable obstacle to improving the city’s antiquated water and sewage systems. The mortality rate for the poor was 13 times higher than for the rich.
Researching “The Pity of War” a few years later, I was struck by statistics that suggested the German army had collapsed in 1918 partly because of a surge of illness, possibly resulting from the Spanish influenza pandemic. “The War of the World” (Penguin, 2006) delved more deeply into the history of the 1918-19 pandemic, showing how the First World War ended with twin pandemics — not only influenza but also the ideological contagion of Bolshevism.
The work I did on empires in the 2000s also involved excursions into the history of contagious disease. No account of European settlement in the New World could have omitted the role that disease played in “thinning the Indians, to make room for the English,” as John Archdale, the governor of Carolina in the 1690s, callously remarked. (The title of the second chapter of my book “Empire” was “White Plague.”)
I was also very struck by the terrible toll of tropical disease on British soldiers stationed far from home: A man’s chances of surviving a tour of duty in Sierra Leone were pitifully low — 1 in 2. “The Great Degeneration” (Penguin, 2014) explicitly warned of our growing vulnerability to “the ... random mutation of viruses like influenza,” while “The Square and the Tower” (Penguin, 2017) was essentially a history of the world based on the insight that “network structures are as important as viruses in determining the speed and extent of a contagion.”
The COVID-19 pandemic is far from over. The death toll worldwide approaches five million, which is certainly an underestimate, as the statistics from a number of large countries (notably Iran and Russia) cannot be trusted. And the cumulative body count continues to rise globally — to say nothing of the number of people whose health has been permanently damaged, which no one has yet estimated. It seems increasingly likely that Lord Martin Rees, Britain’s astronomer royal, has won his bet with the Harvard psychologist Steven Pinker that “bioterror or bioerror will lead to one million casualties in a single event within a six month period starting no later than Dec 31 2020.”
Some epidemiologists have argued that, without drastic social distancing and economic lockdowns, the ultimate death toll could have been between 30 million and 40 million. Because of government restrictions and changes in public behavior, it will surely not be as high.
Yet precisely these “nonpharmaceutical interventions” have inflicted a shock on the world economy far greater than that caused by the 2008-09 financial crisis — potentially as great as the shock of the Great Depression, but compressed into months, not years. Governments had to take drastic fiscal measures to stave off a collapse of the markets.
You see, disasters and their impact are rarely entirely exogenous events, with the exception of a massive meteor strike, which hasn’t happened in 66 million years, or an alien invasion, which hasn’t happened at all. Even a catastrophic earthquake is only as catastrophic as the extent of urbanization along the fault line — or the shoreline, if it triggers a tsunami.
A pandemic is made up of a new pathogen and the social networks that it attacks. We cannot understand the scale of the contagion by studying only the virus itself, because the virus will infect only as many people as social networks allow it to. At the same time, a catastrophe lays bare the societies and states that it strikes. It is a moment of truth, of revelation, exposing some as fragile, others as resilient, and others as “antifragile” — able not just to withstand disaster but to be strengthened by it.
We still find ourselves in the middle of this pandemic, about which it is hard to gain the perspective that hindsight affords. But looking at past pandemics and disasters not only helps provide perspective on the current one but also allows us to understand and prepare for the inevitable next one.
So cometh now
My Lady Influenza, like a star
Inebriously wan, and in her train
Fever, the haggard soul’s white nenuphar,
And lily-fingered Death, and grisly Pain
And Constipation who makes all things vain,
Pneumonyer, Cancer, and Nasal Catarrh.
Rupert Brooke’s “To My Lady Influenza” (1906) was a facetious undergraduate work. Yet “Lady Influenza” was never to be trifled with. The first well-described influenza outbreaks were in 16th-century Europe, but the earliest was probably in 1173.
There had been significant influenza pandemics in the 18th and 19th centuries, but the 20th century was to be hit much harder. A more populous world was also a more urban world and a more mobile world — a world in which low air quality in industrial towns may have made people more susceptible to respiratory illnesses. A year after Brooke wrote “To My Lady Influenza,” his eldest brother, Dick, died of pneumonia at the age of 26. Brooke himself died in 1915, of an infected mosquito bite that led to sepsis, off the Greek island of Skyros.
A predictable “gray rhino,” in the sense that the danger of a general European war was well known, but also a surprising “black swan,” in the sense that contemporaries seemed bewildered by its outbreak, the First World War was a true dragon-king event in terms of its vast historical consequences.
As disastrous as the war was, its proximate impact in terms of lives lost was exceeded by that of the influenza pandemic that broke out in its final year. Where exactly the new strain of H1N1 first appeared is uncertain, but it is usually said to have been Fort Riley, Kansas, the site of Camp Funston, one of the network of Army camps where hundreds of thousands of young American men were being trained to fight in Europe as the American Expeditionary Forces. There is, however, evidence that the pandemic originated in the British Army in 1917, though the condition was initially identified as “purulent bronchitis with bronchopneumonia.”
Here was the key to influenza’s 20th-century success. Never had armies been mobilized on such a scale before — more than 70 million men in uniform. Never had so many young men been taken from their homes and workplaces, crowded into primitive accommodations, and sent over long distances in ships and trains. The idea that the virus originated in pigs has been refuted (an avian origin seems more likely); if anything, the direction of infection was from men to pigs.
The first American cases were recorded at Camp Funston on March 4, when a member of Fort Riley’s catering staff entered the infirmary, followed in the coming days by a stream of infected soldiers. By the end of the month, more than a thousand cases had been recorded and 48 men had died of influenza.
As if to mock the efforts of men to kill one another, the virus spread rapidly across the United States and then crossed to Europe on the jam-packed American troopships. It is possible that the pandemic explains the near doubling of the proportion of German soldiers reporting sick in the summer of 1918, which was a crucial factor in the imperial army’s subsequent collapse. Certainly, we have reports of German prisoners of war with the flu from July. By that time it had reached India, Australia and New Zealand. A few months later, a second and deadlier wave struck all but simultaneously in Brest, France; Freetown, Sierra Leone; and Boston.
The virus made a new landfall in the United States at Boston’s Commonwealth Pier on Aug. 27, 1918, when three cases of influenza appeared on the sick list. Eight cases emerged the next day, and 58 the next, 15 of whom were so ill they were transferred to the U.S. Naval Hospital in Chelsea.
On Sept. 8, influenza arrived at the Army’s Camp Devens. Within 10 days, thousands of feverish patients overwhelmed the camp’s hospitals; within weeks, the morgue was full of blue-tinged, asphyxiated corpses.
Combatant countries sought to suppress the news of the pandemic as potentially harmful to wartime morale; this hardly helped to keep the public informed. The disease came to be known as the Spanish flu because only the largely uncensored press of neutral Spain reported on it with any accuracy.
Between 40 million and 50 million people died as a result of the pandemic, the majority of them suffocated by a lethal accumulation of blood and other fluid in the lungs.
The absolute numbers were highest in India (18.5 million deaths) and China (between 4 million and 9.5 million), but death rates varied widely from place to place. In parts of Cameroon, close to half (44.5%) of the population was wiped out; in Western Samoa, nearly a quarter (23.6%). The worst mortality rates in Europe were in Hungary and Spain (each around 1.2%), with Italy not far behind. By contrast, North America got off lightly: between 0.53% and 0.65% for the United States, 0.61% for Canada.
As these figures imply, the Spanish flu was indifferent as to a country’s combatant status. While its initial spread may have been related to wartime accommodation and transportation, that soon ceased to be true.
In the United States, the deaths of as many as 675,000 people were attributed to the Spanish flu, of which 550,000 were excess deaths (above what would have been expected in that period under normal circumstances). Equivalent mortality in 2020 would have been between 1.8 million and 2.2 million Americans. The Spanish flu killed an order of magnitude more Americans than died in combat in the war (53,402).
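The “equivalent mortality in 2020” figure is the same kind of per-capita rescaling, this time across a century of population growth. A minimal sketch, in which the 1918 and 2020 U.S. population figures are assumed round numbers rather than figures from the text:

```python
# Rescale 1918 influenza deaths to the 2020 U.S. population.
# Population figures are assumed approximations, not from the text.

US_POP_1918 = 103_000_000   # approx. U.S. population, 1918
US_POP_2020 = 330_000_000   # approx. U.S. population, 2020

scale = US_POP_2020 / US_POP_1918   # roughly 3.2x

excess_equiv = 550_000 * scale      # excess deaths, rescaled
attributed_equiv = 675_000 * scale  # all attributed deaths, rescaled

print(round(excess_equiv), round(attributed_equiv))
```

Under these assumptions the rescaled figures come out at roughly 1.8 million and 2.2 million, matching the range given above.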
Ironically, unlike most flu epidemics, but like the war that preceded and spread it, the influenza of 1918 disproportionately killed young adults. Out of 272,500 male influenza deaths in the United States, nearly 49% were aged 20 to 39, whereas only 18% were under five and 13% were over 50. The very young and very old were also (as usual) vulnerable, so that all countries for which age-specific death rates are available recorded a roughly W-shaped age distribution of mortality. Death was not caused by the influenza virus itself so much as by the body’s immunological reaction to the virus.
Perversely, this meant that individuals with the strongest immune systems were more likely to die than those with weaker immune systems. A good illustration of the impact of the pandemic on young adults, as well as a vivid description of the hallucinogenic miseries of the illness itself, can be found in Katherine Anne Porter’s short story “Pale Horse, Pale Rider” (1937), about a wartime romance cut cruelly short by the virus.
In September 2021, COVID-19’s American death toll overtook that of the 1918 flu. And yet, in August 2020, it appeared that, with a regime of mass testing, contact tracing, social distancing and targeted quarantining, a country could contain the spread of SARS-CoV-2, as the virus relied heavily on superspreaders for its transmission and disproportionately sickened or killed people past retirement age. Only in the spring of 2021 did it join the elite of pandemics — the 20 or so in recorded history that killed upward of 0.05% of humanity.
Only a handful of the nations that fought in World War II have lost more people per day to COVID-19 than they lost to the Axis powers. The United States is one of them. This illustrates something profound: all disasters are at some level man-made political disasters, even if they originate with new pathogens.
Politics explained why World War II killed 25 times as many Germans as Americans. Politics also helps to explain why COVID-19 has thus far killed more than eight times as many Americans as Germans.
This plague began as a gray rhino, predicted by many. It struck as a black swan, somehow completely unforeseen. Could it become a world-changing “dragon king”?
As we have seen, disasters of any kind become truly epoch-making events only if their economic, social and political ramifications amount to more than the excess mortality they cause. Could this medium-size disaster nevertheless alter our lives permanently and profoundly? Let me now hazard three guesses.
First, COVID-19 will be to social life what AIDS was to sexual life: It will change our behavior, though by no means enough to avert a significant number of premature deaths. I myself welcome a new age of social distancing, but then I am a natural misanthrope who hates crowds and will not greatly miss hugs and handshakes. Most people, however, will be unable to resist the temptations of post-lockdown gregariousness. There will be unsafe socializing just as there still is unsafe sex, even after more than three decades and 30 million deaths from HIV.
Second, and for that reason, most big cities are not “over.” Do we all now head from Gotham or the Great Wen to the villages, there to cultivate our vegetable gardens in splendid, rustic isolation? Do nearly half of us continue to work from home, as we did during the pandemic, more than three times the prepandemic share? Probably not.
It takes a lot to kill a city. True, just over a century after Thomas Mann wrote “Death in Venice” (1912), Venice is pretty much dead. But it was not cholera that killed it — it was the shifting pattern of international trade. Likewise, COVID-19 will not kill London or New York; it will just make them cheaper, grungier and younger. Some billionaires will not return. Some firms and many families will move to the suburbs or even farther afield. Tax revenues will drop. Crime rates will jump. As Gerald Ford supposedly did in 1975, when the city asked for a federal bailout, another president may tell New York to “drop dead.” San Francisco will lose talent to Austin.
But inertia is a powerful thing. Americans these days relocate less than they used to. Only a third of jobs can really be done at home; everyone else will still need to work in offices, shops and factories. Workplaces will just be different — more spacious and campus-like, as they already are in Silicon Valley. Commuting will no longer involve being packed like sardines on a subway. No more unwelcome intimacies on elevators. Masks over most faces. No more tut-tutting at the hijab and the niqab. Perforce, we are all modest now.
What of the pandemic’s impact on the generational imbalances that had grown so intolerable in many societies by 2020? Was COVID-19 sent by Hebe, the goddess of youth, to emancipate millennials and Generation Z from bearing the fiscal burden of an excessive number of elderly people?
It is tempting to marvel at this ageist virus. No previous pandemic was so discriminating against the elderly and in favor of the young. But in truth, the impact of COVID-19 in terms of excess mortality will probably not be great enough to balance the intergenerational accounts. In the short run, the majority of old people will remain retired; relatively few will die prematurely — hardly any in the most elderly of countries, Japan.
The young, meanwhile, will be the ones struggling to find jobs (other than with Amazon) and struggling almost as much to have fun. An economy without crowds is not a “new normal.” It may be more like the new anomie, to borrow Émile Durkheim’s term for the sense of disconnectedness he associated with modernity. For most young people, the word “fun” is almost synonymous with “crowd.” The era of distancing will be a time of depression in the psychological as well as the economic sense. The gloom has been especially deep for Generation Z, whose university social lives — half the point of college, if not more — were wrecked. They spent yet more time on electronic devices — perhaps an hour a day more than before the pandemic. It did not make them happier.
History tells us to expect the great punctuation marks of disaster in no predictable order. The four horsemen of the Book of Revelation — Conquest, War, Famine and the pale rider Death — gallop out at seemingly random intervals to remind us that no amount of technological innovation can make mankind invulnerable.
Indeed, some innovations — like those fleets of jet airplanes that transported so many infected people from Wuhan to the rest of the world in January 2020 — give the horsemen the opportunity to ride in their slipstream. Yet somehow the riders’ arrival always takes us by surprise.
For a moment, we contemplate the scenario of total extinction. We shelter in place, watching “Contagion” or reading Atwood. Perhaps the black swan becomes a dragon king and turns life upside down. But very rarely. Mostly, for the lucky many, life after the disaster goes on, changed in a few ways but on the whole remarkably, reassuringly, boringly the same. With astonishing speed, we put our brush with mortality behind us and blithely carry on, forgetful of those who were not so lucky, heedless of the next disaster that lies in wait.
Think, if you doubt the truth of this, of Daniel Defoe’s concluding doggerel from his “Journal of the Plague Year”:
A dreadful Plague in London was,
In the Year Sixty Five,
Which swept an Hundred Thousand Souls
Away; yet I alive!
Adapted from “Doom: The Politics of Catastrophe,” copyright 2021 by Niall Ferguson. Used by permission of Penguin Press, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. All rights reserved.