Although it seems like ancient history, it hasn’t been that long since economies around the world began to close down in response to the COVID-19 pandemic. Early in the crisis, most people anticipated a quick V-shaped recovery, on the assumption that the economy merely needed a short timeout. After two months of tender loving care and heaps of money, it would pick up where it left off.
It was an appealing idea. But now it is July, and a V-shaped recovery is probably a fantasy. The post-pandemic economy is likely to be anemic, not just in countries that have failed to manage the pandemic (namely, the United States), but even in those that have acquitted themselves well. The International Monetary Fund projects that by the end of 2021, the global economy will be barely larger than it was at the end of 2019, and that the US and European economies will still be about 4% smaller.
In mid-May, Brazil secured a grisly world record: it had the fastest-growing coronavirus infection rate of any country on earth. Within a month, it surpassed a million confirmed cases. This milestone made it second only to the United States in both total cases and total fatalities, with around a thousand people dying every day. By some estimates, Brazil may eventually see as many as thirty-four million infected and three hundred thousand dead.
The country’s far-right President, Jair Bolsonaro, has made no effort to curb the pandemic. Instead, he has belittled the threat of the virus, calling it mere “sniffles,” and responded to reports of sufferers by declaring, “We all have to die someday.” When state governors encouraged social distancing, Bolsonaro joined rallies with supporters to demonstrate against them.
Oliver Stuenkel, an associate professor of international affairs at the Getulio Vargas Foundation, in São Paulo, believes that Bolsonaro’s pandemic response is the result of a brutal calculation. “I think he looked at this and thought, This will cause a profound crisis in the Brazilian economy,” he told me. “He knows it’s hard for a Latin-American leader to remain in office with an economy that gets as bad as it is now. So, in the states where governors imposed social-distancing strictures, he’ll say the coming economic slump wasn’t his fault but theirs. If the numbers level out, he’ll say, ‘Look, it wasn’t that bad after all.’ And even if they are bad, he can easily construe some narrative that actually they really weren’t.” For the moment, Bolsonaro’s P.R. tactics seem to be working; although recent polls show rising disapproval of his performance, about thirty per cent of the population still fervently supports him, as immovable as the fans of his role model Donald Trump.
President Trump is making plain the degree to which the country remains divided by the American Civil War. His threat to veto the $740bn Defence Bill if it renames military bases called after Confederate generals harks back to 1861. His stand highlights the bizarre way that the US military has named its biggest bases, like Fort Bragg in North Carolina and Fort Hood in Texas, after Confederate generals like Braxton Bragg and John Bell Hood, who fought a war to destroy the US.
Critics suggest derisively that this tradition of naming military installations after defeated enemies should mean that future bases will include at least one named after Osama bin Laden, the founder of al-Qaeda, and another after Abu Bakr al-Baghdadi, the leader of Isis, both killed by US soldiers.
The fury generated by the dispute over the renaming of the bases and the removal of the statues of Confederate commanders underlines the contemporary relevance of the outcome of the civil war. A tweet by Trump gives a clue as to why this should be the case a century and a half after the Confederate surrender. “It was sad,” Trump wrote, “to see the history and culture of our great country being ripped apart with the removal of our beautiful statues and monuments.”
On a cold morning in February 2018, a group of 30 microbiologists, zoologists and public-health experts from around the world met at the headquarters of the World Health Organization in Geneva. The group was established by the W.H.O. in 2015 to create a priority list of dangerous viruses — specifically, those for which no vaccines or drugs were already in development. The consensus, at least among those in the room, was that as populations and global travel continued to grow and development increasingly pushed into wild areas, it was almost inevitable that once-containable local outbreaks, like SARS or Ebola, would become global disasters.
“The meeting was in a big room, with all the tables arranged around the edge, facing each other,” one of the group’s members, Peter Daszak, recalled recently. “It was a very formal process. Each person was asked to present the case for including a particular disease on the list of top threats. And everything you say is being taken down, and checked factually, and recorded.”
Daszak, who directs the pandemic-prevention group EcoHealth Alliance and is also chairman of the Forum on Microbial Threats at the National Academies of Sciences, Engineering and Medicine, had been given the task of presenting on SARS, a lethal coronavirus that killed roughly 800 people after it emerged in 2002. (SARS stands for Severe Acute Respiratory Syndrome and is officially known as SARS-CoV-1.) “We’d done a lot of research on coronaviruses, so we knew they were a clear and present danger,” he told me. “High mortality, no drugs or vaccines in the pipeline, with new variants that could still be emerging.”
The discussion, he said, was intense. “Everyone else in the room knows the facts already — they’ve read all the research,” Daszak said. But for each pathogen, the speaker had to convince the room that it presented a significant threat — “that this disease really could take off, and that we should concentrate on it rather than on Lassa fever or something else. So, you argue the case, and then people vote. And sometimes it gets quite heated. I remember that monkeypox was an issue, because there are outbreaks, but there’s really nothing we can do about them. It was a really rigorous, really excellent debate — and then afterward, we went and had fondue.”
The final list — which did contain SARS and MERS, along with seven other respiratory, hemorrhagic or otherwise-lethal viruses — also included something the W.H.O. dubbed “Disease X”: a stand-in for all the unknown pathogens, or devastating variations on existing pathogens, that had yet to emerge. Daszak describes Covid-19, the disease caused by the virus SARS-CoV-2, as exactly the kind of threat that Disease X was meant to represent: a novel, highly infectious coronavirus, with a high mortality rate, and no existing treatment or prevention. “The problem isn’t that prevention was impossible,” Daszak told me. “It was very possible. But we didn’t do it. Governments thought it was too expensive. Pharmaceutical companies operate for profit.” And the W.H.O., for the most part, had neither the funding nor the power to enforce the large-scale global collaboration necessary to combat it.
In the Wall Street Journal, Dana Mattioli and Konrad Putzier speculate that the white-collar workplace as we know it might soon cease to exist.
They cite Twitter’s plan to allow its 5,000 or so staff to work from home indefinitely, along with plans by OpenText Corp to cut more than 50% of its global offices.
“Many executives …” they say, “point to the success of an unprecedented work-from-home experiment, and how little productivity appears to have been impacted after millions of employees in technology, media, finance and other industries have been forced to work remotely for months.”
Yet if we’re to understand why some 74% of corporations, according to one study, now intend to employ at least some of their staff in that way, we should recognise work from home as neither “unprecedented” nor “an experiment” but rather a method of labour organisation crucial to the development of the modern economy.
In the nineteen-sixties, Jack Nilles, a physicist turned engineer, built long-range communications systems at the U.S. Air Force’s Aerial Reconnaissance Laboratory, near Dayton, Ohio. Later, at NASA, in Houston, he helped design space probes that could send messages back to Earth. In the early nineteen-seventies, as the director for interdisciplinary research at the University of Southern California, he became fascinated by a more terrestrial problem: traffic congestion. Suburban sprawl and cheap gas were combining to create traffic jams; more and more people were commuting into the same city centers. In October, 1973, the OPEC oil embargo began, and gas prices quadrupled. America’s car-based work culture seemed suddenly unsustainable.
That year, Nilles published a book, “The Telecommunications-Transportation Tradeoff,” in which he and his co-authors argued that the congestion problem was actually a communications problem. The personal computer hadn’t yet been invented, and there was no easy way to relocate work into the home. But Nilles imagined a system that could ease the traffic crisis: if companies built small satellite offices in city outskirts, then employees could commute to many different, closer locations, perhaps on foot or by bicycle. A system of human messengers and mainframe computers could keep these distributed operations synchronized, replicating the communication that goes on within a single, shared office building. Nilles coined the terms “tele-commuting” and “telework” to describe this hypothetical arrangement.
The satellite-office idea didn’t catch on, but it didn’t matter: over the next decade, advances in computer and network technology leapfrogged it. In 1986, my mother, a COBOL programmer for the Houston Chronicle, became one of the first true remote workers: in a bid to keep her from leaving—she was very good, and had a long commute—the paper set her up with an early-model, monochrome-screen PC, from which she “dialled in” to the paper’s I.B.M. mainframe using a primitive modem, sending screens of code back and forth. “It was very slow,” she told me recently. “You would watch the lines load on the screen, one by one.” The technology wasn’t fast enough for widespread use—hours could pass while the computers synchronized—but the basic template for remote work had been set.
In the following decades, technical advances arrived with increasing frequency. In the nineteen-nineties, during the so-called I.T. revolution, office workers started using networked PCs and teams embraced e-mail and file-sharing. People began spending less time in meetings and on the phone and more time interacting with their computers. As computer prices dropped, many bought comparable machines for their homes, using modems to access the same tools they used at work. In 1994, A.T. & T. held its first “Employee Telecommuting Day”; in 1996, the federal government launched a program to increase remote-work options for its employees. In the early two-thousands, broadband Internet made home connections substantially faster, and, in 2003, a pair of European programmers released Skype, which took advantage of this broadband explosion to enable cheap audio communication. In 2004, they added conference-call capabilities, and, in 2006, video conferencing. By the next year, their program had been downloaded half a billion times.
So many monuments to racism, slavery, and colonialism have been toppled, removed, or slated for removal in the wake of the George Floyd protests that Wikipedia’s army of volunteer editors is keeping a running tally: Robert E. Lee, Jefferson Davis, and a slew of other Confederate generals and notable white supremacists and segregationists; Frank Rizzo, the notoriously racist mayor (“Vote white”) of Philadelphia; even symbolic figures like the Pioneer and Pioneer Mother, formerly of the University of Oregon in Eugene. As I write, word comes that the embarrassing statue of Theodore Roosevelt mounted on a horse and trailed by a Native American man and a black man on foot will be removed from the main entrance of New York’s Museum of Natural History.
Yesterday’s heroes are history’s villains. That nice Pope Francis thought so well of Father Junípero Serra that he canonized him in 2015, despite Native Americans’ objections to Serra’s harsh and coercive missionary work. He’s now the patron saint of California. But protesters in San Francisco and Los Angeles recently tore down his statue. As for Christopher Columbus—19 statues and counting—New York Governor Andrew Cuomo defended his presence in Manhattan’s Columbus Circle. (It “has come to represent and signify appreciation for the Italian American contribution to New York,” he said at a press conference.) But I wouldn’t bet on Chris keeping his pedestal much longer. Maybe Italian Americans could choose another compatriot, someone who brought joy to the world and didn’t massacre and enslave vast numbers of people. Like Verdi or Puccini.
THE FORMATION of Ireland’s new government on June 27th, after 140 days of haggling, brings to office a novel coalition. Not only will the old rivals of Fianna Fail and Fine Gael ally for the first time since the Irish civil war roughly a century ago, but the two parties of the centre-right will join forces with the 39-year-old Green Party. Under the new taoiseach, Micheal Martin, the coalition is promising a green new deal that would slash carbon emissions by 7% a year. Though still rare, once-improbable alliances of climate activists and conservatives are becoming increasingly fashionable in Europe. The covid-19 pandemic could well foster more such coalitions.
“Greencon” alliances are for now marriages of convenience, born of the fragmentation of European politics that is forcing parties of all stripes to contemplate new partnerships. There are areas on which greens and conservatives are unlikely ever to agree, notably defence and foreign policy. Nonetheless both sides have done a lot of evolving in recent years. And the pandemic is painting the political landscape an ever deeper shade of green, which politicians of the centre-right are just as eager to exploit.
Traditionally, greens have been happier with partners to the left of centre. In Germany they joined a “red-green” government led by the Social Democratic Party (SPD) between 1998 and 2005. But in Germany and elsewhere, the greens have overtaken the old centre-left as the appeal of old-style socialism has faded and that of environmentalism has bloomed. Greens who might once have been cranky idealists have become eager to exercise power and to accept the inevitable compromises that come with it.
What a terrible feeling it is to witness, from the front row, the collapse of a country. We knew that we were going to hit the wall head-on, that the shock would be of incredible violence and that not much would be left of post-war Lebanon afterwards; even so, the feelings these events arouse as they unfold are no less powerful. Say what you will, one is never really prepared for the worst.
Lebanon is dying before our eyes, and we are unable to do anything about it. The population is getting poorer. The country is being downgraded. Schools are in danger. Businesses are closing. Young people who can leave are leaving. The “Lebanese-style” way of living is threatened as it never was before, even during the war.
For his first three years of life, Izidor lived at the hospital.
The dark-eyed, black-haired boy, born June 20, 1980, had been abandoned when he was a few weeks old. The reason was obvious to anyone who bothered to look: His right leg was a bit deformed. After a bout of illness (probably polio), he had been tossed into a sea of abandoned infants in the Socialist Republic of Romania.
In films of the period documenting orphan care, you see nurses like assembly-line workers swaddling newborns out of a seemingly endless supply; with muscled arms and casual indifference, they sling each one onto a square of cloth, expertly knot it into a tidy package, and stick it at the end of a row of silent, worried-looking babies. The women don’t coo or sing to them. You see the small faces trying to fathom what’s happening as their heads whip by during the wrapping maneuvers.
In his hospital, in the Southern Carpathian mountain town of Sighetu Marmaţiei, Izidor would have been fed by a bottle stuck into his mouth and propped against the bars of a crib. Well past the age when children in the outside world began tasting solid food and then feeding themselves, he and his age-mates remained on their backs, sucking from bottles with widened openings to allow the passage of a watery gruel. Without proper care or physical therapy, the baby’s leg muscles wasted. At 3, he was deemed “deficient” and transferred across town to a Cămin Spital Pentru Copii Deficienţi, a Home Hospital for Irrecoverable Children.
The cement fortress emitted no sounds of children playing, though as many as 500 lived inside at one time. It stood mournfully aloof from the cobblestone streets and sparkling river of the town where Elie Wiesel had been born, in 1928, and enjoyed a happy childhood before the Nazi deportations.
The windows on Izidor’s third-floor ward had been fitted with prison bars. In boyhood, he stood there often, gazing down on an empty mud yard enclosed by a barbed-wire fence. Through bare branches in winter, Izidor got a look at another hospital that sat right in front of his own and concealed it from the street. Real children, children wearing shoes and coats, children holding their parents’ hands, came and went from that hospital. No one from Izidor’s Cămin Spital was ever taken there, no matter how sick, not even if they were dying.
Like all the boys and girls who lived in the hospital for “irrecoverables,” Izidor was served nearly inedible, watered-down food at long tables where naked children on benches banged their tin bowls. He grew up in overcrowded rooms where his fellow orphans endlessly rocked, or punched themselves in the face, or shrieked. Out-of-control children were dosed with adult tranquilizers, administered through unsterilized needles, while many who fell ill received transfusions of unscreened blood. Hepatitis B and HIV/AIDS ravaged the Romanian orphanages.