Our choice: learn the lesson from COVID or get flattened by the next curve

Climate change and unaligned AI pose a greater existential risk than a pandemic. Could we muster the will to prevent them?

Andrew Chakhoyan
5 min read · Oct 22, 2020


A decade and a half ago, in August of 2005, an unprecedented disaster hit the U.S. Gulf Coast. Even though its impact was more localized than the upheaval we are facing now, Hurricane Katrina caused tremendous devastation and suffering. There wasn’t much we could do to hold off the storm’s landfall, but the maintenance of levees, investment in water management, and early allocation of resources commensurate with the level of risk were surely up to us.

The estimated economic cost of Katrina exceeded $161 billion, or 30 times the budget of the U.S. government agency tasked with disaster mitigation ($5.5 billion). This was one among many warnings the world failed to heed: averting a highly probable calamity is, by definition, something we must do before the event.

The importance of flattening the curve is the lesson we’ve learned from COVID-19. But what about climate change, sweeping environmental degradation, or out-of-control AI? Existential risks abound, and if there is a silver lining in the present crisis, it is the realization that we must think ahead and come together as one humanity to address them effectively. Instead of waiting for the next catastrophe to hit, it is incumbent upon us to work on prevention.

The order-of-magnitude difference that the climate crisis presents, as opposed to a pandemic, is brilliantly visualized by Raf Schoenmaekers.

Surely, reckoning the precise scale of devastation that the warming of our planet will cause, or the total human cost of the coronavirus outbreak, which is far from over, is a fool’s errand. But what’s worse is getting paralyzed by the fear of making a wrong prediction and avoiding comparative analysis altogether.

In an effort to mitigate the economic impact of COVID-19, governments around the world have allocated over $7 trillion. Such prompt and decisive action should be applauded. But there must also be some recognition of, and regret for, the preventive measures not taken.

Moody’s has estimated the potential economic damage from the rising temperatures caused by carbon emissions at ten times the cost of dealing with the coronavirus outbreak.

It is high time to start flattening the curve on the impending climate emergency. And the added bonus is that we could quadruple the returns within a decade. The Global Commission on Adaptation has calculated that “$1.8 trillion in investment by 2030 concentrated in five categories — weather warning systems, infrastructure, dry-land farming, mangrove protection and water management — would yield $7.1 trillion in benefits.”

Fighting COVID-19 with everything we’ve got, driving carbon emissions down, and working on biodiversity preservation would be good places to start. But on a broader scale, preventing a major cataclysm is a challenge of public awareness and resource allocation. With so many issues requiring attention, how do we know which ones to focus on first?

Toby Ord, the author of The Precipice: Existential Risk and the Future of Humanity, has a few ideas. In his assessment, the chances of an existential catastrophe caused by climate change are ten times as high as those of one caused by a naturally arising pandemic. But all other risks pale in comparison with unaligned artificial intelligence, which he estimates is a thousand times more likely than a natural pandemic to end human existence.

Absent robust global cooperation, the race-to-the-bottom dynamic is bound to kick in. If the objective is to develop AI before your rival, the idea of proceeding with caution goes straight out the window.

Unlike a once-in-a-generation pandemic, AI is not going to ‘naturally occur.’ The choice of how things develop appears to be ours, at least for the time being. Many leading thinkers in this space advocate for greater global cooperation and for boosting research into AI safety. As with climate change, the sooner we allocate our attention and resources to this issue, the higher our chances of success.

“We have billions and billions of dollars now invested in making AI more powerful and almost nothing in AI safety research. No governments of the world have said that AI safety research should be an integral part of their computer science funding, and it’s like, why would you fund building nuclear reactors without funding nuclear reactor safety?” This is how Max Tegmark, an MIT professor and co-founder of the Future of Life Institute, characterized the present conundrum in an interview.

Given the entirely different category of risk that unaligned artificial intelligence presents, the question of economic fallout seems irrelevant. Should an intelligence explosion prove to be a reachable milestone, it would pose a truly existential challenge to humankind, one that no amount of ex-post funding could ever address.

When asked about life after COVID-19 on a podcast, Bill Gates summed it up: “I hope that this draws the world together.” I couldn’t agree more. A pandemic knows no borders, and defeating it one country at a time is neither practical nor cost-efficient. We need to move beyond the hope of global cooperation toward decisive action. Climate change, engineered pandemics, out-of-control AI, and other anthropogenic risks all pose a serious threat to human existence. To avoid another Hurricane Katrina, or being taken by surprise when the next pandemic hits, we have to start collectively allocating commensurate resources to mitigate those threats. And we must do it now. If we fail to learn this lesson from the coronavirus, humanity will find itself flattened by the next curve it brings upon itself.

*a version of this article first appeared on the blog.

