Updated: Apr 15, 2022
Previously, we outlined the catastrophic risks and the existential risks that threaten the long-term future of humanity. In this third article on existential and catastrophic risks, we will make some key observations about the probability of each type of risk and discuss how we can contribute to avoiding them. Finally, we will list some web pages, research institutes, and books for learning more about this topic.
Not all risks are the same
Although we have mentioned a large number of catastrophic and existential risks, we must not make the grave mistake of thinking that they are all equally likely or equally serious. Certain events are potentially hundreds of times more likely than others, and some are hundreds of times more dangerous than others. Finally, there are certain risks where our interventions can be a hundred or a thousand times more effective than in others.
Recently, Oxford University philosopher Toby Ord has drawn on a large body of research across a variety of disciplines to assign approximate probabilities to a range of existential risk scenarios, that is, the probability that each of the following events ends up causing the extinction of humanity in the next hundred years. [i]
Approximate probability that each risk will cause our extinction in the next hundred years:

Asteroid or comet impact: 1 in 1,000,000
Supervolcanic eruption: 1 in 10,000
Stellar explosion: 1 in 1,000,000,000
Total natural risk: 1 in 10,000
Nuclear war: 1 in 1,000
Climate change: 1 in 1,000
Other environmental damage: 1 in 1,000
"Naturally" arising pandemics: 1 in 10,000
Engineered pandemics: 1 in 30
Unaligned artificial intelligence: 1 in 10
Unforeseen anthropogenic risks: 1 in 30
Other anthropogenic risks: 1 in 50
Total anthropogenic risk: 1 in 6
Total existential risk: 1 in 6
What do these estimates teach us? The most important lesson is that man-made (anthropogenic) risks are vastly higher than natural risks. This means that the most likely cause of humanity's extinction is some technology created by humanity itself.
Although there is a risk of extinction by natural means, comparing those risks with the anthropogenic ones shows that we have time to deal with natural risk. If an existentially hazardous meteorite is going to strike our planet in half a million years, humanity may no longer be living on Earth at all, or will probably have developed extremely effective technology to deal with it.
By comparison, we hardly have time, a few decades or at most a few centuries, to deal with anthropogenic risks. If Ord's estimates are roughly correct, humanity has a 1 in 6 chance of extinction every century. This is equivalent to rolling a die every century: rolling 1 through 5 means surviving another century, but rolling a 6 means extinction, killing every single human being on the planet or otherwise destroying all the potential that humanity could have developed for the rest of time. This should motivate us to invest resources and effort in reducing the current level of anthropogenic risk.
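The die analogy can be made concrete. The short sketch below is a simplification that assumes the 1 in 6 per-century risk stays constant and independent over time (an assumption the analogy itself makes, not a claim from Ord): it computes the chance of surviving several consecutive centuries, and the ratio between the anthropogenic and natural risk estimates.

```python
from fractions import Fraction

# Rough per-century estimates from Ord's table
total_risk_per_century = Fraction(1, 6)
natural_risk_per_century = Fraction(1, 10_000)

def survival_probability(k: int) -> Fraction:
    """Probability of surviving k consecutive centuries, assuming
    (as a simplification) the per-century risk stays constant."""
    return (1 - total_risk_per_century) ** k

print(float(survival_probability(1)))    # ~0.83: one century
print(float(survival_probability(10)))   # ~0.16: a full millennium

# Anthropogenic risk exceeds natural risk by three orders of magnitude
print(float(total_risk_per_century / natural_risk_per_century))  # ~1666.7
```

Even under this crude model, a constant 1 in 6 per-century risk leaves only about a 16% chance of surviving the next thousand years, which illustrates why Ord and others treat anthropogenic risk as the dominant concern.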
In addition, we have no evolutionary or cultural mechanisms for learning from these risks. Generally, humanity learns to deal with a problem by trial and error: we make a mistake and draw the lesson for the next time a similar event occurs. But humanity will never be able to learn its lesson after an existential catastrophe, since a single event of this type would end humanity forever.
What can we do?
The situation is even more worrying when we realize that, annually, we spend more money on ice cream than on preventing all these catastrophic and existential risks. In other words, we care more about ice cream than about protecting humanity from extinction. [ii] The international body dedicated to regulating biological weapons has an annual budget of about one and a half million dollars, less than the cost of running an average McDonald's restaurant. [iii]
Therefore, we can safely say that not nearly enough attention is currently being paid to these issues.
How to help prevent catastrophic and existential risks
Here is an outline of some particularly effective methods for reducing catastrophic and existential risks:
Abandon the "trial and error" method for these types of risks and use prospective and statistical methods instead.
Found new institutions to reinforce prevention and resilience in the face of these types of risks, and develop methods, models, and programs to deal with them.
Establish international protocols for the prevention of existential risks. Such protocols could be established as extensions of already recognized rights. For example, since the right to life is recognized as a fundamental human right and is protected by national constitutions, it can serve as a basis for justifying the importance of our survival both as individuals and as a species. International mechanisms for reducing these risks could therefore be justified on the grounds of protecting the right to life.
Work for and donate to institutions dedicated to preventing catastrophic or existential risks. It is crucial to reduce the level of risk by financially supporting risk reduction institutions and programs, and by investing our labor or intellectual energies (our work or our research) in the prevention of these risks.
Vote for and promote political candidates who take these risks into account and aim to reduce them, not increase them. Many politicians have great influence on nuclear safety and climate change prevention, as well as on allocating large amounts of funding to develop safe technologies. Political developments have led the scientists of the Bulletin of the Atomic Scientists to set their Doomsday Clock very close to midnight; it has been adjusted year after year, and today we are closer than ever.
In particular for catastrophic risks, work on response strategies and resilience to such events, for example by supporting institutions dedicated to securing our basic needs (food, water, transportation, electricity) in the event of a catastrophe.
Some pages to read more about existential and catastrophic risks:
Global Catastrophic Risk at Wikipedia.
Existential Risks at 80,000 Hours contains a good summary of existential risks.
Existential-Risk.org, Nick Bostrom's page on existential risks.
There are also several research institutes on these risks, including:
Future of Humanity Institute (FHI), at Oxford, run by philosopher Nick Bostrom. Toby Ord, the author mentioned above, also works here.
Forethought Foundation, in Oxford, run by the philosopher William MacAskill.
Centre for the Study of Existential Risk (CSER), in Cambridge, led by astronomer Martin Rees, among others.
Future of Life Institute (FLI), co-founded by Jaan Tallinn, a co-founder of Skype.
Center on Long-Term Risk, in London.
Some of the most influential books and papers on existential and catastrophic risks include the following:
Toby Ord (2020) The Precipice.
Martin Rees (2003) Our Final Hour.
Richard Posner (2004) Catastrophe: Risk and Response.
Nick Bostrom (ed.) (2008) Global Catastrophic Risks.
John Leslie (1996) The End of the World: The Science and Ethics of Human Extinction.
Nick Bostrom (2002) Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards.
Nick Beckstead (2013) On the Overwhelming Importance of Shaping the Far Future.
Of course, there are also books and articles that analyze a specific existential risk, such as the possibility of nuclear war or misaligned artificial intelligence. We will elaborate on these particular catastrophic and existential risks in the future.
[i] Ord, Toby (2020) The Precipice, p. 167. Other research institutions have produced estimates, but Ord's are more up-to-date and influential.
[ii] Ord, Toby (2020) The Precipice, p. 58.
[iii] Source: Eighth Review Conference of the States Parties to the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction.