What are catastrophic and existential risks? - Part III: Importance and Bibliography
Updated: Apr 15, 2022

Previously, we outlined the catastrophic risks and the existential risks that threaten the long-term future of humanity. In this third article on existential and catastrophic risks, we will make some key observations about the probability of each type of risk. We will also discuss how we can contribute to reducing these risks. Finally, we will list some websites and research institutes where you can read and learn more about this topic.
Not all risks are the same
Although we have mentioned a large number of catastrophic and existential risks, we must not make the grave mistake of thinking that they are all equally likely or equally serious. Some events are potentially hundreds of times more likely than others; some are hundreds of times more dangerous; and for some, our interventions can be a hundred or a thousand times more effective than for others.
Recently, Oxford University philosopher Toby Ord drew on a large body of research across a variety of fields and disciplines to estimate the probabilities of several existential risk scenarios, that is, the probability that each of these events ends up causing the extinction of humanity in the next hundred years. [i]
Existential risk | Approximate probability of causing our extinction in the next 100 years |
Natural risks | |
Asteroid or comet impact | 1 in 1,000,000 |
Supervolcanic eruption | 1 in 10,000 |
Stellar explosion | 1 in 1,000,000,000 |
Total natural risk | 1 in 10,000 |
| |
Anthropogenic risks | |
Nuclear war | 1 in 1,000 |
Climate change | 1 in 1,000 |
Other environmental damage | 1 in 1,000 |
"Naturally" arising pandemics | 1 in 10,000 |
Engineered pandemics | 1 in 30 |
Unaligned artificial intelligence | 1 in 10 |
Unforeseen anthropogenic risks | 1 in 30 |
Other anthropogenic risks | 1 in 50 |
Total anthropogenic risk | 1 in 6 |
| |
Total | |
Total existential risk | 1 in 6 |
What do these statistical approximations teach us? The most important lesson is that man-made (anthropogenic) risks are vastly greater than natural risks: with these figures, more than a thousand times greater. This means that the most likely cause of humanity's extinction is some technology created by humanity itself.
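To see how large this gap is, the sketch below (a rough illustration, not part of Ord's analysis) combines the per-risk odds from the table, under the simplifying assumption that the risks are independent. Ord's published totals are holistic judgments rather than a mechanical combination of the individual rows, so this only roughly reproduces them, but it still shows the anthropogenic risks coming out well over a thousand times more likely than the natural ones.

```python
# Illustrative arithmetic only: the figures are the approximate odds from the
# table above; combining them assumes independence, which Ord does not.

natural_risks = {
    "Asteroid or comet impact": 1 / 1_000_000,
    "Supervolcanic eruption": 1 / 10_000,
    "Stellar explosion": 1 / 1_000_000_000,
}

anthropogenic_risks = {
    "Nuclear war": 1 / 1_000,
    "Climate change": 1 / 1_000,
    "Other environmental damage": 1 / 1_000,
    '"Naturally" arising pandemics': 1 / 10_000,
    "Engineered pandemics": 1 / 30,
    "Unaligned artificial intelligence": 1 / 10,
    "Unforeseen anthropogenic risks": 1 / 30,
    "Other anthropogenic risks": 1 / 50,
}

def at_least_one(probabilities):
    """Probability that at least one of the events occurs, assuming independence."""
    p_none = 1.0
    for p in probabilities:
        p_none *= 1.0 - p
    return 1.0 - p_none

natural = at_least_one(natural_risks.values())               # roughly 1 in 10,000
anthropogenic = at_least_one(anthropogenic_risks.values())   # roughly 1 in 6

print(f"Natural risk:        ~1 in {1 / natural:,.0f}")
print(f"Anthropogenic risk:  ~1 in {1 / anthropogenic:,.0f}")
print(f"Anthropogenic risk is roughly {anthropogenic / natural:,.0f} times greater")
```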