Global Catastrophic and Existential Risks

According to the blog post schedule that I published when I launched my Race to 40! challenge, today’s post is supposed to be related to the novella I am writing, “2030 ET: Tribulation”.

 

In last week’s 2030 ET related post, I revealed that the “ET” in the title of the novella stands for “End Times”. There are two reasons the novella (hopefully to become a book, if there is sufficient demand) has “End Times” in the title: the first is that the protagonist’s father, Solomon Raind, believes that the Biblical End Times will begin in 2030. The second is that the protagonist, Arthur Raind, believes that various global catastrophic and existential risks facing the world are coming to a head and will manifest themselves during the 2030s.

 

As such, today’s post will be a primer on “global catastrophic and existential risks.”

 

What are Global Catastrophic and Existential Risks?

 

The Wikipedia page on “global catastrophic risks” has the following definition: “A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk.”

 

Philosopher Nick Bostrom, a leading authority on the subject, classifies risks according to their scope and intensity.

Global Catastrophic Risks: Scope vs. Intensity Classification

 

A “global catastrophic risk” is any risk that is at least “global” in scope and is “endurable” in intensity. Those that are at least “trans-generational” (affecting all future generations) in scope and “terminal” in intensity are classified as existential risks.

 

Another definition, from the Global Challenges Foundation, is any risk that could kill at least 10% of the human population. No points for guessing what percentage of the population dies in an existential risk.

 

Another vitally important parameter is the probability of the risk occurring. A macabre approach to ranking the risks could be based on expected fatalities: the number of fatalities if the event occurred multiplied by the probability of the event occurring.
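The expected-fatalities ranking described above can be sketched in a few lines of Python. The risk names and numbers below are purely illustrative placeholders I have made up, not figures from any survey:

```python
# A minimal sketch of ranking risks by expected fatalities:
# expected fatalities = fatalities if the event occurs * probability of occurring.
# All names and numbers are hypothetical, for illustration only.
risks = {
    "risk A": (1_000_000_000, 0.05),  # (fatalities if it occurs, probability)
    "risk B": (100_000_000, 0.30),
    "risk C": (7_000_000_000, 0.01),
}

# Compute expected fatalities for each risk and sort from worst to least bad.
ranked = sorted(
    ((name, deaths * prob) for name, (deaths, prob) in risks.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, expected in ranked:
    print(f"{name}: {expected:,.0f} expected fatalities")
```

Note how the ranking can be counter-intuitive: a low-probability event with enormous fatalities (“risk C”) can still dominate a far more likely but smaller event.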

 

According to the Global Catastrophic Risks Survey conducted by the “Future of Humanity Institute” in 2008, there is a 19% chance of human extinction over the next century. Take a look at the probabilities the experts assigned to these existential risks:

Existential Risk Probabilities

 

There are other exogenous risks that are not included in the above table (e.g. a super-volcano erupting, an asteroid impact) which are discussed in the Global Priorities Project annual report (see link at the end of this post).

 

Arthur Raind is particularly concerned about four risks in 2030, two existential and two catastrophic: 1) super-intelligent AI, 2) engineered pandemics, 3) severe climate change, and 4) food scarcity (caused by ocean ecology collapse).

 

What do you think? Are there any risks missing from the above table? Do you agree with the probabilities assigned to the risks? Leave a comment!

 

Bonus Reference: Download the “Global Challenges Foundation” annual report here to read more about these risks that face us: http://globalprioritiesproject.org/wp-content/uploads/2016/04/Global-Catastrophic-Risk-Annual-Report-2016-FINAL.pdf

adrian_jonklaas

Aspiring Author and Entrepreneur.
