The Wikipedia page on “global catastrophic risks” offers the following definition: “A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk.” The Global Challenges Foundation offers another: any risk that could eliminate at least 10% of the human population. No points for guessing what percentage of the population dies in an existential risk.
Arthur Raind is particularly concerned about four risks by 2030, two existential and two catastrophic: 1) super-intelligent AI, 2) engineered pandemics, 3) severe climate change, and 4) food scarcity caused by ocean ecology collapse.
What do you think are the most significant risks that humankind needs to prepare for over the next 15 years?