True—though maybe some consider “major catastrophe causing the collapse of civilization as we know it” as falling under existential risk, even if it would take much more than that to actually put mankind in danger.
I wonder if Demski would actually give a high probability of human extinction because of global warming, or whether it’s just that he used a broad interpretation of “existential risk”.
It seems like a very reasonable position that global warming is more likely to cause massive deaths than AI, but AI is more likely to exterminate mankind than global warming.
The term “existential risks” is in the question being asked. I think it should count as context.
Yeah, I have to admit that when I wrote that, I meant "lower on my list of concerns for the next century".
Global warming is surely fluff—even reglaciation poses a bigger risk.