The overall risk was 9.2% for the community forecast (with 7.3% for AI risk). To convert this to a forecast for existential risk (100% dead), I assumed 6% risk from AI, 1% from nuclear war, and 0.4% from biological risk.
I think this implies you think:
AI is ~4 or 5 times (6% vs 1.3%) as likely to kill 100% of people as to kill between 95 and 100% of people
Everything other than AI is roughly as likely (1.5% vs 1.4%) to kill 100% of people as to kill between 95% and 100% of people
Does that sound right to you? And if so, what was your reasoning?
I ask out of curiosity, not because I disagree. I don’t have a strong view here, except perhaps that AI is the risk with the highest ratio of “chance it causes outright extinction” to “chance it causes major carnage” (and this seems to align with your views).
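For concreteness, here is a minimal sketch of the arithmetic behind the AI comparison above, assuming 7.3% is the forecast chance that AI kills more than 95% of people and 6% is the assumed chance it kills everyone (the non-AI comparison would presumably come from the same kind of subtraction applied to the post's other figures; the variable names are mine).

```python
# Arithmetic behind the AI comparison, using the quoted figures:
# 7.3% chance AI kills >95% of people, 6% assumed chance it kills 100%.
p_ai_over_95 = 0.073  # community forecast: AI kills more than 95% of people
p_ai_all = 0.060      # assumed: AI kills 100% of people

# Implied chance AI kills between 95% and 100% (but not everyone):
p_ai_95_to_100 = p_ai_over_95 - p_ai_all  # ~0.013, i.e. 1.3%

# Ratio of "kills everyone" to "kills between 95% and 100%":
ratio = p_ai_all / p_ai_95_to_100  # ~4.6, i.e. "~4 or 5 times"

print(f"{p_ai_95_to_100:.3f}, {ratio:.1f}")
```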
For AI, the Metaculus community forecast puts the chance of >95% dead (7.5%) close to the chance of >10% dead (9.7%). Based on this and my own intuition about how AI risks “scale”, I extrapolated to 6% for 100% dead. For biological risk and nuclear war, the community forecasts show a much bigger drop-off from >10% dead to >95% dead. It’s hard to say what to infer from that about the 100% case: there are good arguments that 100% is unlikely from both, but some of those arguments would also cut against >95%. I didn’t examine this carefully, so take all these numbers with a grain of salt.
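As a small illustration, one way to read the extrapolation from the >10% and >95% AI figures to ~6% for 100% dead is to apply roughly the same conditional ratio again at the next, more extreme threshold. This is only a sketch under that assumption, not the exact reasoning above, which is explicitly partly intuition.

```python
# One possible reading of the extrapolation to ~6% for 100% dead from AI,
# assuming the conditional probability of each more extreme outcome is
# roughly constant. Illustrative only, not the exact reasoning above.
p_ai_over_10 = 0.097  # community forecast: AI kills more than 10% of people
p_ai_over_95 = 0.075  # community forecast: AI kills more than 95% of people

cond = p_ai_over_95 / p_ai_over_10  # ~0.77: P(>95% dead | >10% dead)
p_ai_all = p_ai_over_95 * cond      # ~0.058, close to the 6% used above

print(f"{cond:.2f}, {p_ai_all:.3f}")
```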