These don’t seem like very relevant counterarguments; I think literally all of them are from people who believe that AGI is an extinction-level threat our civilization will soon face.
Perhaps you mean “>50% chance of extinction-level bad outcomes”, but I think the alternative viewpoint that would actually calm someone is not “the probability is only 20% or so”, but rather “this is not an extinction-level threat and we don’t need to be worried about it”, for which I have seen no good argument that engages seriously with any misalignment concerns.
Well, I was asking because I found Yudkowsky’s model of AI doom far more complete than any other model of the long-term consequences of AI. So the point of my original question is “how frequently is a model that is far more complete than its competitors wrong?”.
But yeah, even something as low as a 1% chance of doom demands a very large amount of attention from the human race (similar to the amount of attention we have assigned to the possibility of nuclear war).
(That said, I do think the specific value of p(doom) is very important when deciding which actions to take, because it affects the strategic considerations in the “play to your outs” post.)
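To make the 1% point above concrete, here is a minimal back-of-the-envelope expected-value sketch (the world-population figure of roughly 8 billion is an illustrative assumption on my part, not a number from this thread):

$$
\mathbb{E}[\text{lives lost}] \;=\; p(\text{doom}) \times N \;\approx\; 0.01 \times 8 \times 10^{9} \;=\; 8 \times 10^{7}
$$

That is on the order of 80 million expected deaths even before counting the loss of humanity’s entire future, which is why the comparison to the attention given to nuclear war seems apt.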