I don’t know! I’ve certainly seen people say P(doom) is 1, or extremely close. And anyway, bombing an AI lab wouldn’t stop progress, but it would slow it down—and if you think there’s any chance alignment gets solved, the more time you buy, the better.
If you think P(doom) is 1, you probably don’t believe that terrorist bombing of anything would do enough damage to be useful. That is probably one of EY’s cruxes on violence.