I don’t know! I’ve certainly seen people say P(doom) is 1, or extremely close. And anyway, bombing an AI lab wouldn’t stop progress, but would slow it down—and if you think there is a chance alignment will be solved, the more time you buy the better.
If you think P(doom) is 1, you probably don’t believe that terrorist bombing of anything will do enough damage to be useful. That is probably one of EY’s cruxes on violence.
But when you say extinction will be more likely, you must believe that the probability of extinction is not 1.
Well… Yeah? Would any of us care to build knowledge that improves our odds if our odds were immovably terrible?