This nonlinearity also seems strange to have, without also accepting quantum-immortality-type arguments. In particular, you only need to bargain for UFAIs to kill all humans painlessly and instantaneously; and then you just simulate those same humans yourself. (And if you want to save on compute, you can flip quantum coins for a bit.) Maybe it makes sense to have this nonlinearity but not accept this—I’d be curious to see what that position looks like.