I agree the easy vs hard worlds influence the chance of AI taking over.
But are you also claiming it influences the badness of takeover conditional on it happening? (That’s the subject of my post)
I think it affects both, since alignment difficulty determines both the probability that the AI will have values that cause it to take over and the expected badness of those values conditional on takeover.