> then EY would be conceding that morality is a weakness, or at the very least that strength and strength alone will determine which AI will win.
I’m pretty sure he does believe that if an AI goes FOOM, it’s going to win, period, moral or not. The idea that a more moral AI would not merely be preferable, but would actually win against another AI on account of its morality, strikes me as, well, rather silly, and not at all in accordance with what I think Eliezer actually believes.