The short answer is that they’re nice people, and they understand that power corrupts, so they can’t even rationalize wanting to be king of the universe for altruistic reasons.
Also, a post-Singularity future will probably (hopefully) be absolutely fantastic for everyone, so it doesn’t matter whether you selfishly get the AI to prefer you or not.