No AGI research org has enough evil to play it that way.
We shouldn’t just assume this, though. Power corrupts. Suppose that you are the CEO of an AI company, and you want to use the AGI your company is developing to fulfill your preferences and not anyone else’s. Sit down and think for a few minutes about what obstacles you would face, and how you as a very clever person might try to overcome or subvert those obstacles.
Sit down and think for a few minutes about what obstacles you would face
I’ve thought about it a little bit, and it was creepy enough that I don’t think a person would want to keep pursuing these thoughts. It would make them feel dirty and a little unsafe, because they know that the government, or the engineers they depend on, have the power to totally destroy them if they were caught even exploring those ideas. And doing these things without tipping off the engineers you depend on is extremely difficult, maybe even impossible given the culture we have.