People (realistically) believe that being the Gatekeeper is easy, and that being the AI is terribly hard (or impossible, before it was shown to simply be terribly hard in most cases).
Imagine, though, that we had a real transhuman/AI around to play with, or that we ourselves were transhuman. Would this paradigm then be inverted? Would everybody want to be the AI, with only the craftiest among us daring to be (or to pretend to be) the Gatekeeper?
If Eliezer’s claim is correct—that anyone can be convinced to let the AI out—then the true test of ability should be to play Gatekeeper. The AI’s position would be trivially easy.
… perhaps.