I think this is a great metaphor for what a boxed AI would be able to do to the gatekeeper.
http://lesswrong.com/lw/qk/that_alien_message/
That one is a good read, but far too noisy, wordy, and convoluted for the simple point the OP is making.
Ironically, it’s what a programmer does to an AI when stepping through it with a debugger.
I feel like a new term should be coined: “Philosopher’s AI,” in the spirit of the philosopher’s stone or philosophical mercury.