Or rather: with unlimited power, you would know immediately what to do, if unlimited power implies unlimited intelligence and unlimited knowledge by definition. If it doesn’t, I find the concept “unlimited power” poorly defined. How can you have unlimited power without unlimited intelligence and unlimited knowledge?
The entire point of this was an analogy for creating Friendly AI. The AI would have absurd amounts of power, but we have to decide what we want it to do using our limited human intelligence.
I suppose you could just ask the AI for more intelligence first, but even that isn’t a trivial problem. Would it be okay to alter your mind in a way that changes your personality or your values? Is it even possible to increase your intelligence without doing that? And there are tons of other issues in trying to specify such a goal precisely.