Let’s say I have a utility function and a finite map from actions to utilities. (Actions are things like moving a muscle or writing a bit to memory, so there’s a finite number.)
One day, the utility of all actions becomes the same. What do I do? Well, unlike Asimov’s robots, I won’t self-destructively try to do everything at once. I’ll just pick an action randomly.
The result is that I move in random ways and mumble gibberish. Although this is perfectly voluntary, it bears an uncanny resemblance to a seizure.
Regardless of what else is in a machine with such a utility function, it will never surpass the standard of intelligence set by jellyfish.
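To make the failure mode concrete, here’s a minimal sketch of the selection rule I have in mind (the function name and the actions are made up for illustration, not taken from any real agent): pick an action of maximal utility, breaking ties uniformly at random. When every utility is equal, every action ties, so the agent just twitches at random.

```python
import random

def choose_action(utilities: dict[str, float]) -> str:
    """Pick a maximal-utility action, breaking ties uniformly at random."""
    best = max(utilities.values())
    best_actions = [a for a, u in utilities.items() if u == best]
    return random.choice(best_actions)  # ties -> uniform random pick

# Normal case: one action clearly wins, so behavior is deterministic.
print(choose_action({"move_arm": 0.2, "write_bit": 0.9, "mumble": 0.1}))

# Degenerate case: all utilities are equal, so every call is a uniform
# draw over the whole action set -- random movement and gibberish.
print(choose_action({"move_arm": 0.5, "write_bit": 0.5, "mumble": 0.5}))
```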