The closest thing we have in real life to the ‘rational agent’ concept of game theory and artificial intelligence is the psychopath. Taking this idea further, it’s easy to see why a rational superintelligence would become a UFAI: it is a psychopath.
This doesn’t quite seem right, and here is why: my utility function takes other people’s utility functions into account, so acting rationally and maximizing my utility function leaves room for empathy toward others. You only get psychopathy if the rational agent’s utility function is itself psychopathic, and most people’s utility functions are not.
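To make this concrete, here is a minimal sketch in Python (the payoff matrix and the empathy weight w are my own illustrative assumptions, not anything from the discussion): the same rational best-response machinery defects in a one-shot prisoner’s dilemma when the other player’s payoff gets zero weight, and cooperates once it is weighted in. Rationality does the maximizing; the utility function decides whether the result looks psychopathic.

```python
# Minimal sketch: a rational maximizer is only "psychopathic" if its
# utility function ignores the other agent. The payoff numbers and the
# empathy weight w are illustrative assumptions.

# One-shot prisoner's dilemma payoffs: (my_payoff, their_payoff)
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def utility(my_move: str, their_move: str, w: float) -> float:
    """My payoff plus w times the other player's payoff.
    w = 0 is the empathy-free agent; w > 0 weights the other's welfare."""
    mine, theirs = PAYOFFS[(my_move, their_move)]
    return mine + w * theirs

def best_response(their_move: str, w: float) -> str:
    """Rationally maximize utility, holding the other player's move fixed."""
    return max(("C", "D"), key=lambda m: utility(m, their_move, w))

for w in (0.0, 1.0):
    print(f"w={w}: best response to C is {best_response('C', w)}")
# w=0.0: D (5 > 3) -- the 'psychopathic' maximizer
# w=1.0: C (3 + 3 = 6 > 5 + 0 = 5) -- still rational, now empathic
```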
Yes, this is what I said. Usually in game-theory setups, though, empathy is not included in the utility function. That’s what I meant; sorry if it was unclear. You’re right that an agent can be rational and empathic at the same time.
Then the game-theory experiments that leave no room for empathy are straw Vulcans.