What you’re referring to is a problem I’ve been thinking about and chipping away at for some time; I’ve even had some discussions about it here and people have generally been receptive. Maybe the reason you’re being downvoted is that you’re using the word ‘human’ to mean ‘good’.
The core issue is that humans have empathy, and by this we mean that other people’s utility functions matter to us. More precisely, our perception of other people’s utility forms a component of our own utility that is independent of any direct benefit to us.
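As a toy sketch of that decomposition (the function, the weights, and the numbers are purely illustrative assumptions, not anything from this discussion):

```python
def empathic_utility(own_payoff, perceived_others, empathy_weights):
    """Toy model: total utility = own payoff plus a weighted sum
    of the utility we perceive other agents to receive."""
    return own_payoff + sum(
        w * u for w, u in zip(empathy_weights, perceived_others)
    )

# An agent who gains 10 directly, sees a friend gain 4 and a
# stranger lose 2, weighted at 0.5 and 0.1 respectively:
print(empathic_utility(10, [4, -2], [0.5, 0.1]))  # 11.8
```

Set all the weights to zero and you recover a purely self-interested agent.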
Our empathy extends not only to other humans, but also to animals and perhaps even robots.
So what are examples of human beings who lack empathy? Lacking empathy is essentially the definition of psychopathy. And, indeed, some psychopaths (not all, but some) have been violent criminals who have, for example, killed babies for money or tortured people for amusement.
So you’re essentially right that a game-theoretic model in which the players have no model of each other’s utility functions will exhibit aspects of psychopathy and ‘inhumanity’.
But that doesn’t mean game theory is wrong or ‘inhuman’! All it means is that the ‘empathy’ ingredient is missing. It also means that it would not be a good idea to build an AI without empathy. That’s exactly the problem CEV attempts to solve: CEV is basically a crude attempt to instill empathy in a machine.
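To make the missing ingredient concrete, here is a minimal sketch (the payoff matrix is the standard Prisoner’s Dilemma; the empathy weights are assumptions for illustration) of how an empathy term changes which strategy dominates:

```python
# Standard Prisoner's Dilemma payoffs: (my_payoff, their_payoff).
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate
    ("C", "D"): (0, 5),  # I cooperate, they defect
    ("D", "C"): (5, 0),  # I defect, they cooperate
    ("D", "D"): (1, 1),  # both defect
}

def utility(my_move, their_move, empathy=0.0):
    """My effective utility: own payoff plus `empathy` times
    the other player's payoff."""
    mine, theirs = PAYOFFS[(my_move, their_move)]
    return mine + empathy * theirs

for empathy in (0.0, 1.0):
    print(f"empathy weight = {empathy}")
    for their_move in ("C", "D"):
        best = max(("C", "D"),
                   key=lambda m: utility(m, their_move, empathy))
        print(f"  best reply to {their_move}: {best}")
```

With an empathy weight of zero, defecting is the best reply to everything, which is exactly the ‘psychopathic’ behavior described above; with a weight of one, cooperating dominates instead. The game-theoretic machinery is identical in both cases; only the utility functions differ.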
Yes, that was what I was getting at. As I said elsewhere: game theory is not evil. It’s just horrifyingly neutral. I am not using ‘inhuman’ to mean ‘bad’; I am using it to mean ‘unfriendly’.
Then you must be horrified by all science.