It’s not just lack of empathy that makes psychopaths act as they do, but also what looks to me like a particularly strong drive for domination. They simply get more emotional rewards out of exerting power over other people, in ways that range from winning socially acceptable status contests to torturing and butchering other humans. Without this drive for domination, they wouldn’t be half as dangerous.
Just pointing out that it’s the presence rather than the absence of a feature that causes one to be actively evil, not just selfish and calculating. Merely self-interested rational agents would stop at callously pursuing whatever their, er, utility function tells them to. They wouldn’t go that extra mile to satisfy a purely emotional need. To exhibit psychopathic behavior—to play mind games with people, to break laws and to engage in power contests even when you don’t have anything rational to gain, just for the thrill of it—well, you need to be able to feel the thrill of it. An extra feature.
As for programming emotion into an AI, I wouldn’t know about that. I have the vague intuition that emotions are a bit of a kludgey solution to morality: our emotional system does a decent job of getting morality right some of the time, but not a great job and not all of the time. A different emotional system, designed from scratch and checked for coherence, might perform better, though I don’t have the qualifications to express an opinion one way or the other.
So you’re saying that their goals include an explicit urge for dominance, one that is absent (or weakened) in ‘normal’ human beings? I suppose it sounds plausible, but I’d like to see some references.
Most people have it to a small extent (it’s a feature present all across the animal kingdom, after all), but in psychopaths it is greatly heightened.
Sorry, no references. Just a speculation that seems strongly consistent with what I know so far about them.