I grant moral weight to my pet dog merely based on it being conscious, even though it is completely inhuman.
Is it completely inhuman though? Dogs are mammals, and therefore our evolutionary cousins; they have a lot in common with humans. They're also naturally social, like their wolf ancestors, and, unlike wolves, have co-evolved with humans in a mutualistic relationship.
Would you grant as much moral weight to a pet tarantula?
It's solitary, less intelligent, and far more distantly related to us than a dog. It's not clear whether it's conscious at all, and even if it is, it's almost certainly not self-aware the way a human is. It seems to me that most of its moral weight comes from its being a pet: a human cares about it, much as we might grant moral weight to an object of sentimental value, not because the object has inherent value but because a human cares about it.
If it weren't a pet, and were big enough to eat you, would you have the slightest compunction about killing it first?
An AI indifferent to human values would be even more alien than the giant spider. For game-theoretic and deontological reasons (moral consideration functions partly as reciprocity, and reciprocity has no purchase on an entity that will not honor it), I claim that a moral agent has a far stronger claim to being a moral patient than an inherently hostile entity does, even if the latter is technically conscious.