I mean something morally meaningful. I don’t think a chess computer suffers when it loses a game, no matter how sophisticated. I expect that self-driving cars are programmed to try to avoid accidents even when other drivers drive badly, but I don’t think they suffer if you crash into them.
Yeah, if by “suffering” you mean “nociception I care about”, it sure is human-specific.
I’d find this more informative if you explicitly addressed my examples?
Well, I wouldn’t usually call the thing that a chess computer or a self-driving car is minimizing “suffering” (though I could, if I felt like using more anthropomorphizing language than usual). But I’m confused by this, because I have no problem using that word to refer to a sensation felt by a chimp, a dog, or even an insect, and I’m not sure what it is that an insect has and a chess computer lacks that causes this intuition of mine. Maybe the fact that we share a common ancestor, and our nociception capabilities are synapomorphic with each other… but then I think even non-evolutionists would agree a dog can suffer, so it must be something else.