Actually, I’m an eliminativist about phenomenal states. I wouldn’t be completely surprised to learn that the illusion of phenomenal states is restricted to humans, but I don’t think this illusion is necessary for one to be a moral patient. Suppose we encountered an alien species whose computational substrate and architecture were so exotic that we couldn’t rightly call anything it experienced ‘pain’. Nonetheless, it might experience something sufficiently pain-like in its coarse-grained functional roles that we would be monsters to start torturing members of this species willy-nilly.
My views about non-human animals are similar. I suspect their psychological states are so exotic that we would never recognize them as pain, joy, sorrow, surprise, etc. (I’d guess this is more true for the positive states than the negative ones?) if we merely glimpsed their inner lives directly. But the similarity is nonetheless sufficient for our taking their alien mental lives seriously, at least in some cases.
So, I suspect that phenomenal pain as we know it is strongly tied to the evolution of abstract thought, complex self-models, and complex models of other minds. But I’m open to non-humans having experiences that aren’t technically pain but that are pain-like enough to count for moral purposes.
RobbBB, in what sense can phenomenal agony be an “illusion”? If your pain becomes so bad that abstract thought is impossible, does your agony—or the “illusion of agony”—somehow stop? The same genes, same neurotransmitters, same anatomical pathways, and same behavioural responses to noxious stimuli are found in humans and the nonhuman animals in our factory farms. A reasonable (but unproven) inference is that factory-farmed nonhumans endure misery—or the “illusion of misery”, as the eliminativist puts it—just as abused human infants and toddlers do.
But I’m open to non-humans having experiences that aren’t technically pain but that are pain-like enough to count for moral purposes.
I guess maybe I just didn’t understand how you were using the term “pain”—I agree that other species will feel things differently, but being “pain-like enough to count for moral purposes” seems to be the relevant criterion here.