It’s been asserted here that “the core distinction between avoidance, pain, and awareness of pain works” and that “there is such a thing as bodily pain we’re not consciously aware of”. This, I think, blurs and confuses the most important distinction there is in the world: the one between what is a conscious/mental state and what is not. Talk of “sub-conscious/non-conscious mental states” confuses things further: if a state is not conscious, then it is not a mental state. It might cause a mental state or be caused by one, but it is not itself one.
Regarding the concept of “being aware of being in pain”: I can understand it as referring to a second-order mental state, a thought with the content that an unpleasant mental state (pain) is going on. In that sense, it often happens that I am not (second-order) aware of my stream of consciousness because “I” am totally immersed in it, so to speak. But the absence of second-order mental states does not change the fact that first-order mental states exist and that it feels like something (and feels good or bad) to be in them (or rather: to be them). The claim that “no creature was ever aware of being in pain” suggests that for most non-human animals, it doesn’t feel like anything to be in pain and that such pain-states are therefore ethically insignificant. As I said, I reject the notion of “pain that doesn’t consciously feel like anything” as confused: if it doesn’t feel like anything, it isn’t a mental state and it can’t be pain. And there is no reason to believe that first-order mental states (which may be painful and thus ethically significant) require second-order awareness. At the very least, we should give non-human animals the benefit of the doubt and assign a significant probability to their brain states being mental, possibly painful, and thus ethically significant.
Last but not least, there is also an argument (advanced, e.g., by Richard Dawkins) to the effect that pain intensity and frequency might even be greater in less intelligent creatures: “Isn’t it plausible that a clever species such as our own might need less pain, precisely because we are capable of intelligently working out what is good for us, and what damaging events we should avoid? Isn’t it plausible that an unintelligent species might need a massive wallop of pain, to drive home a lesson that we can learn with less powerful inducement? At very least, I conclude that we have no general reason to think that non-human animals feel pain less acutely than we do, and we should in any case give them the benefit of the doubt.”