What information about cat brains could I possibly learn that would make me classify them as “non-persons”?
Do you value conscious experience in yourself more than unconscious perception with roughly the same resulting external behavior? Then it is conceivable that empathy is mistaken about what kind of system is receiving inputs in the cat’s case, and that there is at least some difference in value depending on the internal organization of the cat’s brain.
I’m struggling to think of a good example of this? Usually conscious experience causes at least one difference in external behavior, namely that I might tell you about it if you ask me. Cats can’t talk, which does affect my attitude towards cats, but I don’t think my empathy somehow fails to take that into account?
But you don’t value conscious experience just because you can report it, right? Or at least, you don’t value it in proportion to external behavior. Then that’s another intuition about personhood that you will need to include, so you’ll interpolate from “conscious parts of me—person”, “unconscious parts of me—non-person”, “rock—non-person”, and may decide that cats are more like the unconscious parts of you.
I object to the classification “conscious parts of me—person”, “unconscious parts of me—non-person”. I think that personhood is more like a collective property of the whole than residing in just the “conscious parts”. And, I don’t think my caring-about-myself is pointing towards only the “conscious parts”. I agree that cats might lack a part that humans have which has something to do with consciousness (with the important caveat that “consciousness” is an ill-defined term that probably refers to different things in different contexts), and this probably reduces the amount I care about them, but it still leaves a lot of me-caring-about-them.
So something like “humans: 1.5”, “cats: 1.0”, “rocks: 0.0” instead of “1.0, 0.0, 0.0”? Ok then, that sounds consistent. Someone might object that we call caring about non-conscious stuff “aesthetic preferences”, but I don’t see how caring about a cat’s inner life, as usually expressed by its behaviour, is any different.