Good discussion. I don’t think anyone (certainly not me) is arguing that consciousness isn’t a physical thing (“real”, in that sense). I’m arguing that “consciousness” may not be a coherent category, in the same sense that, long ago, dolphins and whales were considered to be “fish” and only later more fully understood to be marine mammals. Nobody EVER thought they weren’t real, only that the category was wrong.
Same with the orbiting rock called “Pluto”: nobody sane has claimed it isn’t real; it’s just that some believe it isn’t a planet. “Fish” and “planet” are not real, although every instance of them is real. In fact, many things that are incorrectly thought to be them are real as well. It’s not about “real”, it’s about modeling and categorization.
“Consciousness” is similar: it’s not a real thing, though every instance that’s categorized (and miscategorized) that way is real. There’s no underlying truth or mechanism for resolving the categorization of observable matter as “conscious” versus “behavior, but not conscious”; it’s just an agreement among taxonomists.
(note: personally, I find it easiest to categorize most complex behavior in brains as “conscious”. I don’t actually know how it feels to be them, and don’t REALLY know that they self-model in any way I could understand, but it’s a fine simplification to make for my own modeling. I can’t claim that this is objectively true, and I can’t even design theoretical tests that would distinguish it from other theories. In this way, it’s similar to the MWI vs. Copenhagen interpretations of QM: there’s no testable distinction, so use whichever one fits your needs best.)
Yeah, the problem is with the external boundaries and the internal classification of “consciousness”.
I have first-hand access to my own consciousness. I can assume that others have something similar, because we are biologically similar, but even this kind of reasoning is suspect, because we already know there are huge differences between people: people in a coma are biologically quite similar to people who are awake, and there are autistic people, psychopaths, and people who hallucinate. If there were correspondingly huge differences in the quality of consciousness, as a result of this or something else, how would we know?
And then there is the problem of those for whom we can’t reason by biological similarity at all: animals and AIs.