feelings didn’t necessarily map to reality, no matter how real they felt
But they do map to reality, just not perfectly. “I see a red stripe” approximately maps to some brain activity. Sure, feelings about them being different things may be wrong, but “illusionism about everything except physicalism” is just restating physicalism without any additional argument. So what feelings are you illusionistic about?
All of them.
Some do happen to map (partially) to reality, but the key here is that there is no obligation for them to do so, and there’s nothing which guarantees that to be true.
In short, what I believe may or may not map to reality at all. Everything we “feel” is a side effect, an emergent behavior of the particles and forces at the bottom. It’s entirely an illusion.
That doesn’t mean there isn’t something going on, and that doesn’t mean that feelings don’t exist, any more than claiming bullets or trees don’t exist. But they’re no more a primitive of the universe than “bullet” or “tree” is: bullets and trees are loose collections of particles and forces we’ve decided to slap labels on; feelings, opinions, and ideas are more conceptual, but still end up being represented and generated by particles and forces. They’re no more real than trees or bullets are, though the labels are useful to us.
If you don’t have philosophical issues with trees, you shouldn’t have them with consciousness.
I appreciate the difference between absolute certainty and allowing the possibility of error, but as a matter of terminology, “illusion” usually refers to things that are wrong, not merely things that may be wrong. Words don’t matter that much, of course, but I’m still interested in which intuitions about consciousness you consider to probably not correspond to reality at all. For example, what do you do with the intuition underlying the zombie argument:
1. Would you say the statement “we live in a non-zombie world” is true?
2. Or is the entire setup contradictory, because consciousness is a label for some arbitrary structure/algorithm and it was specified that the structures match in both worlds?
3. Or do you completely throw away the intuition about consciousness as not useful?
From what you said I guess it’s 2 (which, by the way, implies that whether you / you-from-yesterday / LUT-you / dust-theoretic copies of you / dogs feel pain is a matter of preference), so the next question is: what evidence is there for the conclusion that the intuition about consciousness can’t map to anything other than an algorithm in the brain? It can’t map to something magical, but what if there is some part of reality that this intuition corresponds to?
When I first heard about p-zombies 10+ years ago, I thought the idea was stupid. I still think the idea is stupid. Depending on how you define the words, we could all be p-zombies; or we could all not be p-zombies. Regardless of how we define the words though, we’re large collections of particles and forces operating on very simple rules, and when you look at it from that standpoint, the question dissolves.
Basically yes: the p-zombie thought experiment is broken because it hinges on label definitions and ignores the fact that we have observation and evidence we can fall back on for a much, much more accurate picture (which isn’t well represented by any single word in our language).
Intuition about consciousness is useful in the same way that intuition about quantum mechanics and general relativity is useful: for most people, basically not at all, or only in very limited regimes. Keep in mind that human intuition is no more complicated than a trained neural net trying to make a prediction about something. It can be close to right, it can be mostly wrong, it can be entirely wrong. Most people have good intuition/prediction about whether the sun will rise tomorrow; most people have bad intuition/prediction about how two charged particles in a square potential well will behave. And IMO, most people have a mistaken intuition/prediction that consciousness is somehow real and supernatural and beyond what physics can tell us. Those people would be in the ‘wrong’ bucket.
Regarding “what evidence is there for the conclusion that the intuition about consciousness can’t map to anything other than an algorithm in the brain?”: I would posit as evidence the fact that people have intuitions about all kinds of completely ridiculous and crazy stuff that doesn’t make sense. I can see no reason why intuition about consciousness must somehow always be coherent, when so many people have intuitions that don’t even remotely match reality (and/or are inconsistent with each other or themselves).
Regarding “what if there is some part of reality that this intuition corresponds to?”: I don’t understand what you’re trying to drive at here. We call intuitions like that “testable”, and upon passing those tests, we call them “likely to model or represent reality in some way”.
Hmm, I’m not actually sure about quantifying the ratio of crazy to predictive intuitions (especially if we generalize to include perception) to arrive at a low prior for intuitions. The way I see it, if everyone had an interactive map of Haiti in the corner of their vision, we should try to understand how it works and find what it corresponds to in reality, not immediately dismiss it. Hence the question about which specific parts of consciousness are illusory.
Anyway, I think the intuition about consciousness does correspond to a part of reality: to the “reality” part. I.e. panpsychism is true, and the zombie thought experiment illustrates the difference between the real world and a world that does not exist. It doesn’t involve additional primitives, because physical theories already include reality, and it diverges from the intuition about consciousness in unsurprising ways (like the intuition being too anthropocentric).