I suspect there’s a fair amount of handwaving involved, but as I understand it, the common belief is that a “true” isomorphism does include everything necessary for subjective experience. A TON of weight gets put on the idea of the “relevant pieces” of the brain/CNS, and IMO there’s a serious disconnect between how vague that notion is and the confidence of those who claim it’s anywhere near feasible today.
You’ll have to define “conscious” and “subjective experience” more operationally before anyone can even guess which parts of the computation are relevant to those things. It does seem very likely that everything is physics, and that a sufficiently accurate simulation will have all the properties of the original. Note the circularity in “sufficiently” there: accurate enough to preserve the properties is defined as whatever preserves the properties.
Keep in mind that intuition is a bad guide for what a subjective experience even is. We have zero positive or negative examples outside of the very limited singleton that is ourselves. There’s nothing to have trained or evolved an intuition on.
I think any operational definition of subjective experience would vacuously be preserved by an isomorphism, by definition of an isomorphism. But if your mind ever gets uploaded, and you see/remember this conversation and feel that you are self-aware in any capacity, that would falsify the claim that mind uploads don’t have subjective experience.
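To make that vacuousness concrete, here’s a minimal sketch in Python. Everything in it is invented for illustration (the transition table `step`, the relabeling `phi`, and the probe `reports_self_awareness` are hypothetical stand-ins, not anyone’s actual model of a mind): any test defined purely over inputs and outputs yields the same verdict for a system and for an isomorphic relabeling of it, by construction.

    # Toy illustration, not a model of a brain: a "system" is a
    # transition table (state, input) -> (next_state, output).
    step = {
        ("s0", "ping"): ("s1", "yes"),
        ("s1", "ping"): ("s1", "yes"),
    }

    # An isomorphism here is just a bijective relabeling of states,
    # applied consistently to the transition table.
    phi = {"s0": "t0", "s1": "t1"}
    step_copy = {
        (phi[s], inp): (phi[s2], out)
        for (s, inp), (s2, out) in step.items()
    }

    def reports_self_awareness(transition, start):
        """A purely operational test: does the system answer "yes" to a probe?"""
        _, out = transition[(start, "ping")]
        return out == "yes"

    # The relabeled copy passes the test iff the original does,
    # because the test never looks inside the states.
    assert reports_self_awareness(step, "s0") == \
           reports_self_awareness(step_copy, phi["s0"])

The point isn’t that this toy captures a mind; it’s that no operational test can distinguish the two sides of an isomorphism, so “preserved under isomorphism” holds by definition rather than by experiment.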
Right, that vacuousness is what I was trying to point out. If there is no consciousness, then “the relevant pieces of the brain/CNS” have not been copied/simulated. It’s a matter of how “relevant pieces” and “isomorphism” are defined, not an empirical question.
And really, until you can prove to yourself that I have (or do not have) subjective experience, and until you can answer the question of whether a goldfish or an LLM has subjective experiences, it’s either meaningless or unknowable. And that’s just whether a simulation HAS subjective experiences. Whether they’re the SAME type of subjective experiences as a given biological source clump of matter is a whole new level of measurements you have to invent.