I think the standard argument that quantum states are not relevant to cognitive processes is the one in “The importance of quantum decoherence in brain processes”. This is enough to convince me that going through a classical teleporter or copying machine would preserve my identity, and in the case of a copying machine I would experience an equal subjective probability of coming out as the original or the copy. It also seems to strongly imply that mind uploading into some kind of classical artificial machine is possible, since it’s unlikely that all or even most of the classical properties of the brain are essential. I agree that there’s an open question about whether mind emulation on any arbitrary substrate (like, for instance, software running on CMOS computer chips) preserves identity even if it shows the same behavior as the original.
It also seems to strongly imply that mind uploading into some kind of classical artificial machine is possible, since it’s unlikely that all or even most of the classical properties of the brain are essential.
Could you say more about this? Why is this unlikely?
There generally seems to be a ton of arbitrary, path-dependent stuff everywhere in biology that evolution hasn’t yet optimized away, and I don’t see a reason to expect the brain’s implementation of consciousness to be an exception.
Agreed about its implementation of awareness, as opposed to being unaware but still existing. What about its implementation of existing, as opposed to nonexistence?
Based on this comment I guess by “existing” you mean phenomenal consciousness and by “awareness” you mean behavior? I think the set of brainlike things that have the same phenomenal consciousness as me is a subset of the brainlike things that have the same behavior as me.
Well I’d put it the other way round. I don’t know what phenomenal consciousness is unless it just means the bare fact of existence. I currently think the thing people call phenomenal consciousness is just “having realityfluid”.
in the case of a copying machine I would experience an equal subjective probability of coming out as the original or the copy
If you have a copying machine that is capable of outputting more than one (identical) copy, and you do the following:
first, copy yourself once
then, immediately afterwards, take that copy and copy it 9 times (for a total of 1 original and 10 copies)
Do you then expect a uniform 9.09% subjective probability of “coming out” of this process as any of the original + copies, or a 50% chance of coming out as the original and a 5% chance of coming out as any given copy?
If it’s immediate enough that all the copies end up indistinguishable, with the same memories of the copying process, then uniform, otherwise not uniform.
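For concreteness, here is a minimal sketch (my own illustration, not part of the original exchange) that just tabulates the two probability assignments being contrasted above: a uniform split over all eleven successors, versus a split that follows the branching order of the two copying steps.

```python
# Illustrative sketch (not from the thread above): the two candidate
# subjective-probability assignments for the two-stage copying scenario,
# assuming 1 original and 10 copies as described.
from fractions import Fraction

N = 11  # 1 original + 10 copies

# View 1: if all eleven successors end up indistinguishable (same memories
# of the copying process), spread the weight uniformly over them.
uniform = [Fraction(1, N)] * N  # each gets 1/11, about 9.09%

# View 2: if the branching order matters, weight halves at the first copy,
# and the copy's half is then split over its 10 eventual successors
# (the first copy plus its 9 further copies).
branching = [Fraction(1, 2)] + [Fraction(1, 2) * Fraction(1, 10)] * (N - 1)
# original: 1/2 (50%), each copy: 1/20 (5%)

assert sum(uniform) == 1 and sum(branching) == 1
print(uniform[0], branching[0], branching[1])  # 1/11 1/2 1/20
```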