Of course. If consciousness is computation, then I expect that if my mind’s computation is simulated on a Turing machine, half the time my next experience will be of finding myself inside the machine.
With a little ingenuity, and as long as we’re prepared to tolerate ridiculously impractical thought experiments, we could think up a scenario where more and more of your brain’s computational activity is delegated to a computer until the computer is doing all of the work. It doesn’t seem plausible that this would somehow cause your conscious experience to progressively fade away without you noticing.
Then we could imagine repeatedly switching the input/output connections of the simulated brain between your actual body and an ‘avatar’ in a simulated world. It doesn’t seem plausible that this would cause your conscious experience to keep switching on and off without you noticing.
The linked essay is a bit long for me to read right now, but I promise to do so within the weekend.
As to your particular example, the problem is that I can also think up an even more ridiculously impractical thought experiment: one in which more and more of that computer’s computational activity is in turn delegated to a group of abacus-using monks—and then it doesn’t seem plausible for my conscious experience to keep on persisting once the monks end up being the ones doing all the work...
It’s the bullet I’m not yet prepared to bite—but if I do end up doing so, despite all my intuition telling me no, that’ll be the point where I’ll also have to believe Tegmark IV. P(Tegmark IV|consciousness can persist in the manipulations of abaci)~=99% for me...
Some of Chalmers’ ideas concerning ‘fading and dancing qualia’ may be relevant here.