Does that change if I know the 999 copies in the blue rooms are exactly identical computer simulations (the same code executing the same way)?
My first impulse is to say that it would make no difference. On the macroscopic level, distinct is distinct, and you can’t make macroscopic objects (like two different brains or CPUs) exactly identical. Even if the software running is exactly the same, it will still be subject to minor variations in timing, hardware state, and so on.
On the other hand, this is where it gets iffy: we can do crazy things with software that we can’t do with brains. What if the code for Simulated Sleeping Beauty contains parts that split and rejoin threads? The copying thought experiment already violates one of our intuitive assumptions about personal identity: namely, that it’s unique. Interactions between different copies also violate our intuition that it’s independent of other minds.
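To make the split-and-rejoin idea concrete, here is a minimal sketch in Python. Everything in it is my own invention for illustration: the names MindState, mind_step, split, and rejoin are hypothetical, and nothing here comes from an actual Sleeping Beauty implementation. The point is that a deterministic state can be forked into many bit-identical copies, run concurrently, and merged back, and the merge is only well-defined because the copies never diverged.

```python
import copy
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a simulated mind's state: just a counter here.
class MindState:
    def __init__(self, thoughts=0):
        self.thoughts = thoughts

def mind_step(state):
    """Run one deterministic 'thought'. Same input state -> same output state."""
    state.thoughts += 1
    return state

def split(state, n):
    """Fork the mind into n exactly identical copies."""
    return [copy.deepcopy(state) for _ in range(n)]

def rejoin(states):
    """Merge copies back into one. Only coherent if they never diverged."""
    assert all(s.thoughts == states[0].thoughts for s in states), \
        "copies diverged; rejoining is no longer well-defined"
    return states[0]

beauty = MindState()
copies = split(beauty, 1000)          # 1000 identical simulations
with ThreadPoolExecutor() as pool:    # run them concurrently
    copies = list(pool.map(mind_step, copies))
beauty = rejoin(copies)               # fuse the threads back together
print(beauty.thoughts)                # -> 1, as if nothing had split
```

Notice that whether this counts as one mind or a thousand is purely a bookkeeping choice: the rejoin is trivial precisely because the copies stayed identical, which is exactly the situation brains can never be in.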
This reminds me of the story in Where Physics Meets Experience:

Yu’el nods. “I can even see some of the troubles myself. Suppose you split brains only a short distance apart from each other, so that they could, in principle, be fused back together again? What if there was an Ebborian with a brain thick enough to be split into a million parts, and the parts could then re-unite? Even if it’s not biologically possible, we could do it with a computer-based mind, someday. Now, suppose you split me into 500,000 brains who woke up in green rooms, and 3 much thicker brains who woke up in red rooms. I would surely anticipate seeing the green room. But most of me who see the green room will see nearly the same thing—different in tiny details, perhaps, enough to differentiate our experience, but such details are soon forgotten. So now suppose that my 500,000 green selves are reunited into one Ebborian, and my 3 red selves are reunited into one Ebborian. Have I just sent nearly all of my ‘subjective probability’ into the green future self, even though it is now only one of two? With only a little more work, you can see how a temporary expenditure of computing power, or a nicely refined brain-splitter and a dose of anesthesia, would let you have a high subjective probability of winning any lottery. At least any lottery that involved splitting you into pieces.”
De’da furrows his eyes. “So have you not just proved your own theory to be nonsense?”
“I’m not sure,” says Yu’el. “At this point, I’m not even sure the conclusion is wrong.”
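As a check on the arithmetic in Yu’el’s scenario, here is a small sketch, assuming the naive rule of counting observer-copies; nothing in it comes from the original post beyond the 500,000/3 split. It just makes the before/after tension explicit.

```python
from fractions import Fraction

green_copies, red_copies = 500_000, 3

# Subjective probability by counting copies, before the two merges:
p_green_before = Fraction(green_copies, green_copies + red_copies)
print(p_green_before)   # 500000/500003, about 0.999994

# After merging all green copies into one Ebborian and all red copies
# into another, the same counting rule now sees only two observers:
p_green_after = Fraction(1, 2)
print(p_green_after)    # 1/2
```

The puzzle is which of these two numbers Yu’el should have anticipated: the merge changes the count without changing anything he experienced.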
Yes, the Ebborians were, among other things, one of the inspirations for this post. I just didn’t see these particular thought experiments raised there.