Forked Russian Roulette and Anticipation of Survival
This is a thought exercise I came up with on IRC to probe the iffiness of “freezing yourself for a thousand years” with regard to continuity of self.
Let’s say we live in a post-singularity world as uploads and are pretty bored and always up for terrible entertainment (our god is FAI, but it has a scary sense of humor…). So some crazy person creates a very peculiar black cube in our shared reality. You walk into it, a fork of you is created, and the two of you duke it out via Russian roulette. The winner walks out the other side.
Before entering, should you accurately anticipate dying with 50% probability?
I argued that you should anticipate surviving with 100% probability, since the single you who walks out of the box will turn out to have predicted correctly. Surprisingly, someone disagreed.
So I extended the scenario with a second black box with two doors, but this one is just a tunnel. In this case, everyone agrees that you should anticipate a 100% probability of surviving it unscathed. But if, upon exiting, we delete our memory of what just happened inside the black boxes, and then delete the boxes themselves, the resulting universes are indistinguishable!
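To make the claim concrete, here is a minimal sketch in Python, under the (loud) assumption that an upload can be modeled as nothing but its memory state; the person dicts and box functions are my hypothetical illustration, not part of the original scenario:

```python
import random

def fork_roulette_box(person):
    """Fork the person; the two copies play Russian roulette and
    exactly one winner walks out the other side."""
    fork = dict(person)                           # a perfect copy
    winner = dict(random.choice([person, fork]))  # 50/50 per copy
    winner["memory"] = winner["memory"] + ["survived roulette in a box"]
    return winner

def tunnel_box(person):
    """The second box: just a tunnel with two doors."""
    out = dict(person)
    out["memory"] = out["memory"] + ["walked through a tunnel box"]
    return out

def erase_box_memory(person):
    """Delete the memory of whatever happened inside a box."""
    out = dict(person)
    out["memory"] = [m for m in out["memory"] if "box" not in m]
    return out

alice = {"name": "Alice", "memory": ["decided to enter"]}

after_roulette = erase_box_memory(fork_roulette_box(alice))
after_tunnel = erase_box_memory(tunnel_box(alice))

# After the erasure step the two surviving states are identical:
assert after_roulette == after_tunnel
print("indistinguishable:", after_roulette == after_tunnel)
```

Once the erasure step runs, no observation available inside either universe can tell you which kind of box you just walked through.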
One easy way to demonstrate this is to chain ten boxes and put a thousand dollars at the end. The person who anticipates dying with 50% probability per box (so, across all ten boxes, a 1/1024 chance of surviving) stays well outside. The person who anticipates surviving just walks through and comes away $1000 richer. “But at least my anticipation was correct”, in this scenario, reminds me somewhat of the cries of “but at least my reasoning was correct” from two-boxers in Newcomb’s problem.
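A quick sanity check on the arithmetic, as a hedged sketch that assumes each box resolves by a fair, independent coin flip between the two copies:

```python
import random

N_BOXES = 10
TRIALS = 200_000

# Third-person bookkeeping: follow one fixed copy (say, "the body that
# entered box 1") and ask how often it wins all ten coin flips.
fixed_copy_wins = sum(
    all(random.random() < 0.5 for _ in range(N_BOXES))
    for _ in range(TRIALS)
)
print(f"fixed copy survives all {N_BOXES} boxes: "
      f"{fixed_copy_wins / TRIALS:.5f} "
      f"(exact: {0.5 ** N_BOXES:.5f} = 1/1024)")

# First-person bookkeeping: every box emits exactly one winner, so
# whoever is doing the anticipating at the exit has survived with
# probability 1 and collects the $1000 in every single trial.
```

The 1/1024 figure tracks one particular copy through all ten flips; meanwhile exactly one winner exits every box, so someone always reaches the money, which is the intuition behind just walking through.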
What I’m wondering is: is there a general rule here about the folly of letting effects that are causally indistinguishable in retrospect shape our anticipation differently? Can somebody formalize this?