If you choose the 59min/10hrs, what’s going through your mind at minute 59? Is it “this is all about to stop, but some other poor bastard is going to have a rough 9 hours”? Or is it “ohgodohgodohgod I hope I’m not the clone”?
It’s true that at minute 59, neither the original nor the clone knows which one he is. But the original, who actually made the decision before the clone was created, correctly optimized his own future experience: 59 minutes of torture instead of an hour. The original doesn’t care about the clone’s experiences. (That is, I wouldn’t care.)
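To put numbers on that rule, here’s a toy sketch. The exact setup is my reading of the scenario (option A: a full hour of torture, no clone; option B: 59 minutes for the original, 10 hours for a freshly created clone), and the script is purely illustrative:

```python
# Toy comparison of the two decision rules in the 59min/10hrs scenario.
# Assumed setup (my reading, not stated explicitly above): option A is
# 60 minutes of torture with no clone; option B is 59 minutes for the
# original plus 10 hours (600 minutes) for a clone created at decision time.

options = {
    "A: 1 hour, no clone": [60],             # minutes of torture per person
    "B: 59 min + clone's 10 hrs": [59, 600],
}

def selfish_original(minutes_per_person):
    # The rule above: only the original's own future experience counts.
    return minutes_per_person[0]

def clone_inclusive(minutes_per_person):
    # The alternative rule: every future person's experience counts equally.
    return sum(minutes_per_person)

for name, minutes in options.items():
    print(f"{name}: selfish={selfish_original(minutes)}, "
          f"total={clone_inclusive(minutes)}")

# selfish_original prefers B (59 < 60); clone_inclusive prefers A (60 < 659).
```

The whole disagreement in this thread is which of those two scoring functions you plug in.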
Sounds consistent. Forgive me if I probe a bit further: I’m not trying to be rude; I’m interested in the boundaries of your theory.
In an unconsciousness—clone—destroy original brain—wake scenario, do you anticipate surviving?
In an unconsciousness—clone twice—destroy original brain—wake scenario, do you identify with / anticipate being zero, one, or two of the clones?
I changed my mind since then, so I would make different decisions now… more in line with what others here have been proposing.
I would try to optimize for all projected future clones. But in a scenario where I know some clones are going to die no matter what they do (your previous question), I would partially discount the experiences those clones have before they die and optimize more for the experiences of the survivor. That’s just my personal preference: the lifelong memory of the survivor matters more to me than the brief final experiences of the clone who is killed.
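Concretely, here’s a minimal sketch of that discounting rule. The discount factor and the utility numbers are illustrative assumptions of mine, not values anyone specified above:

```python
# A minimal sketch of the discounting rule described above.

DOOMED_DISCOUNT = 0.5  # weight on a doomed clone's pre-death experiences

def outcome_value(clones, discount=DOOMED_DISCOUNT):
    """clones: list of (pre_death_utility, later_utility, survives) tuples.

    Doomed clones contribute only their discounted pre-death experiences;
    the survivor contributes everything at full weight.
    """
    total = 0.0
    for pre, later, survives in clones:
        total += pre + later if survives else discount * pre
    return total

# Two actions with the same total raw utility, differing only in who
# bears the suffering before the doomed clone dies:
action_a = [(-10, 0, False), (-1, 100, True)]   # doomed clone suffers more
action_b = [(-1, 0, False), (-10, 100, True)]   # survivor suffers more

print(outcome_value(action_a), outcome_value(action_b))  # 94.0 vs 89.5
# With discount=1.0 both actions tie at 89; the discount is exactly
# what tips the choice toward sparing the survivor.
```

The point of the example is that with no discount the two actions are interchangeable, and the discount is what expresses “the survivor’s experiences matter more.”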
Regarding your new questions about anticipation: under the new theory, which has no concept of personal continuity, there doesn’t seem to be any such thing as personal survival where duplication and termination are involved.