I value having a future that accords with my preferences. I am in no way indifferent to your tossing a grenade my way, with a subjective 1⁄2 probability of dying. (Or non-subjectively, “forcing half of the future into a state where all my plans, ambitions and expectations come to a grievous end.”)
I am, however, indifferent to your taking an action (creating an “extra” non-interacting copy) which has no influence on what future I will experience.
> Or non-subjectively, “forcing half of the future into a state where all my plans, ambitions and expectations come to a grievous end.”
Well, it’s not really half the future. It’s half of the future of this branch, which is itself only an astronomically tiny fraction of the present. The vast majority of the future already contains no Morendil.
> I am, however, indifferent to your taking an action (creating an “extra” non-interacting copy) which has no influence on what future I will experience.
So you’d be OK with me putting you to sleep, scanning your brain, creating 1000 copies, then waking them all up and killing all but the original you? (from a selfish point of view, that is—imagine that all the copies are woken then killed instantly and painlessly)
I wouldn’t be happy to experience waking up and realizing that I was a copy about to be snuffed (or even wondering whether I was). So I would prefer not to inflict that on any future selves.
Suppose that the copies to be snuffed out don’t realize it. They just wake up and then die, without ever realizing. Would it worry you that you might “be” one of them?
It doesn’t really seem to matter, in that case, that you wake them up at all.
And no, I wouldn’t get very worked up about the fate of such patterns (except insofar as I would like them to be preserved for backup purposes).