I am, however, indifferent to your taking an action (creating an “extra” non-interacting copy) which has no influence on what future I will experience.
So you’d be OK with me putting you to sleep, scanning your brain, creating 1000 copies, then waking them all up and killing all but the original you? (from a selfish point of view, that is—imagine that all the copies are woken then killed instantly and painlessly)
I wouldn’t be happy to experience waking up and realizing that I was a copy about to be snuffed (or even wondering whether I was). So I would prefer not to inflict that on any future selves.
Suppose that the copies to be snuffed out don’t realize it. They just wake up and then die, without ever realizing. Would it worry you that you might “be” one of them?
It doesn’t really seem to matter, in that case, that you wake them up at all.
And no, I wouldn’t get very worked up about the fate of such patterns (except insofar as I would like them to be preserved for backup purposes).