I think the point is that not valuing non-interacting copies of oneself might be inconsistent. I suspect it’s true that consistency requires valuing parallel copies of ourselves, just as we value future variants of ourselves and so preserve our lives. Our future selves also can’t “interact” with our current self.
The poll in the previous post had to do with a hypothetical guarantee to create “extra” (non-interacting) copies.
In the situation presented here there is nothing justifying the use of the word “extra”, and it seems analogous to quantum-lottery situations that have been discussed previously. I clearly have a reason to want the world to be such that (assuming MWI) as many of my future selves as possible experience a future that I would want to experience.
As I have argued previously, the term “copy” is misleading anyway; on top of that, the word “extra” was reinforcing the connotations of copy-as-backup, whereas in MWI nothing of the sort is happening.
So, I’m still perplexed. Possibly a clack on my part, mind you.
Does it make an important difference to you that the MWI copies were “going to be there anyway”, hence losing them is not foregoing a gain but losing something you already had? Is this an example of loss aversion?
I value having a future that accords with my preferences. I am in no way indifferent to your tossing a grenade my way, with a subjective 1⁄2 probability of dying. (Or non-subjectively, “forcing half of the future into a state where all my plans, ambitions and expectations come to a grievous end.”)
I am, however, indifferent to your taking an action (creating an “extra” non-interacting copy) which has no influence on what future I will experience.
Or non-subjectively, “forcing half of the future into a state where all my plans, ambitions and expectations come to a grievous end.”
Well, it’s not really half the future. It’s half of the future of this branch, which is itself only an astronomically tiny fraction of the present. The vast majority of the future already contains no Morendil.
I am, however, indifferent to your taking an action (creating an “extra” non-interacting copy) which has no influence on what future I will experience.
So you’d be OK with me putting you to sleep, scanning your brain, creating 1000 copies, then waking them all up and killing all but the original you? (from a selfish point of view, that is—imagine that all the copies are woken then killed instantly and painlessly)
I wouldn’t be happy to experience waking up and realizing that I was a copy about to be snuffed (or even wondering whether I was). So I would prefer not to inflict that on any future selves.
Suppose that the copies to be snuffed out don’t realize it. They just wake up and then die, without ever realizing. Would it worry you that you might “be” one of them?
It doesn’t really seem to matter, in that case, that you wake them up at all.
And no, I wouldn’t get very worked up about the fate of such patterns (except insofar as I would like them to be preserved for backup purposes).
as many of my future selves as possible experience a future that I would want to experience.
Do you mean “as large a fraction of” or “as many as possible in total”? Because if you kill (selectively) most of your future selves, you could end up with the overwhelming majority of those remaining living very well…
As cousin_it has argued, “selectively killing most of my future selves” is something that I subjectively experience as “having a sizeable probability of dying”. That doesn’t appeal.
Ok, understood. So would you say that the fraction of future selves (in some copying process such as MWI) that survive == your subjective probability of survival?
Yup.
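A minimal sketch of that identification, assuming equal weight per copy (the numbers and the survival_stats helper below are invented purely for illustration):

```python
# Toy model: each future self is a (weight, survives, quality) tuple.
# Under the identification above, subjective P(survival) is just the
# weight-fraction of future selves that survive.

def survival_stats(selves):
    """selves: list of (weight, survives, quality) tuples."""
    total_weight = sum(w for w, _, _ in selves)
    survivors = [(w, q) for w, s, q in selves if s]
    surviving_weight = sum(w for w, _ in survivors)
    p_survive = surviving_weight / total_weight
    # Quality experienced conditional on being one of the survivors.
    quality_if_surviving = (
        sum(w * q for w, q in survivors) / surviving_weight if survivors else 0.0
    )
    return p_survive, quality_if_surviving

# 1000 equal-weight copies; selectively keep only the one whose life goes best.
copies = [(1, False, 5)] * 999 + [(1, True, 100)]
print(survival_stats(copies))  # (0.001, 100.0)
```

On those assumptions the quantum-lottery structure mentioned earlier falls out directly: conditional on surviving you do very well, but the subjective probability of surviving at all is only 1 in 1000.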