The punch line is that the physics we run on gives us a very strong reason to care about the welfare of copies of ourselves, which is (according to my survey) a counter-intuitive result.
No, it doesn’t. Maximising your Everett blob is a different thing from maximising the number of copies. It is perfectly consistent to care about having yourself exist in as much Everett stuff as possible while being completely indifferent to how many clones you have in any given branch.
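To make that distinction concrete, here is a toy sketch (the Branch model, the function names, and the numbers are all invented for illustration; nothing here comes from the original exchange). It treats a world as a list of branches, each carrying a measure and a count of copies of me, and shows that total Everett measure and copy count are two different quantities that can move independently.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Branch:
    measure: float  # weight of the branch (squared amplitude); measures sum to 1
    copies: int     # number of copies of "me" living in this branch

def my_everett_measure(world: List[Branch]) -> float:
    """Total measure of the branches in which at least one copy of me exists."""
    return sum(b.measure for b in world if b.copies > 0)

def expected_copies(world: List[Branch]) -> float:
    """Measure-weighted count of copies of me across all branches."""
    return sum(b.measure * b.copies for b in world)

worlds = {
    "status quo": [Branch(1.0, 1)],                    # I exist once in every branch
    "half erased": [Branch(0.5, 1), Branch(0.5, 0)],   # erased from half the measure
    "cloned": [Branch(1.0, 2)],                        # an extra copy in every branch
    "cloned, half erased": [Branch(0.5, 2), Branch(0.5, 0)],
}

for name, world in worlds.items():
    print(name, my_everett_measure(world), expected_copies(world))

# status quo 1.0 1.0
# half erased 0.5 0.5
# cloned 1.0 2.0
# cloned, half erased 0.5 1.0
```

On these invented numbers, a pure copy-counter treats "cloned, half erased" as no worse than the status quo, while someone who cares about their Everett measure sees it as being half murdered; the two preferences are simply about different quantities.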
Reading down to Wei’s comment and using that breakdown, premise 3) just seems totally bizarre to me:
We ought to value both kinds of copies the same way.
Huh? What? Why? The only reason to intuitively consider that those two kinds of copies must have the same value is to have intuitions that really don’t get quantum mechanics.
I happen to like the idea of having clones. I would pay to have clones across the cosmic horizon. But this is in a whole different league of preference from not having me obliterated from half the quantum tree. So if I were Sly my response would be to lock you in a room and throw your 50% death grenade in with you. Then the Sly from the relevant branch would throw in a frag grenade to finish off the job. You just 50% murdered him.
It occurs to me that my intuitions for such situations are essentially updateless. Wedrifid-Sly cares about the state of the multiverse, not that of the subset of the Everett tree that happens to flow through him at that precise moment in time (timeless too). It is actually extremely difficult for me to imagine thinking in such a way that quantum-murder isn’t just mostly murdering me, even after the event.
If your intuitions are updateless, you should definitely care about the welfare of copies. Updatelessly, you are a copy of yourself.

I have preferences across the state of the universe and all of my copies share them. Yet I, we, need not value having two copies of us in the universe. It so happens that I do have a mild preference for having such copies and a stronger preference for none of them being tortured but this preference is orthogonal to timeless intuitions.
It so happens that I do have a mild preference for having such copies and a stronger preference for none of them being tortured but this preference is orthogonal to timeless intuitions.
Wanting your identical copies to not be tortured seems to be quintessential timeless decision theory...
Wanting your identical copies to not be tortured seems to be quintessential timeless decision theory...
If that is the case then I reject timeless decision theory and await a better one. (It isn’t.)
What I want for identical copies is a mere matter of preference. There are many situations, for example, where I would care not at all whether a simulation of me is being tortured and that simulation doesn’t care either. I don’t even consider that to be a particularly insane preference.
Do you like being tortured?
No. AND I SAY THE SAME THING AS I PREVIOUSLY DID BUT WITH EMPHASIS. ;)