I have preferences across the state of the universe, and all of my copies share them. Yet I, we, need not value having two copies of us in the universe. It so happens that I do have a mild preference for having such copies and a stronger preference for none of them being tortured, but this preference is orthogonal to timeless intuitions.
Wanting your identical copies to not be tortured seems to be quintessential timeless decision theory...
If that is the case, then I reject timeless decision theory and await a better one. (It isn’t.)
What I want for identical copies is a mere matter of preference. There are many situations, for example, where I would not care at all whether a simulation of me is being tortured, and that simulation wouldn’t care either. I don’t even consider that to be a particularly insane preference.
Do you like being tortured?
No. AND I SAY THE SAME THING AS I PREVIOUSLY DID BUT WITH EMPHASIS. ;)