I don’t see any reason to privilege the thread of consciousness
I offer you a choice: either you suffer torture for an hour. Or I create a clone of you, torture it for ten hours, and then kill it. In the second option, you are not affected in any way.
From what you’ve said, I gather you’ll choose the first option. You won’t privilege what you actually experience. But… I truly cannot understand why.
It’s often useful to think as if you’re deciding for all copies of yourself. You can maximize each copy’s expected outcome that way in many situations. You can also optimize your expected experience if you don’t know in advance which of ten thousand copies you’ll be. This kind of argument has often been made on LW. Perhaps I mistook some instances of arguments like yours, which truly don’t privilege experience, for this milder version (which I endorse).
You’re begging a very important question when you use “you” to refer only to the template for subsequent duplication. On my wacky view, if you duplicate me perfectly, she’s also me. If it’s time T1 and you’re going to duplicate Alicorn-T1 at T2, then Alicorn-T1 has two futures, Alicorn-T2 and Alicorn-T2*, and Alicorn-T1 will make advance choices for them both just as if no duplication occurred. If you speak to Alicorn-T1 about the future in which duplication occurs, “you” is plural.
Your “wacky view” sounds quite similar to mine—I would be interested to read that thesis when it is published.
The ‘you’ I used referred only to the pre-duplication person, who is making the choice, and who is singular.
The view you describe is the view I described in the last paragraph of my previous comment (the one you replied to). I understand and agree that you decide things for all identical copies of you who may appear in the next second—because they’re identical, they preserve your decisions. But you can only anticipate experiencing some one thing, not a plurality.
If someone creates a clone (or several) of me, I may not even know about it; and I do not expect to experience anything differently due to the existence of that clone.
If someone destroys my body, I presume I’ll stop experiencing, although I have no idea what that would be like.
By inference, if someone creates a precise clone of me elsewhere and destroys my body at the same moment, I won’t suddenly start experiencing the clone’s life. I.e., I don’t expect to suddenly experience a complete shift in location. Rather, I would experience the same thing (or lack of it) that I would experience if someone killed me without creating a clone.
Yes, this begs the question of why I experience continuity in this body, since physics has no concept of a continuous body. And why do I experience continuity across sleep and unconsciousness? I don’t have an answer, but neither does the alternative you’re proposing. The only real answer I’ve seen is the timeless hypothesis: that at each moment I have a separate moment-experience, which happens to include memories of previous experiences; but those memories are not necessarily true, they are just the way my brain makes sense of the universe, and it highlights or even invents continuity. But this is too much like the Boltzmann Brain conjecture: consistent and with explanatory power, but unsatisfying.
“You” sure seemed like it referred to only one of the post-duplication individuals here:

In the second option, you are not affected in any way.

(This when one of me seems to be quite seriously affected, in that you plan to torture and kill that one.)

And here:

You won’t privilege what you actually experience.

(This when I do privilege what I actually experience, and simply think of “I” in these futures as a plural.)

you don’t know in advance which of ten thousand copies you’ll be

(But I’ll be all of them! It’s not as though 9,999 of these people are p-zombies or strangers or even just brand-new genetically identical twins! They’re my futures!)
No, I wouldn’t. I’d choose the second option so as to prevent my torture from being compounded with my total death.
Er, the second option is the one where I kill the clone. In both options, only you remain alive after 10 hours, no clone.
How about this cleaner version: I create a clone of you (no choice here). Then I torture you for an hour, OR the clone for ten hours, after which you’re both free to go.
I’d choose one hour I think.
How about this: choose between 59 minutes of torture for you and 10 hours for the clone, vs 1 hour for you and the clone, with the experience for both you and the clone being indistinguishable for the first 59 minutes.
If you choose the 59min/10hrs, what’s going through your mind in minute 59? Is it “this is all about to stop, but some other poor bastard is going to have a rough 9 hours”? Or is it “ohgodohgodohgod I hope I’m not the clone”?
I’d choose one hour also.
In your new formulation, I’d choose the 1 hour for both of us and we (both copies of me) would both expect it to be over soon at the 59th minute. My copies and I would be in agreement in our expectations of each other’s behavior.
I identify closely with anything sufficiently similar to me—including close past and future versions of me. For instance, if there were a copy of me made an hour ago (whether or not the copy had runtime during that hour), and he or I were given the choice during your test, we would choose the same thing, as mentioned above.
It’s true that both the original, and the clone, don’t know if they’re the clone or not at minute 59. But the original, who really made the decision before the clone was created, correctly optimized his own future experience to have 59 minutes of torture instead of an hour. The original doesn’t care about the clone’s experiences. (That is, I wouldn’t care.)
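To make the crux concrete, here is a toy expected-value sketch (my own framing, with made-up labels; neither side in this thread endorsed this exact model). Under a “thread” view only the original’s minutes of torture count toward what you anticipate; under a “which copy will I be?” view you weight each copy’s minutes equally; an impartial view just sums over both people.

```python
# Toy model of the 59min/10hr dilemma under different theories of anticipation.
# Option A: original tortured 59 min, clone tortured 600 min.
# Option B: both tortured 60 min.
options = {"A": (59, 600), "B": (60, 60)}  # (original_minutes, clone_minutes)

for name, (orig, clone) in options.items():
    thread = orig                       # only the original's experience counts
    copies = 0.5 * orig + 0.5 * clone   # uniform uncertainty over which copy you are
    total = orig + clone                # impartial sum over both people
    print(name, thread, copies, total)
# A: thread=59, copies=329.5, total=659
# B: thread=60, copies=60.0, total=120
```

The arithmetic shows why the positions diverge: the “thread” view prefers option A (59 < 60), while both the uncertainty view and the impartial sum strongly prefer option B.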
Sounds consistent. Forgive me if I probe a bit further: I’m not trying to be rude, I’m interested in the boundaries of your theory.
In an unconsciousness—clone—destroy original brain—wake scenario, do you anticipate surviving?
In an unconsciousness—clone twice—destroy original brain—wake scenario, do you identify with / anticipate being zero, one, or two of the clones?
I changed my mind since then. So I would make different decisions now… more in line with what others here have been proposing.
I would try to optimize for all projected future clones. But in a scenario where I know some clones are going to die no matter what they do (your previous question), I would partially discount the experiences such clones have before they die and try to optimize more for the experiences of the survivor. That’s just my personal preference: the lifelong memory of the survivor matters more than the precise terminal existence of the killed clone.
Regarding your new questions about anticipation: under the new theory, which has no concept of personal continuity, there doesn’t seem to be such a thing as personal survival where duplication and termination are involved.
...I see. That doesn’t change my answer, as it happens; my clone dies, yes, and you bear moral culpability for it, but it is better for one to make it out (relatively) unscathed than for the only survivor to be traumatized. In the new version, I would prefer there be only one hour of torture between us, and accept the first option.
See, the thing is: in my utility function, I don’t have a special ranking for “my” experiences over everyone else’s. When I do the math, I come out paying a lot more attention to my own situation for purely pragmatic reasons.
Even given your theory and your utility function, I don’t see how your clone’s 10 hours of torture and subsequent death would leave you traumatized. Isn’t it best for the survivor to have experienced no torture at all (so we’d torture the clone)?
Also, regarding Scenario B: imagine that I decided to get in on it, only with a variation. Instead of duplicating you and offering those two options—either your original be tortured for an hour or your duplicate be tortured for ten—I created an entirely new individual, no more similar to you than any other human being, and gave you the choice between being tortured yourself for an hour and the new guy being tortured for ten.
Which would you choose? Me, I think it’s perfectly obvious that it is better for less torture to occur.
Better for whom? To me it’s perfectly obvious that it’s better for me to have someone else tortured, and I would choose that. I would only choose to be hurt myself if the tradeoff was very unequal (a speck in the eye for me, ten hours of torture for him), and even then I would soon stop agreeing to be hurt if I had to face such a choice repeatedly.
If someone were to mount a campaign to stop your entire torturing project, and if I could participate by being hurt (but not by endangering my life), then I would agree to pay a much higher price. But that’s because such participation is a form of social capital and also helps enforce social norms elsewhere (this is both an evolutionary and a personal reason).
I’m sorry—in Scenario A (I suffer torture, or duplicate suffers more torture and is killed), I would choose the second option for essentially the reasons you propose. In Scenario B (I’m duplicated, then either I suffer torture or duplicate suffers more torture, then we both live), I would choose the first option because that’s less torture. I don’t see the complexity.
Making sure I was understood correctly: after the clone is killed (which happens in both scenarios), the only identifiable “you” is the survivor. Therefore any considerations of lasting trauma should apply to the survivor, or not at all. So to minimize trauma (without minimizing the total amount of torture), we should ensure that the survivor—who is known even before the clone is killed—is the one who is not tortured.
I was under the impression that the clone survived in the second scenario—that that was the difference between the two scenarios. This might explain some confusion about my answers, if this was a confusion.
No, the scenarios I originally proposed were:
1. A clone is made.
2a. If you choose, you’re tortured for an hour.
2b. Or, if you so choose, the clone is tortured for ten hours.
3. Ten hours from now, the clone is destroyed in any case.