Does that apply only to copies of you or to people in general? Would you choose to torture all of humanity for a finite time, make them forget it, and then you receive 1 utilon?
Does that apply only to copies of you or to people in general?
As I explained, I do not presume to make decisions for others.
Would you choose to torture all of humanity for a finite time, make them forget it, and then you receive 1 utilon?
I would not, see above. A better question would have been “Would you choose to slightly inconvenience a person you dislike for a short time, make them forget it, and then you receive 3^^^3 utilons?” If I answered “yes” (and I probably would), then you could probe further to see where exactly my self-professed non-interference breaks down. This is the standard way of locating the dust-specks-vs-torture boundary and exposing the resulting inconsistency.
Similar strategies apply to clarifying other seemingly absolute positions, including yours (“I don’t consider my similarity to a person as a reason to treat them as a redundant copy”). Presumably at some point the answers become “I don’t know” rather than Yes/No.
I am fairly certain the only way that I would treat a clone of myself differently than another independent person is if we continued to share internal mental experiences. Then again, I would probably stop thinking of myself and a random person off the street as different people if I started sharing mental experiences with them, too.
In other words, while I would reject sending my fully independent clone to hell in order to gain utility, I might agree to fully share the mental experience with the clone in hell so long as the clone also got to experience the extra utility Omega paid me to balance out hell. That brings up a rather interesting question: if two people share mental experiences, do they achieve double the utility of each person individually, or merely the set union of their individual utilities? Or something else?
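To make my own question slightly more precise (purely a sketch, with $E_A$, $E_B$ and $u(e)$ as made-up notation): let $E_A$ and $E_B$ be the sets of experiences of the two minds and $u(e)$ the utility of an experience $e$. Then the two readings are

$$U_{\text{double}} = \sum_{e \in E_A} u(e) \;+\; \sum_{e \in E_B} u(e) \qquad\text{vs.}\qquad U_{\text{union}} = \sum_{e \in E_A \cup E_B} u(e).$$

With full sharing $E_A = E_B$, so the first rule gives exactly twice the second; “something else” would be any other aggregation function over the two experience-sets.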
while I would reject sending my fully independent clone to hell in order to gain utility, I might agree to fully share the mental experience with the clone in hell so long as the clone also got to experience the extra utility Omega paid me to balance out hell.
This seems to contradict your earlier assertion that
the second option the same as Omega offering to clone you, put the clone in hell for a finite amount of time and then destroy it, and give you the money immediately
because if you and the clone are one and the same (no cloning happened, you were tortured and then memory-wiped), “both” of you reap the rewards.
We are not the same person after the point of the decision. There’s no continuity of experience. The tortured me experiences none of the utility, and the enriched me experiences none of the torture. That was why I thought of the cloning interpretation to begin with.