I’d totally go for the memory loss/clone destruction option. To me it’s the final outcome that matters most, so if you start with one poor me and end with one rich me without the memory of anything unpleasant, it’s clearly a better option than ending up with one still-pretty-poor me with smarting cheeks. This is, of course, my subjective utility; I have no claim that it is better than anyone else’s for them.
To me it’s the final outcome that matters most … it’s clearly a better option than ending up with one still-pretty-poor me … This is, of course, my subjective utility; I have no claim that it is better than anyone else’s for them.
How could one know with any certainty what’s better for them (in the murkier cases)? Alternatively, if you do have a process that allows you to learn what’s better for you, you should claim that you can also help others apply that process in order to figure out what’s better for them (which may be a different thing from what the process says about you).
You can of course decide what to do, but having the ability to implement your own decisions is separate from having the ability to find decisions that are reliably correct, from knowing that the decisions you make are clearly right or that they pursue what in fact matters most.
Does that apply only to copies of you or to people in general? Would you choose to torture all of humanity for a finite time, make them forget it, and then receive 1 utilon?
Does that apply only to copies of you or to people in general?
As I explained, I do not presume to make decisions for others.
Would you choose to torture all of humanity for a finite time, make them forget it, and then receive 1 utilon?
I would not; see above. A better question would have been “Would you choose to slightly inconvenience a person you dislike for a short time, make them forget it, and then receive 3^^^3 utilons?” If I answered “yes” (and I probably would), then you could probe further to see where exactly my self-professed non-interference breaks down. This is the standard way of forking the dust-specks-vs-torture boundary and exposing the resulting inconsistency.
Similar strategies apply to clarifying other seemingly absolute positions, including yours (“I don’t consider my similarity to a person as a reason to treat them as a redundant copy.”). Presumably at some point the answers become “I don’t know”, rather than Yes/No.
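As a rough illustration of the probing strategy described above (my own hedged sketch, not part of the original exchange): if you hold the harm fixed and vary the size of the payoff, finding where the professed refusal breaks down is essentially a bisection over payoff size. The `would_accept` predicate below is a hypothetical stand-in for the person being questioned.

```python
# Toy sketch: locating the point where a professed absolute refusal breaks down,
# by bisecting over the size of the offered payoff (harm held fixed).
# `would_accept` is a hypothetical stand-in for the person being questioned.

def find_breakdown_payoff(would_accept, lo=0.0, hi=1e30, tol=1e-6):
    """Binary-search for the smallest payoff (in utilons) the respondent accepts.

    Assumes the respondent refuses at `lo`, accepts at `hi`, and answers
    monotonically (once they accept a payoff, they accept anything larger).
    """
    if would_accept(lo):
        return lo          # no absolute refusal to begin with
    if not would_accept(hi):
        return None        # the position really is absolute over this range
    while hi - lo > tol * max(1.0, abs(hi)):
        mid = (lo + hi) / 2
        if would_accept(mid):
            hi = mid       # accepted: the boundary is at or below mid
        else:
            lo = mid       # refused: the boundary is above mid
    return hi


# Example: a respondent who claims non-interference but caves past 10^12 utilons.
print(find_breakdown_payoff(lambda payoff: payoff > 1e12))
```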
I am fairly certain the only way I would treat a clone of myself differently from another independent person is if we continued to share internal mental experiences. Then again, I would probably stop thinking of myself and a random person off the street as different people if I started sharing mental experiences with them, too.
In other words, while I would reject sending my fully independent clone to hell in order to gain utility, I might agree to fully share the mental experience with the clone in hell so long as the clone also got to experience the extra utility Omega paid me to balance out hell. That brings up a rather interesting question: if two people share mental experiences, do they achieve double the utility of each person individually, or merely the set union of their individual utilities? Or something else?
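As a hedged toy model of that aggregation question (my own construction, not from the thread): represent each person’s experiences as a set of (label, utilons) pairs, then compare summing the two people’s utilities separately against taking the set union of their experiences when the experiences are fully shared.

```python
# Toy model of the aggregation question: two minds that fully share experiences.
# Each experience is a (label, utilons) pair; both rules below are assumptions
# for illustration, not positions taken in the discussion above.

shared_experiences = {("hell", -100.0), ("omega_payout", +150.0)}

person_a = shared_experiences   # both minds experience exactly the same set
person_b = shared_experiences

# Rule 1: count each person's experiences separately (utilities double).
double_counted = sum(u for _, u in person_a) + sum(u for _, u in person_b)

# Rule 2: count the set union of experiences once (no doubling for shared minds).
union_counted = sum(u for _, u in person_a | person_b)

print(double_counted)  # 100.0 -> each shared utilon is felt "twice"
print(union_counted)   # 50.0  -> shared experiences counted once
```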
while I would reject sending my fully independent clone to hell in order to gain utility, I might agree to fully share the mental experience with the clone in hell so long as the clone also got to experience the extra utility Omega paid me to balance out hell.
This seems to contradict your earlier assertion that
the second option [is] the same as Omega offering to clone you, put the clone in hell for a finite amount of time and then destroy it, and give you the money immediately
because if you and the clone are one and the same (no cloning happened, you were tortured and then memory-wiped), “both” of you reap the rewards.
because if you and the clone are one and the same (no cloning happened, you were tortured and then memory-wiped), “both” of you reap the rewards.
We are not the same person after the point of the decision. There’s no continuity of experience. The tortured me experiences none of the utility, and the enriched me experiences none of the torture. That was why I thought of the cloning interpretation to begin with.