You’re right and I’m being stupid, thanks. But what if you value both weasels (proportional to the probability of survival) and candy bars (proportional to remaining money in case of survival)? Then each bullet destroys a fixed number of weasels and no candy bars, so you should pay twice as many candy bars to remove two bullets as to remove one, no?
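(A minimal sketch of that valuation, in my own notation rather than anything taken from Landsburg's setup: with $b$ bullets in a 6-chamber gun, survival probability $p = \tfrac{6-b}{6}$, starting money $m$, and $c$ candy bars given up, the proposal seems to be

$$V(b, c) = \alpha \cdot \tfrac{6-b}{6} + \beta \cdot (m - c),$$

where the weasel term scales with the survival probability and the candy term is money remaining conditional on survival, so it ignores $b$ entirely. Removing $k$ bullets is then worth $\alpha k / 6$, and the most you'd pay satisfies $\beta c_k = \alpha k / 6$, so $c_2 = 2 c_1$ — the factor of two claimed above.)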
The bullet does destroy candy bars. Unless you’re introducing some sort of quantum suicide assumption, where you average only over surviving future selves? I suppose then you’re correct: the argument cited by Landsburg fails, because it must be assuming somewhere that your utility function is a probability-weighted sum over future worlds.
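(To spell out the contrast, again in my own notation: under an ordinary probability-weighted utility over future worlds, the candy bars only get eaten in the worlds where you survive, so the candy term picks up a factor of $p$:

$$U(b, c) = \alpha \cdot p + p \cdot \beta \cdot (m - c), \qquad p = \tfrac{6-b}{6}.$$

Here every extra bullet lowers $p$ and therefore does cost you expected candy bars. The strictly linear factor-of-two pricing only falls out if you replace $p \cdot \beta \cdot (m-c)$ with its value averaged over surviving selves, $\beta \cdot (m-c)$ — which is exactly the quantum-suicide-style conditioning being pointed out.)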
You’re right again, thanks again :-) I was indeed using a sort of quantum suicide assumption because I don’t understand why I should care about losing candy bars in the worlds where I’m dead. In such worlds it makes more sense to care only about external goals like saving weasels, or not getting your relatives upset over your premature quantum suicide, etc.
Specifically, I think the middle part of the argument would fail, because you’d go, “eh, if they’re executing half of my future selves, I can only save half the weasels at a given cost in average candy bars, so I’ll spend more of the money on candy bars”.
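(A made-up arithmetic illustration of that step, under the conditional-on-survival accounting sketched above and with assumptions of my own: say removing a bullet is worth $\alpha/6$ in weasels and costs $c$ candy bars, valued at $\beta c$ among the selves who survive. If an independent coin flip also executes half of your future selves, the weasel payoff of the same purchase drops to $\alpha/12$, while its candy cost, still conditioned on survival, stays $\beta c$. The weasels-per-candy-bar exchange rate has halved, so the marginal dollar shifts toward candy bars.)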