Not really. I prefer to kill my future self only because I anticipate living on in other selves; this can’t accurately be described as “you really, REALLY don’t care about the case where you lose, to the point that you want to not experience those branches at all, to the point that you’d kill yourself if you find yourself stuck in them.”
I do care; what I don’t care about is how my measure is divided between two sets of worlds of the same cardinality. If there were a chance of my being stuck in one world and not living on anywhere else, I wouldn’t (now) want to kill myself in that future.
As for your last paragraph, the framing was from a global point of view, and probability in this case is the deterministic, Quantum-Measure-based sort.
OK, then we sort of agree; but in that case your claim that “You haven’t gone out and changed the universe in any way” seems weak. If I can change my subjective probability of experiencing X, and the state of the universe that’s not me doesn’t factor into my utility except insofar as it affects me, why should I care whether I’m “changing the universe”?
(To clarify the “I care” claim further: I’m basically being paid in one branch to kill myself in another branch. I value that payment more than I disvalue killing myself in the second branch; that does not necessarily mean that I don’t value the second branch at all, just that I value it less than the reward in branch 1.)
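To spell that tradeoff out (a minimal sketch with symbols of my own choosing, assuming just two branches of equal measure 1/2 and utilities that add across branches): let U_win and U_lose be my valuations of the winning and losing branches as they stand, P the payment received in the winning branch, and C > 0 the disvalue of killing myself in the losing branch. Taking the deal is preferred exactly when

\[
\tfrac{1}{2}\,(U_{\text{win}} + P) + \tfrac{1}{2}\,(U_{\text{lose}} - C)
\;>\;
\tfrac{1}{2}\,U_{\text{win}} + \tfrac{1}{2}\,U_{\text{lose}}
\quad\Longleftrightarrow\quad
P > C .
\]

So the payment only has to outweigh C; it does not require C = 0, i.e. it doesn’t require that I place no value on the losing branch.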