Since you are writing below my post and I sense detachment from what I’ve tried to express, I refer you to my http://lesswrong.com/lw/2di/poll_what_value_extra_copies/27ee and http://lesswrong.com/lw/2e0/mwi_copies_and_probability/27f1 comments.
ETA: I retract “detachment”. Why don’t you play Russian roulette? Because you could get killed. Why does a magician play Russian roulette? Because he knows he won’t. Someone who doesn’t value Everett branches according to their “reality mass” doesn’t win: no magician would play quantum Russian roulette. That you cannot experience being dead doesn’t mean that you are immortal. (And besides, my preferences are over worlds, not over experiences.)
The thing is, the correct “expected utility” sum to perform has little to do with “valuing Everett branches”. It has to do with what you know, and what you don’t. Some things you don’t know because of quantum uncertainty. Other things you don’t know because you never learned them, or because you forgot them, or because of your delusions. You must calculate the expected consequences of your actions based on your knowledge, and on your knowledge of your ignorance. Quantum uncertainty is only a small part of that ignorance; indeed, it is usually insignificant enough to be ignored entirely.
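To illustrate the point with a minimal sketch (the utilities and probabilities here are hypothetical, chosen only for illustration): every source of uncertainty, quantum or otherwise, enters the expected-utility sum in exactly the same way, as a probability weight on an outcome.

```python
# Minimal sketch: expected utility treats quantum uncertainty and
# ordinary ignorance identically - both are just probability weights.
# All numbers below are hypothetical, for illustration only.

def expected_utility(outcomes):
    """Sum of probability-weighted utilities over possible outcomes."""
    return sum(p * u for p, u in outcomes)

# Quantum Russian roulette: 5/6 of the "reality mass" survives (utility 0),
# 1/6 dies (utility -1000).
play = expected_utility([(5/6, 0.0), (1/6, -1000.0)])

# Declining to play: survival with certainty.
decline = expected_utility([(1.0, 0.0)])

# An agent that weights branches by their measure declines to play.
assert decline > play
```

The same sum applies unchanged if the 1/6 weight comes from a forgotten fact or an ordinary die rather than branch measure, which is the point: the decision calculation does not care where the ignorance comes from.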
This “valuing Everett branches” material mostly seems like a delusion to me. Human decision theory has precious little to do with the MWI.