The thing is, the correct “expected utility” sum to perform does not really have much to do with “valuing Everett branches”. It has to do with what you know—and what you don’t. Some things you don’t know because of quantum uncertainty. But other things you don’t know because you never learned them, others because you forgot them, and others because of your delusions. You must calculate the expected consequences of your actions based on your knowledge—and your knowledge of your ignorance. Quantum uncertainty is only a small part of that ignorance—and indeed, it is usually insignificant enough to be totally ignored.
This “valuing Everett branches” material mostly seems like a delusion to me. Human decision theory has precious little to do with the MWI.
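To make the point concrete, here is a minimal sketch of the expected-utility sum being described. All the names and numbers below are illustrative assumptions, not from the original text; the only point is that the sum ranges over the agent's credences from every source of ignorance, and the quantum contribution typically carries negligible weight.

```python
def expected_utility(outcomes):
    """Sum p * u over a list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical credences for one action, split by source of ignorance.
# Almost all the probability mass comes from ordinary ignorance
# (things never learned, forgotten, or mistaken beliefs), not from
# quantum branching.
outcomes = [
    (0.70, 10.0),   # expected case, given what the agent knows
    (0.299, -5.0),  # ordinary ignorance: unknown or forgotten facts
    (0.001, -5.0),  # quantum uncertainty: negligible weight
]

print(expected_utility(outcomes))  # -> 5.5
```

Dropping the quantum term entirely changes the result by only 0.005 here, which is the sense in which that part of the ignorance can usually be ignored.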