I just want to throw this in here because it seems a good place: it seems to me that you would want yourself to reason as if only worlds where you survive count, but others would want you to reason as if every world where *they* survive counts. So the game-theoretic expected outcome is that you care about worlds in proportion to the number of people in them with whom you might end up wanting to interact. I think this matches our intuitions reasonably well.
Except for the doomsday device part, but I think evolution can be excused for not adequately preparing us for that one.
PS: there is a wonderfully pithy way of stating quantum immortality in LW terms: “You don’t believe in Quantum Immortality? But as your survival becomes increasingly unlikely, all valid future versions of you will come to believe in it. And as we all know, if you know you will be convinced of something, you might as well believe it now…”