6: When will it end?
Brook: You know what, I think it’s far more likely that you’re messing with me than you actually shot me. But I’ll concede that it is possible that you did actually shoot me, and the only reason I’m standing here talking to you is because I am forced to take an Everett branch that allows me to be standing here talking to you.
Avery: Well actually, in most of them you end up bleeding out on the floor while you tell me this.
B: And then I die.
A: From my perspective, yeah, most likely. From yours, there will be some branches where a team of paramedics happens to drive by and save you, and if you are to be conscious at all in the future it will be in those branches.
B: Ok, but in most of those I die and simply stop experiencing reality.
A: Maybe. Or maybe you’re guaranteed to take the conscious path, since there must be some future state which has your present as its past.
B: Are you saying that I can’t die? That’s ludicrous!
A: I’m saying that your conscious experience might not ever end, since there’s always a branch where it won’t. And the ones where it does won’t be around to talk about it.
B: So if I make a bomb that is set to blow me up if I don’t win tomorrow’s Powerball jackpot, the next day I’m guaranteed to have the subjective experience of walking away with several hundred million?
A: Well, most likely you end up horribly maimed, disfigured, and concussed, unable to do anything until someone takes pity on you and uploads you into a computer, where you live for eternity in some experience we can’t imagine. That’s where your subjective Everett branch is going to end up regardless, but it’ll be nice to skip the maiming portion.
B: This all seems pretty shaky.
A: Yeah, I’m not very confident in that line of reasoning myself. Certainly not better than the 1:292,201,338 Powerball odds.
B: You didn’t shoot me, did you?
A: No way! Do you know how infinitesimally small the wavelength of a bullet is?
Quantum immortality is a natural extension of the anthropic principle, but I’m far less confident using it to say anything about future states than using it to reason about your current one.
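For what it’s worth, both of Avery’s numbers can be sanity-checked. Here’s a quick sketch in Python; the bullet’s mass and muzzle velocity are illustrative assumptions on my part, not figures from the dialogue:

```python
import math

# Powerball odds: choose 5 white balls from 69, times 26 possible red balls.
white_combos = math.comb(69, 5)   # 11,238,513 ways to pick the white balls
odds = white_combos * 26          # one red "Powerball" out of 26
print(odds)                       # 292201338

# De Broglie wavelength of a bullet: lambda = h / (m * v).
h = 6.626e-34                     # Planck's constant, J*s
m = 0.008                         # bullet mass in kg (~8 g, assumed)
v = 400.0                         # muzzle velocity in m/s (assumed)
wavelength = h / (m * v)
print(wavelength)                 # ~2e-34 m
```

So the jackpot odds are exact, and the bullet’s wavelength is some twenty orders of magnitude smaller than an atomic nucleus, whatever mass and speed you plug in. Avery’s point stands: no meaningful quantum spreading there.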