“Intuitively, it seems as if 1000 “heads” is “anthropic evidence” for the original coin being “tails”, that the long sequence of “heads” can only be explained by the fact that “tails” would have killed you.”
Can someone explain why this is? My intuitive answer wasn’t the above, because I’d already internalized that, under uncertainty, a fair coin stays a fair coin, with no bias reaching back from the future. My intuitive feeling was that I’m more likely playing against a coin biased towards heads. No more, no less, and at most a little information that the original flip possibly came up heads rather than tails.
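To spell out why the update is (at most) small: write out the likelihoods. This is a toy Bayes calculation under my own reading of the setup (the original flip decides whether subsequent tails are lethal; fair coin; equal priors; none of these numbers come from the thread):

```python
from fractions import Fraction

N = 10  # stand-in for the 1000 flips, to keep the numbers readable

# H_heads: original coin came up heads, tails is harmless.
# H_tails: original coin came up tails, any tails throw kills you.
prior = {"H_heads": Fraction(1, 2), "H_tails": Fraction(1, 2)}

# Likelihood of the data "this person exists and saw N heads":
# under H_heads every sequence is survivable, and all-heads has
# probability (1/2)**N; under H_tails the only surviving sequence
# is all-heads, which also has probability (1/2)**N.
likelihood = {"H_heads": Fraction(1, 2) ** N,
              "H_tails": Fraction(1, 2) ** N}

evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}

print(posterior)  # both hypotheses stay at 1/2: no update at all
```

On this accounting the posterior equals the prior, matching the "no more, no less" intuition; the "anthropic evidence" reading corresponds to weighting the H_tails likelihood differently, e.g. by conditioning on being alive within that hypothesis.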
Well, either way you’re in a vanishingly unlikely future. I think that, to some extent, we don’t appreciate the unlikelihood of our own existence—being alive, we expect that event to have been somewhat expected. In the branch where every tails throw kills you, going by quantum-immortality thinking, you expect to observe yourself throwing only heads, and you fail to internalize how much this diminishes your measure—i.e. you don’t account for your failing branches. In the branch where you may throw tails without dying, the reasoning goes, you would have expected—almost certainly, in fact—to see a fairly even distribution of heads and tails, so the event of seeing no tails feels unlikelier there, despite having the same likelihood.
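To make the measure point concrete: the all-heads sequence has the same unconditional probability in both branches, but conditioning on having survived to observe anything changes the two branches very differently. A minimal sketch (my own toy model, with 10 flips standing in for 1000):

```python
from fractions import Fraction

N = 10  # stand-in for the 1000 flips, to keep the numbers readable

p_all_heads = Fraction(1, 2) ** N  # same in both branches

# Branch where tails kills you: the only survivors saw all heads,
# so conditional on observing anything at all, all-heads is certain --
# but your surviving measure is only (1/2)**N of where you started.
p_survive_lethal = p_all_heads
p_all_heads_given_alive_lethal = p_all_heads / p_survive_lethal

# Branch where tails is harmless: you survive every sequence,
# so conditioning on survival changes nothing and all-heads
# stays as improbable as it looks.
p_survive_harmless = Fraction(1)
p_all_heads_given_alive_harmless = p_all_heads / p_survive_harmless

print(p_all_heads_given_alive_lethal)    # 1
print(p_all_heads_given_alive_harmless)  # 1/1024
```

The surprise lives entirely in the conditioning, not in the event itself: the discarded factor in the lethal branch is exactly the measure of the failing branches.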
What’s this about quantum? Coins are pretty well deterministic in the quantum sense.
The notion that you should not anticipate observing outcomes that entail a failure of your ability to observe (or to remember observing) them is coherent even in a deterministic universe, but, at least to me, it is solidly associated with the quantum suicide/immortality thought experiment.
Yeah, I didn’t think of that.
We are indeed in a “vanishingly-unlikely future”, and (obviously) if you ask what P(me existing | no contingencies except the existence of the Universe) is, it’s so small as to be ridiculous.
I’ve often wondered at this. In my darker moments I’ve thought “what if some not-me who was very like me but more accomplished and rational had existed instead of me?”
If you really want a dark thought, consider the Cold War and the retrospective unlikelihood of your existence in that context. Some of the coincidences that prevented the extinction of large parts of the human species look suspiciously similar to the kind of “the gun jammed” events you’d expect in a quantum suicide experiment. And then consider that you should expect this to have been the likeliest history...
I’m having trouble making sense of this phrase. Can you describe what the extinction of a large part of a species looks like?
Sorry, that was terribly phrased. I meant the death of a large fraction of cultural and social clusters (species in a social/cultural, if not biological, sense). In other words, Western Europe, the US and Russia becoming largely uninhabited via nuclear war, with the rest to follow depending on how the fallout develops.