On the Cold War thing, I think the lesson to learn depends on whether situations that start with a single nuclear launch reliably (and rapidly) escalate into world-destroying conflicts.
If (nuclear launch) → (rapid extinction), then it seems like the anthropic principle is relevant, and the close calls really might have involved improbable luck.
If, on the other hand, (nuclear launch) → (perhaps lots of death, but usually many survivors), then anthropic selection can't do much to explain our clean record, which suggests the stories of how close the close calls were are exaggerated.
Your ‘if’ statements made me update. I guess there is also a distinction between two conclusions one could draw from this type of anthropic reasoning.
One (maybe naive?) conclusion is that ‘the anthropic principle is protecting us’. On this view, if the anthropic principle explains our survival so far, you keep expecting it to let you evade extinction going forward.
The other conclusion is that ‘the anthropic perspective is relevant to our past but not our future’. You consider anthropics to be a source of distortion on the historical record, but not a guide to what will happen next. Under this interpretation you would anticipate extinction of [humans / you / other reference class] to be more likely in the future than in the past.
I suspect this split depends on whether you weight your future timelines by how many observers are in them, etc.
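The ‘distortion of the past, not protection of the future’ reading can be made concrete with a toy Monte Carlo sketch. All the numbers here (per-incident escalation probability, count of close calls) are made up for illustration, not historical estimates:

```python
import random

random.seed(0)

# Assumed toy parameters, purely illustrative:
TRUE_P = 0.5       # chance a single close call escalates to extinction
CLOSE_CALLS = 5    # close calls in each world's history
TRIALS = 100_000   # simulated worlds

survivors = 0
future_escalations = 0
for _ in range(TRIALS):
    if any(random.random() < TRUE_P for _ in range(CLOSE_CALLS)):
        continue  # extinct worlds leave no observers to read the record
    survivors += 1
    # Having survived past close calls gives no protection at the next one:
    if random.random() < TRUE_P:
        future_escalations += 1

survival_rate = survivors / TRIALS
future_rate = future_escalations / survivors
print(f"worlds with surviving observers: {survival_rate:.3f}")  # ~TRUE_P survival per call, 5 calls
print(f"next-close-call escalation rate among them: {future_rate:.3f}")  # ~TRUE_P
```

Every surviving world's historical record shows several close calls, none of which escalated, so a naive observer there infers a per-incident risk near zero. Yet the next close call in those same worlds still escalates at the true rate: anthropics biases the record of the past without lowering the risk of the future.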