fwiw I do think that it’s a concern. But there is also an anti-inductiveness to close calls, where the more close calls you’ve had in the past, the more measures might have been implemented to avoid future ones.
So e.g., updating upwards on the Cuba missile crisis doesn’t feel right, because the red phone got implemented after that.
My sense is that anthropic bias isn’t that big a deal here, and IIRC Carl Shulman claimed that he also thought it didn’t matter in a recent Yudkonversation transcript, which basically convinced me. If you think it’s a big deal, I invite you to include it in the calculation.
Doesn’t the anthropic bias impact the calculation, where you take into account not seeing nuclear war before?