I think you’re making a good point (rationalists maybe don’t weight other opinions highly enough), but you’d get farther framing it as an update to how to use Bayesian reasoning, rather than an alternative. Bayesian reasoning has a pretty strong intuitive connection to “the factually correct way to reason”, even though there’s a ton of subtlety in that statement and how and where it’s applied.
WRT many of your arguments: base rates are increasingly just the wrong way to reason about AGI risks. We can think in more detail about how we’ll build AGI and what the risks are.