Over what timeframe? 2-20% seems a reasonable range to me, and I would not call it “very low”. I’m not sure there is a true consensus, even among frequent LW posters, but maybe I’m wrong and it is very low in some circles; it isn’t in the group I follow most closely. Either way, it seems plenty high to motivate the behaviors or actions you see as influencing it.
My 90/10 confidence interval for when AGI gets built is 3-15 years, and most of my probability mass for P(doom) is on the shorter end of that. If we have the current near-human-ish level of AI around for another decade, I assume we’ll figure out how to control it.
> 2-20% seems a reasonable range to me, and I would not call it “very low”.
Agreed. Let’s not lose sight of the fact that 2-20% means it’s still the most important thing in the world, in my view.
> My 90/10 confidence interval for when AGI gets built is 3-15 years, and most of my probability mass for P(doom) is on the shorter end of that.
My P(doom | AGI after 2040) is <1%.
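For what it’s worth, here is a minimal sketch of the decomposition implicit in this exchange: P(doom) = Σ over arrival windows w of P(AGI first built in w) · P(doom | AGI in w). Every window boundary and conditional probability below is an illustrative assumption chosen to echo the numbers in the thread (a 90/10 interval of 3-15 years, <1% doom on post-2040 timelines), not anyone’s actual stated estimate.

```python
# P(doom) decomposed over AGI arrival windows:
#   P(doom) = sum_w P(AGI in window w) * P(doom | AGI in window w)
# All numbers are illustrative assumptions, not stated estimates.

timeline = {             # P(AGI first built in window); sums to 1
    "<3 years":   0.10,
    "3-15 years": 0.80,  # the 90/10 interval above
    ">15 years":  0.10,
}

doom_given_agi = {       # P(doom | AGI in window); falls as timelines lengthen
    "<3 years":   0.30,
    "3-15 years": 0.10,
    ">15 years":  0.01,  # cf. "<1% if AGI comes after 2040"
}

p_doom = sum(timeline[w] * doom_given_agi[w] for w in timeline)
print(f"P(doom) = {p_doom:.3f}")  # 0.10*0.30 + 0.80*0.10 + 0.10*0.01 = 0.111
```

With these assumed numbers the overall P(doom) comes out at about 11%, inside the 2-20% range discussed above; shifting probability mass toward shorter timelines pushes it up, and toward longer timelines pushes it down.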