For me, it’s because there are disjunctively many ways that AGI might not happen (global totalitarian regime, AI winter, 55% CFR avian flu escapes a BSL-4 lab, unexpected difficulty building AGI & the planning fallacy on timelines which we totally won’t fall victim to this time...), or that alignment could be solved, or that I could be mistaken about AGI risk being a big deal, or…
Granted, I assign small probabilities to several of these events. But my credence for P(AGI extinction | no more AI alignment work from the community) is 70%, much higher than my 40% unconditional credence. I guess that means yes, I think AGI risk is huge (remember that I’m saying “40% chance we just die to AGI, unconditionally”), and that’s after incorporating the significant contributions that I expect the current community to make. The current community is far from sufficient, but it’s also probably picking a good amount of low-hanging fruit, so I expect its presence to make a significant difference.
EDIT: I’m decreasing the 70% to 60% to better match my 40% unconditional, because the conditional only assumes that the current alignment community stops working on alignment.
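To make the consistency check behind that adjustment explicit (a rough sketch; treating “the current community keeps working” as near-certain in the actual world is an assumption I haven’t argued for here), by the law of total probability:

$$P(\text{extinction}) = P(\text{ext} \mid \text{work}) \, P(\text{work}) + P(\text{ext} \mid \text{no work}) \, P(\text{no work})$$

With $P(\text{work}) \approx 1$, my unconditional 40% is roughly $P(\text{ext} \mid \text{work})$, so a 60% conditional credits the current community’s continued work with about a 20-percentage-point reduction in extinction probability; the original 70% would have implied 30 points.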