I agree with the statement, broadly construed.

The key disanalogy between climate change and AI risk is the size of the evidence base for each.
For climate change, there were arguably trillions to quadrillions of data points of evidence, if not more, which is easily enough to move even very skeptical priors massively.

For AI, the evidence base is closer to maybe 100 data points at most, and arguably fewer. This is improving over time, but it is quite different from climate change, where you could call people deniers fairly matter-of-factly. With so little data, priors dominate: even moderately skeptical priors won't update much on the current evidence for AI doom, so AI-risk skeptics are much, much less irrational than climate deniers.
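To make the updating point concrete, here is a minimal sketch with a beta-binomial model. All the numbers (the prior's pseudo-observation count, the fraction of "confirming" observations) are illustrative assumptions of mine, not estimates from the comment; the point is only that a skeptical prior barely moves on ~100 observations but is swamped by climate-scale quantities of data.

```python
# Beta-binomial posterior mean: start from a Beta(a, b) prior,
# then observe k "confirming" data points out of n total.
# Posterior mean = (a + k) / (a + b + n).
def posterior_mean(a, b, k, n):
    return (a + k) / (a + b + n)

# A skeptical prior: roughly 1000 pseudo-observations against the
# hypothesis (illustrative numbers, prior mean = 0.001).
a, b = 1.0, 999.0

# ~100 data points, 90% confirming (the AI-evidence scale above):
small = posterior_mean(a, b, 90, 100)

# ~1 trillion data points, 90% confirming (climate-scale evidence):
large = posterior_mean(a, b, 0.9e12, 1e12)

print(f"skeptic after 100 points:  {small:.3f}")  # barely moved off the prior
print(f"skeptic after 1e12 points: {large:.3f}")  # prior is swamped by data
```

The prior's weight (a + b ≈ 1000 pseudo-observations) dwarfs 100 real observations, so the posterior stays near the prior; against 10^12 observations the same prior is negligible.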
If the statement is all that's being asked for, that's enough. The worry is when people apply climate analogies to AI without recognizing these differences, which are enough to alter or invalidate the conclusions being argued for.