I think it depends on some factors actually.
For instance, if we don’t get AGI by 2030 but lots of people still believe it could happen by 2040, we as a species might be better equipped to form good beliefs about it, figure out who to defer to, etc.
I already think this has happened, btw. AI beliefs in 2024 are saner on average than beliefs in, say, 2010 IMO.
P.S. I’m not talking about what you personally should do with your time and energy; maybe there are other projects that appeal to you more. But I think it is worthwhile for someone to be doing the thing I’m asking for. It won’t take much effort.