It’s generally hard to find one-size-fits-all responses for things like this. Instead I would first want to know: WHY does he think it’s astronomically unlikely to be <80 years away?
Yes, I think this is the most important question. It’s one thing to not be aware of progress in AI and so not have an idea that it might be soon. General resources are fine for updating this sort of mindset.
It’s another thing to be aware of current progress but think it might, or probably will, take longer. That’s fine; I think it might take longer too, and I can certainly understand having reasons to believe it’s less likely than not to happen this century, even if I don’t hold them myself.
It’s very different if someone is aware of current developments but has extremely strong views against AGI happening any time this century. Do they actually mean the same thing by “AGI” as most other people do? Do they think it is possible at all?