My read is that he was really trying as hard as he could to not address whether there are apocalyptic risks and instead just focus on encouraging the sorts of policies he thought should be implemented.
Why, though?
Does he know something we don’t? Does he think that if he says those risks are real he’ll lose political capital? That people won’t put him or his friends in positions of power because he’ll be branded as a kook?
Is he just in the habit of side-stepping the weird possibilities?
This looks to me, from the outside, like an unforced error. They were asking a question about some core beliefs pretty directly. It seems like it would help if, in every such instance, the EA people who think that the world might be destroyed by AGI in the next 20 years said that they think the world might be destroyed by AGI in the next 20 years.