I see. I agree that it might be a failure of imagination, but if it is, why do you consider that so much more likely than the alternative, “it is not that easy to do something like that, even if you are very clever”? The problem I have is that all the doom scenarios I see discussed are so utterly unrealistic (e.g. the AGI suddenly makes nanobots and delivers them to all humans at once, and so on) that it makes me think we are failing to conceive of plans that could succeed because it might actually be harder than we think.