Ok, perhaps I was too combative with the wording. My general point is: Don’t think of humanity as a coordinated agent, don’t think of “AGI” as a single tribe with shared properties (I frequently see this same mistake with regard to aliens), and in particular, don’t conclude that because one specific AI won’t be able to, or won’t want to, destroy the world, the world is therefore safe in general.