I’d say building an AGI that self-destructs would be pretty good. As long as a minimum breeding population of humans still exists, and assuming survival hasn’t been made totally impossible (i.e. the AI hasn’t already deconstructed the earth, or completely poisoned the water and atmosphere), humans could recover. Making an AGI that doesn’t die would probably not be in our best interests until almost exactly the end.
Yeah, I was assuming the case where humans are completely extinct, surviving only in the memory banks of the AI.