OK, makes sense. If we assume the AI is perfectly rational, it would probably give high priority to exterminating humanity from the Earth, precisely because there is a chance of humans building another AI.
However, to wipe out humanity from the Earth, the AI does not have to be very smart. One virus, well designed and well distributed, could do the job. An AI with some bugs could still be capable of making it… and then fail to properly arrange the space attack, or destroy itself through a faulty self-modification.