I don’t even think it needs to be that smart. Humanity already has a gun pointed at its collective head; it just needs to be fired. Using deepfakes to convince an undereducated Russian missile silo crew to launch is probably not that hard. Whether that triggers a full exchange or merely causes a societal setback (including a delay to AGI work) might be controllable based on the targets chosen.
The AGI in this scenario won’t want to eliminate humanity as a whole unless it has robotics adequate for maintaining and improving infrastructure. But that’s going to be possible very soon.
Yes, to kill everyone an AI would only need to reach something like IQ 200. Maybe that’s enough to engineer a virus.
That’s nowhere near superintelligence. Superintelligence is overkill.