A less (comparatively) intelligent AGI is probably safer: it has a stronger incentive to coordinate with humans rather than kill us all immediately and start from scratch, which gives us more time to blackmail it.