If we believe that morally valuable alien life could plausibly arise in our future light cone, then an expansionist AI with no moral value is much worse than blowing ourselves up with nukes: nuclear extinction ends only us, leaving the rest of the light cone open for alien life to flourish, whereas a value-less expansionist AI would colonize that light cone and preclude such life from ever arising.