What if we survive without building a global utility maximizer god?
I think it’s exceedingly unlikely (<1%) that we robustly prevent anyone from {making an AI that kills everyone} without an aligned sovereign.