Not obviously true. An alternative which immediately comes to my mind is a globally enforced mutual agreement to refrain from building superintelligences.
(Yes, that alternative is unrealistic if making superintelligences turns out to be too easy. But I’d want to see that premise argued for, not passed over in silence.)