AI projects that announce plans to have their AI take over the world could provoke serious and harmful conflict
So, is this better or worse than the eternal struggle you propose? Superintelligent agents nuking it out on the planet in a struggle for the future may not be fun, and yet your proposals seem to promote and prolong that stage rather than getting it over with as quickly as possible. Your proposal comes off looking much worse in some respects, e.g. if you compare total numbers of casualties. Are you sure it is any better? If so, what makes you think that?