Related, I have an old post named “Intelligence Explosion vs. Co-operative Explosion”, though it’s more about the argument that AGIs might overpower humanity with a superhuman ability to cooperate even if they can’t become superhumanly intelligent.