I, on the other hand, have very little confidence that people trying to build AGI will fail to find ways to do it quickly (within the next 3 years, i.e., by 2027). I do have confidence that we can politically coordinate to stop the situation from becoming an extinction or near-extinction-level catastrophe. So I place much less emphasis on abstaining from publishing ideas that may help both alignment and capabilities, and more emphasis on figuring out ways to generate empirical evidence of the danger before it is too late, so as to facilitate political coordination.
I think that a situation in which humanity fails to politically coordinate to avoid building catastrophically dangerous AI is one that leads into conflict, likely a World War III with widespread use of nuclear weapons. I don’t expect humanity to go extinct from this, and I don’t expect the rogue AGI to emerge as the victor, but I do think any such conflict would likely wipe out the majority of humanity. That’s a pretty grim risk to be facing on the horizon, and it is in everyone’s interests to work hard to avoid such a devastating conflict.