I could coordinate world superpowers if they wanted to coordinate and were willing to do so. It's not an intelligence problem, unless the solution is mind control, and then that's not a weak pivotal act; it's an AGI powerful enough to kill you if misaligned.
Mind control is too extreme; I think world superpowers could be coordinated with levels of persuasion greater than one Eliezer but short of mind control. E.g., people are already building narrow persuasion AI capable of generating arguments that are highly persuasive to specific people. A substantially superhuman but still narrow version of such an AI will very likely be built in the next 5 years, and could be used in a variety of weak pivotal acts (not even in a manipulative way! Even a public demonstration of such an AI would make a strong case for coordination, comparable to various weapons treaties).