Many people match “pivotal act” to “deploy AGI to take over the world”, and ignore the underlying problem of preventing others from deploying misaligned AGI.
I have talked to two high-profile alignment/alignment-adjacent people who actively dislike pivotal acts.
I think both have a distorted notion of what a pivotal act is: they focused on how dangerous it would be to let a powerful AI system loose on the world.
However, a pivotal act is about the outcome, not the method: any act that ensures misaligned AGI will not be built is a pivotal act. Many such acts might look like taking over the world, but that is not a core feature of a pivotal act. If I could prevent everyone from ever deploying misaligned AGI by eating 10 bananas in sixty seconds, that would count as a pivotal act!
Neither of the two researchers talked about how to prevent misaligned AGI from being built at all, so I worry that their solution proposals ignore this problem. It seems “pivotal act” has become a term with bad connotations: when hearing it, these people pattern-match to “deploy AGI to take over the world” and lose sight of the underlying problem of preventing others from deploying misaligned AGI.
I expect many more people fall into this trap. One of the two brought this up briefly in a talk, and the audience seemed to be on board with what was said. At least nobody objected, except me.
See also Raemon’s related post.