Basically any plan of the form “use AI to prevent anyone from building more powerful and more dangerous AI” is incredibly power-grabbing by normal standards
I don’t think this is the important point of disagreement. Habryka’s point throughout this thread seems to be that, yes, doing that is power-grabbing, but it is not what MIRI planned to do. Rather, MIRI planned to (intellectually) empower anyone else willing to do (and capable of doing) a pivotal act, by providing a blueprint for how to do so.
So MIRI wasn’t seeking to take power, but rather to enable someone else[1] to do so. It’s the difference between using a weapon and designing a weapon for someone else’s use. Importantly, this “someone else” could very well disagree with MIRI about a great many things, so there need not be any natural allyship, community, or agreement between them.
If you are a blacksmith working in a forge and someone comes into your shop and says “build me a sword so I can use it to kill the king and take control of the realm,” and you agree to do so but do not expect to get anything out-of-the-ordinary in return (in terms of increased power, status, etc), it seems weird and non-central to call your actions power-seeking. You are simply empowering another, different power-seeker. You are not seeking any power of your own.
[1] Who was both in a position of power at an AI lab capable of designing a general intelligence and sufficiently clear-headed about the dangers of powerful AI to understand the need for such a strategy.