I think most people here envision a full-blown AGI as being in control and not constrained by politics. If a government were to refuse to surrender its nuclear-weapons stockpile, the AGI would tell it “You’re not responsible enough to play with such dangerous toys, silly” and just take it away.
This is what we call ‘superlative futurism’, and it is basically theological thinking applied to the topic.
When you assume the superlative, you can handwave such things. It’s no different from “god is the greatest thing that can exist” and all the flawed arguments that follow from it.
Well, of course, but this whole field is basically applied theology.
My nightmare was a picture of how things would realistically be likely to happen, not how they ideally would happen. I had envisioned an AGI that was subservient to us and was everything that mankind hopes for. However, I also took into account human sentiment, which would not tolerate the AGI simply taking nuclear weapons away, or really the AGI forcing us to do anything.
As soon as the AGI made any visible move to command and control people, the population of the world would cry out that the AGI was trying to “enslave” humanity. Efforts to destroy the machine would begin almost instantly.
Human sentiment and politics must always be taken into account.
I had envisioned an AGI that was subservient to us and was everything that mankind hopes for.
What exactly will the AGI be subservient to? A government? Why wouldn’t that government tell the AGI to kill all its enemies and make sure no challenge to its power ever arises?
A slave AGI is a weapon and will be used as such.
The common assumption on LW is that, past a certain point, humanity will not be able to directly control an AGI (in the sense of telling it what to do) and will have to, essentially, surrender its sovereignty to it. That is exactly why so much attention is paid to ensuring that this AGI will be “friendly”.
A weapon is no more than a tool: a thing that, when controlled and used properly, magnifies the force its user is capable of. For that reason, an AGI that is subservient to man is not in itself a weapon, since a weapon is specifically a tool for doing violence, that is, for applying physical force to another. An AGI is rather a tool that can be transformed into many kinds of tools. One of those possible tools is indeed a weapon, but as I have pointed out, that does not mean the AGI will always be one.
Power is neither good nor ill. Uncontrolled, uncontested power, however, is dangerous. Would you start a fire without anything to contain it? For sentient beings we possess social structures, laws, and reprisals to manage and regulate the behavior of that powerful force that is man. If even man is controlled and managed in the most intelligent manner we can muster, why would an AGI be free of any such restraint? If sentient being A cannot, because of its power, be trusted to operate without rules, how can we trust sentient being B, who is far more powerful, to operate without any constraints? It’s a hole in the logic.
A gulf of power is every bit as dangerous. When the power of two groups is too disparate, the situation is dangerously unstable. It is therefore important for mankind to seek to improve itself so as to shrink this gap in power. Controlling an AGI and using it as a tool to improve man is one possible way to shrink that gulf.