In what sense do you think of an autonomous laborer as being under ‘our control’? How would you tell if it escaped our control?
How would you tell? By its behavior: doing something you neither ordered nor wanted.
Think of the present-day “autonomous laborer” with an IQ of about 90. The only likely way to lose control of him is for some agitator to instill contrary ideas. Censorship for robots is not so horrible a regime.
Who is it that really wants AGI, absent proof that we need it to automate commodity production?
In my experience, computer systems currently get out of my control by doing exactly what I ordered them to do, which is frequently different from what I wanted them to do.
Whether or not a system is “just following orders” doesn’t seem to be a good metric for it being under your control.
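A mundane toy illustration of this gap (nothing here is specific to any real system):

```python
# Ordered: "sort the scores." Wanted: numeric order.
scores = ["10", "9", "100"]

# The system does exactly what it was ordered to do: sort the list.
# Strings sort lexicographically, so the result is not what was wanted.
print(sorted(scores))                  # ['10', '100', '9'] -- as ordered
print(sorted(scores, key=int))         # ['9', '10', '100'] -- as wanted
```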
How does “just following orders,” a la Nuremberg, bear upon this issue? It’s out of control when its behavior is neither ordered nor wanted.
While I agree that it is out of control if the behavior is neither ordered nor wanted, I think it’s also very possible for the system to get out of control while doing exactly what you ordered it to do, but not what you meant for it to do.
The argument I’m making is approximately the same as the one we see in the outcome pump example.
That is to say: while a system that is doing something neither ordered nor wanted is definitely out of control, it does not follow that a system doing exactly what it was ordered to do is necessarily under your control.
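To make the gap concrete, here is a toy sketch in the spirit of the outcome pump thought experiment; the actions and numbers are invented for illustration. An optimizer given the literal order satisfies it perfectly and still picks an outcome nobody wanted:

```python
# Hypothetical action -> (grandma's final distance from the burning
# building in meters, grandma unharmed?)
actions = {
    "carry her down the stairs": (50, True),
    "wait for the fire department": (30, True),
    "gas main explodes, hurling her outward": (300, False),
}

# The order as literally given: "get her as far from the building as possible."
literal_best = max(actions, key=lambda a: actions[a][0])

# The order as intended: "...as far as possible, and safe."
intended_best = max(
    (a for a in actions if actions[a][1]),
    key=lambda a: actions[a][0],
)

print(literal_best)   # "gas main explodes, hurling her outward"
print(intended_best)  # "carry her down the stairs"
```

The optimizer that returns `literal_best` is following its orders exactly, yet no one would call it under control.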
Ideological singularitarians.