You don’t need to be very persuasive to get people to take action in the real world.
Especially right now, a lot of people work from home, take their orders from a computer, and trust it to give them good orders.
Although this is probably true in general, it degrades when trying to get people to do something extremely high-cost, like destroying all of humanity. You either need to be very persuasive or trick them about the cost. It’s hard to get people to join ISIS knowing they’re joining ISIS. It’s a lot easier to get them to click on ransomware that can be used to fund ISIS.
You don’t need to tell people “destroy all of humanity” to establish a dictatorship where the AGI is in control of everything and it becomes effectively impossible for individual humans to challenge AGI power.
Helping someone establish a dictatorship is still a high-cost action, and I think it requires more persuasiveness than convincing someone to do their job when you haven’t decisively proven you’re actually their boss.
I think the idea is that the AI doesn’t say “help me establish a dictatorship”. The AI says “I did this one weird trick and made a million dollars, you should try it too!” but surprise, the weird trick is step 1 of 100 to establish The AI World Order.
Or it says: “Policing is heavily biased against Black people. There should be an impartial AI judge, so that court judgments against Black people aren’t biased.”
Or it says: “There’s child porn on the computer of person X” [person X being someone who challenges the AI’s power; the AI planted the files there].
Or it says: “We’ll pay you a good $1,000,000 salary to vote on the board the way we want, to convert the top levels of the company hierarchy into being AGI-directed.”
And it does hundreds of thousands of those things in parallel.
There’s no reason the AGI can’t decisively prove it is the boss. For big corporations, controlling the stock means being the boss who makes the decisions at the top.
A police bureau that switches to software that tells officers where to patrol to catch more crime doesn’t think it’s establishing a dictatorship either.
The idea that an AGI wants to establish a dictatorship can easily be labeled as an irrational conspiracy theory.