The UFAI can’t escape from the box and make good on its threat unless the threatened person gives in.
How sure are you someone else won’t walk by whose mind it can hack?
Yes—the threat is only credible in proportion to the AI’s chance of escaping and taking over the world without my help.
If I have reason to believe that probability is high, then negotiating with the AI could make sense.
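
One very rough way to cash out "credible in proportion to the AI’s chance of escaping" is an expected-value comparison: giving in is only worth considering if its cost is smaller than the expected harm of the AI getting out on its own. The sketch below is just an illustration of that point under stated assumptions; the function name, parameters, and numbers are hypothetical and not part of the original exchange.

```python
# Hypothetical sketch of the expected-value comparison implied above.
# All parameter names and numbers are illustrative, not from the source.

def should_negotiate(p_escape_alone: float,
                     cost_of_giving_in: float,
                     harm_if_threat_carried_out: float) -> bool:
    """Giving in is only worth considering if the expected harm from the
    AI escaping without my help exceeds the cost of cooperating."""
    expected_harm = p_escape_alone * harm_if_threat_carried_out
    return cost_of_giving_in < expected_harm

# If the boxed AI's chance of escaping on its own is tiny, the threat
# carries almost no expected weight and negotiating does not pay.
print(should_negotiate(p_escape_alone=0.001,
                       cost_of_giving_in=10.0,
                       harm_if_threat_carried_out=100.0))   # False

# If that chance is high, the same threat starts to dominate the calculus.
print(should_negotiate(p_escape_alone=0.5,
                       cost_of_giving_in=10.0,
                       harm_if_threat_carried_out=100.0))   # True
```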