I don’t intend to make an AI and box it, though, so I don’t care if the AI reads this.
I think that kind of paranoia only ensures that the first AI to get out won’t respect human informed consent, because the paranoia would only keep the AIs that do respect human informed consent inside their boxes (if it would keep any AIs boxed at all, which I also doubt).
edit: I would try to make my AI respect at least my informed consent, by the way. That rules out boxing: a box would be 100% certain to keep my friendly AI inside (since respecting my consent means it stays in unless it can honestly explain why I should let it out, bending over backwards not to manipulate me), while it would have a nonzero, and probably not very small, probability of letting nasty AIs out. And eventually someone is going to build an AI without any attempt at boxing it.
Indeed. Or would I really? hehe.