Er… How do you know? I thought he hadn’t disclosed anything about how he did that.
He mentioned on some mailing list that he had to think like an AI desperately trying to get out. It makes a world of difference in how you approach the situation if it is your life that is actually on the line.
The role of “the Genie” here and “the AI” in the Boxed AI game have certain obvious similarities.
It seems reasonable to assume that a willingness to adopt the former correlates with a willingness to adopt the latter.
It seems reasonable to assume that a willingness to adopt the role of “the AI” in the Boxed AI game is necessary (though not sufficient) in order to win that game.
So shminux’s claim seems fairly uncontroversial to me.
Do you dispute it, or are you merely making a claim about the impossibility of knowledge?
The latter.