Enough skill in oratory (or something closely related) gets you unboxed. The question is how plausible it is that a superintelligent AI would have enough. (A related question is whether there’s such a thing as enough. There might not be, just as there’s no such thing as enough kinetic energy to let you escape from inside a black hole’s horizon, but the reported results of AI-Box games[1] suggest—though they certainly don’t prove—that there is.)
[1] The term “experiments” seems a little too highfalutin’.
[EDITED to add: I take it Houshalter is saying that Hitler’s known oratorical skills aren’t enough to convince him that H. would have won an AI-Box game, playing as the AI. I am inclined to agree. Hitler was very good at stirring up a crowd, but it’s not clear how that generalizes to persuading an intelligent and skeptical individual.]