I haven’t played this one but would give myself a decent chance of winning, against a Gatekeeper who thinks they could keep a superhuman AI inside a box, if anyone offered me sufficiently huge stakes to make me play the game ever again.
Does this refer to the more difficult version of the AI-Box experiment, and what would count as sufficiently huge stakes? (An order-of-magnitude ballpark estimate is fine; no exact figure needed.)