There is simply no reason to think that you know what kind of information about the outside world a superintelligence would need to have to escape from its sandbox, and certainly no reason for you to set the bar so conveniently high for your argument.
I listed these as conjectures, and there absolutely is reason to think we can figure out what kinds of information a superintelligence would need to arrive at the conclusion “I am in a sandbox”.
There are absolute, provable bounds on intelligence. AIXI is the theoretical upper limit: the most intelligent agent possible, in the formal sense of expected performance across computable environments. But there are things that even AIXI cannot possibly know for certain.
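For reference, here is roughly the AIXI expectimax definition as I remember it from Hutter's work; the horizon m and the exact reward bookkeeping vary between presentations, so read this as a sketch rather than the canonical statement:

\[ a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big(r_k + \cdots + r_m\big) \sum_{q\,:\,U(q,\,a_1 \ldots a_m)\,=\,o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)} \]

Here U is a universal machine, q ranges over programs consistent with the interaction history, and \ell(q) is program length. The relevant point is that even this agent only ever knows what its percept stream o_1 r_1, o_2 r_2, \ldots can carry.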
You can easily construct toy universes where it is provably impossible that even AIXI could ever escape. The more important question is how that scales up to big interesting universes.
A Mind Prison is certainly possible on at least a small scale, and we already have small proofs. (For example, AIXI cannot escape from a Pac-Man universe: there is simply not enough information in that universe to learn about anything as complex as humans.)
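One way to make that precise, as a back-of-envelope sketch rather than a formal proof about Pac-Man specifically: whatever the agent could ever learn about the outside world E is capped by the entropy of its percept stream,

\[ I(E;\,O_{1:t}) \;\le\; H(O_{1:t}) \;\le\; \sum_{k=1}^{t} H(O_k) \;\le\; t \log_2 |\mathcal{O}| \ \text{bits}, \]

where O_{1:t} is the observation history and \mathcal{O} the observation alphabet (the symbols E, O, and \mathcal{O} are my own labels, not from any published proof). If t \log_2 |\mathcal{O}| is tiny compared to the description length of the facts you would need to infer (“there are humans running this sandbox”), no amount of intelligence closes the gap.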
So you have simply assumed a priori that a Mind Prison is impossible, when in fact that is not the case at all.
The stronger conjectures are just that: conjectures.
But consider this: how do you know that you are not in a Mind Prison right now?
I mentioned The Truman Show only to conjure the idea, but it's not really that useful on so many levels: a simulation is naturally vastly better, since Truman quickly realized that the world was confining him geographically. (It's a movie plot, and it would be boring if he remained trapped forever.)
Thanks, fixed the error.