A comment above had an interesting idea: put it in Conway's Game of Life, a simple universe that reveals nothing about what the real world is like. Even knowing it's in a box, the AI would have no information to go on to escape.
What use is such an AI? You couldn't even use the behavior of its utility function to predict a real-world agent, because it would have such a different ontology. Not to mention that GoL boards of the complexity needed for anything interesting would be computationally intractable.
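For concreteness, here's roughly what such a toy universe amounts to computationally: a minimal sketch of a single Game of Life step under the standard B3/S23 rules. (The set-of-live-cells representation is just one convenient choice, not anything from the comment above.) Even a five-cell glider takes explicit neighbor bookkeeping every generation, which hints at why boards rich enough to host an agent blow up.

```python
from collections import Counter
from typing import Set, Tuple

Cell = Tuple[int, int]

def step(live: Set[Cell]) -> Set[Cell]:
    """Advance one generation under the standard B3/S23 rules."""
    # Count how many live neighbours each candidate cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next step if it has exactly 3 live neighbours,
    # or exactly 2 and was already live. Everything else dies.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A glider: a five-cell pattern that translates across the grid forever.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):  # after 4 steps the glider has moved one cell diagonally
    glider = step(glider)
print(sorted(glider))
```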