I agree. What tailcalled is proposing isn’t strictly the AI box scenario, although I guess you might call it a simulation of the AI box scenario.
The reason it’s not the AI box scenario is that it’s hard to have an entity that is simultaneously intelligent, useful, and boxed. It’s not that boxing an AI is hard. Physically boxing an AI is the easy part. The hard part—which the AI box experiment is about—is making sure that we can get information of real-world use out of the AI without the AI taking over the Universe.