I guess “want” in “AI that would want to actually maximize the number of gliders” refers to having a tendency to produce a lot of gliders. If you have an opaque AI with an obfuscated and somewhat faulty “jumble of wires” design, you might be unable to locate its world model in any obvious way, but you might still be able to characterize its behavior. The point of the example is to challenge the reader to imagine a design for an AI that reliably produces gliders across many environments, but isn’t specified in terms of a world-model module plus a glider-counting objective defined over that model.
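For concreteness, here is a minimal sketch (in Python, with made-up names like `opaque_policy` and `glider_tendency`, and arbitrary board sizes, step counts, and detection heuristics that are my assumptions, not anything from the original example) of what “characterizing the behavior” could look like: treat the AI purely as a black box that emits an initial Game of Life board, run the board forward, and score it by how many gliders show up, without ever looking for a world model inside.

```python
import numpy as np

# The four phases of a glider in one orientation, as 3x3 bounding boxes.
GLIDER_PHASES = [
    np.array([[0, 1, 0], [0, 0, 1], [1, 1, 1]], dtype=np.uint8),
    np.array([[1, 0, 1], [0, 1, 1], [0, 1, 0]], dtype=np.uint8),
    np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1]], dtype=np.uint8),
    np.array([[1, 0, 0], [0, 1, 1], [1, 1, 0]], dtype=np.uint8),
]

def all_orientations(pattern):
    """All rotations and reflections of a 3x3 pattern (duplicates are harmless)."""
    out = []
    for p in (pattern, np.fliplr(pattern)):
        for k in range(4):
            out.append(np.rot90(p, k))
    return out

GLIDER_TEMPLATES = [o for ph in GLIDER_PHASES for o in all_orientations(ph)]

def life_step(board):
    """One Game of Life update on a toroidal board."""
    n = sum(np.roll(np.roll(board, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    return ((n == 3) | ((board == 1) & (n == 2))).astype(np.uint8)

def count_gliders(board):
    """Crude detector: count 3x3 windows that match a glider phase and are
    isolated (surrounded by a dead one-cell border)."""
    count = 0
    h, w = board.shape
    for y in range(1, h - 4):
        for x in range(1, w - 4):
            window = board[y - 1:y + 4, x - 1:x + 4]
            if window.sum() != 5:        # a glider has exactly 5 live cells
                continue
            core = window[1:4, 1:4]
            if any(np.array_equal(core, t) for t in GLIDER_TEMPLATES):
                count += 1
    return count

def opaque_policy(rng, shape):
    """Stand-in for the 'jumble of wires' AI: all we see is the board it emits."""
    return (rng.random(shape) < 0.1).astype(np.uint8)

def glider_tendency(policy, n_envs=20, shape=(64, 64), steps=200, seed=0):
    """Average glider count after `steps` generations, over random environments."""
    rng = np.random.default_rng(seed)
    totals = []
    for _ in range(n_envs):
        board = policy(rng, shape)
        for _ in range(steps):
            board = life_step(board)
        totals.append(count_gliders(board))
    return float(np.mean(totals))

if __name__ == "__main__":
    print("mean gliders produced:", glider_tendency(opaque_policy))
```

Nothing in this harness refers to the AI’s internals; the “tendency to produce gliders” is just its average score across randomly varied environments, which is exactly the kind of purely behavioral characterization I mean.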