If you instead mean something like “an inner homunculus reasoning about what to simulate”, then I totally agree that LLMs very likely don’t have this
Yeah, I meant something like this. The reversal curse is evidence because if most output were controlled by “inner beings”, presumably they’d be smart enough to “remember” the reversal.
That’s a very strange conclusion. I certainly find it easier to recall “word A in a foreign language means X” than the reversal. If a homunculus simulated me (or the vast majority of humans), it would produce plenty of instances of the reversal curse.
A distant philosophical analogy: my brain is smart enough to control my body, but I definitely can’t use its knowledge to build humanoid robots from scratch.
I’m not a simulator enthusiast, but I find your reasoning kinda sloppy.