That actually makes a lot of sense to me. Suppose the context window is its equivalent of episodic / conscious memory; then it wouldn't "remember" any of its training. It would appear to have skills without any memory of acquiring them, a bit like how you don't remember learning to talk.
That is what I'd expect a self-aware LLM to perceive. But of course it might just be something it's inferred from the training data.