I retried these questions three times, with a branch-off the first time where I instead asked why it said it was learning from this conversation. Similar answers but the probabilities changed a lot. https://poe.com/s/8JvPz67NruWudTIDfjEq
That actually makes a lot of sense to me—suppose that its equivalent of episodic / conscious memory is whatever is in the context window—then it wouldn't "remember" any of its training. These would appear to be skills that exist without any memory of acquiring them. A bit like how you don't remember learning how to talk.
It is what I'd expect a self-aware LLM to perceive. But of course that might just be what it's inferred from the training data.
I love how it admits it has no idea how it gets better if it retains no memories.