Okay, that clarifies a lot. But the last paragraph I find surprising.
re: (2), I just don’t see LLMs as providing much evidence yet about whether the concepts they’re picking up are compact or correct (cf. monkeys don’t have an IGF concept).
If LLMs are good at understanding the meaning of human text, they must be good at understanding human concepts, since concepts are just the meanings of the words the LLM understands. Do you doubt they are really understanding text as well as it seems? Or do you mean they are picking up other, non-human concepts as well, and that this is a problem?
Regarding monkeys, they apparently don't understand the IGF concept because they are not good enough at reasoning abstractly about evolution and unobservable entities (genes), and because they lack the empirical knowledge that humans themselves lacked until recently. I'm not sure how that would be an argument against advanced LLMs grasping the concepts they seem to grasp.