Do you really think it's unreasonable to assume that LLMs don't implement any analogues of that kind of thing, though?
Maybe the broader point is that an embodied organism is doing many things, and using language is only occasionally one of them. It seems safe to assume that an LLM specialized for one thing would not spontaneously implement analogues of all the other things embodied organisms do.
Or do you think that is wrong? Do you, e.g., think that an LLM would have to develop simulators for things like the gut in order to do its job better? Is that what you are implying? Or am I totally misunderstanding you?
Not exactly. I suppose you could do so.