I don’t think it makes sense to say that the symbol grounding problem has gone away, but I think it does make sense to say that we were wrong about which problems couldn’t be solved without first solving symbol grounding. I also don’t think we’re really that confused about how symbols are grounded (1, 2, 3), although we don’t yet have a clear demonstration of a system that has grounded its symbols in reality. GPTs do seem to be grounded by proxy through their training data, but that yields only a limited amount of grounded reasoning today, as you note.
Thanks. My title was a bit tongue-in-cheek (‘Betteridge’s law’), so yes, I agree. I have decided to reply to your post before I have read your references, as they may take a while to digest, but I plan to do so. I also see you have written about P-Zombies, which I was going to write something on. As a relative newcomer, it’s always a balance between just saying something and attempting to read and digest everything about it on LW first.