I think you’re onto something with the “good lies” vs. “bad lies” part, but I’m not so sure about your assertion that ChatGPT only looks at how closely the surface-level words in the prompt match the subject of interest.
“LLMs are just token prediction engines” is a common but overly reductionist viewpoint. They routinely operate on levels above basic token matching, and I don’t see much evidence that token matching is what’s causing the issue here.
FWIW, I find Jones’s remark plausible, but I agree with you about the reductionist nature of the prediction engine assertion. It’s gotten far too much play in discussions, especially the popular press.
More like people underestimate just how powerful the reductionist view really is. Yes, in the end it all does reduce to basic elements like matrix multiplication, word prediction, or logic gates/boolean circuits; it’s just that people don’t grasp how far the reductionist paradigm can take you.
To the extent that the reductionist view leads you to ignore or discount the structure that’s been built up in the model, it takes you too far.