This fits with my experience talking to people unfamiliar with the field. Many do seem to think it’s closer to GOFAI, explicitly programmed, maybe with a big database of stuff scraped from the internet that gets mixed-and-matched depending on the situation.
Examples include:
Discussions around the effect of AI on the art world often seem to imply that these AIs take images directly from the internet and somehow “merge” them together, using a clever (and completely unspecified) algorithm. Sometimes it’s implied or even outright stated that this is just a new way to get around copyright.
When talking about ChatGPT with friends who have some degree of coding / engineering knowledge, I frequently hear things like “it’s not really writing anything, it’s just copied from a database / the internet”.
I’ve also read many news articles and comments which refer to AIs being “programmed”, e.g. “ChatGPT is programmed to avoid violence”, “programmed to understand human language”, etc.
I think most people with more than a very passing interest in the topic have a better understanding than that, though. And I suspect that many completely non-technical people have such a vague understanding of what “programmed” means that it could apply equally to training an LLM or explicitly coding an algorithm. But I do think this is a real misunderstanding that is reasonably widespread.