I suspect that you are attributing far too detailed a mental model to “the general public” here. Riffing off your xkcd:
I don’t think I’m doing so? I think it could be safely assumed that people have an idea of “software”, and that they know that AI is a type of software. Beyond that, I’m largely assuming they have no specific beliefs about how AI works: a blank map. That blank map, however, means that when they think of AI, they think of generic “software”, and so import their ideas about how software works into their picture of AI. Those ideas include “people wrote it”, which is what causes the misconception I suspect them to have.
What’s your view on this instead?
I think it could be safely assumed that people have an idea of “software”, and that they know that AI is a type of software.

I second faul_sname’s point. I have a relative whose business involves editing other people’s photos as a key step. It’s amazing how often she comes across customers who have no idea what a file is, let alone how to attach one to an email. These are sometimes people who have already sent her emails with files attached in the past. Then add all the people who can’t comprehend, after multiple rounds of explanation, that there is a difference between a photo file, like a jpeg, and a screenshot of their desktop or phone with the photo pulled up. Yet somehow they know how to take a screenshot and send it to her.
For so many people, their computer (or car, or microwave, etc.) really is just a magic black box anyway, and if it breaks you go to a more powerful wizard to re-enchant it. The idea that it has parts and you can understand how they work is just… not part of the mental process.
I think it could be safely assumed that people have an idea of “software”

Speaking as a software developer who interacts with end-users sometimes, I think you might be surprised at what the mental model of typical software users, rather than developers, looks like. When people who have programmed, or who work a lot with computers, think of “software”, we think of systems which do exactly what we tell them to do, whether or not that is what we meant. However, the world of modern software does its best to hide the sharp edges from users, and the culture of constant A/B testing means that, from the perspective of end-users, software doesn’t particularly behave the same way day-in and day-out. Additionally, UX people will spend a lot of effort figuring out how users intuitively expect a piece of software to work, and companies will then spend a bunch of designer and developer time to ensure that their software meets those intuitive expectations as closely as possible (except in cases where meeting them would result in reduced profits).
As such, from the perspective of a non-power user, software works about the way that a typical person would naively expect it to work, except that sometimes it mysteriously breaks for no reason.
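To make that concrete with a deliberately simple, hypothetical sketch (the function and the bug below are invented for illustration, not taken from any real product): the code does exactly what it says rather than what its author meant, and from the user’s side the feature just mysteriously never works.

```python
def apply_discount(order_total_cents: int) -> int:
    """Intended: take 10% off orders over $100.
    Actual: the discount is computed with integer division, so it is always zero."""
    if order_total_cents > 100 * 100:
        discount = order_total_cents * (10 // 100)  # bug: 10 // 100 == 0, so discount == 0
        return order_total_cents - discount
    return order_total_cents

print(apply_discount(25_000))  # prints 25000 -- the promised 10% off silently never appears
```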
And it applies to fields far less technical than AI research and geochemistry. I’ve been a consultant for years. My parents still regularly ask me what it is I actually do.