I think this gets deflationary if you think about it, though. Yes, you can apply the intentional stance to the thermostat, but (almost) nobody’s going to get confused and start thinking the thermostat has fancier abilities like long-term planning just because you say “it wants to keep the room at the right temperature.” Even though you’re using the single word “want” for both humans and thermostats, you don’t get them mixed up, because your actual representation of what’s going on still distinguishes them based on context. There’s not just one intentional stance; there’s a stance for thermostats and another for humans, and they make different predictions about behavior, even if they’re similar enough that you can call them both intentional stances.
If you buy this, then applying an intentional stance to LLMs suddenly buys you a lot less predictive power, because even intentional stances come with a ton of little variables in their mental models, which we will naturally fill in as we learn a stance that works well for LLMs.