Indeed, a little too similar to Dennett’s intentional stance. If people don’t really have goals, but it is merely convenient to pretend they do, then the idea that people really have beliefs would seem to be in equal jeopardy. And then truth-seeking is in double jeopardy. But the trouble is, all along I’ve been trying to seek the truth about this blue-minimizing robot and related puzzles. I’ve been treating myself as an intentional system, something with both beliefs and goals, including goals about beliefs. And what I’ve just been told, it seems, is that my goals (or “goals”) will not be satisfied by this approach. OK then, I’ll turn elsewhere.
If there is some definition or criterion of “having goals” that human beings don’t meet—the von Neumann-Morgenstern utility theory, for example—it’s easy enough to discard that definition or criterion.
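Since the argument leans on it, it may help to spell out the criterion being gestured at. The following is a standard textbook statement of the von Neumann-Morgenstern axioms, not something drawn from the text above: an agent "has goals" in this sense just when its preferences over lotteries satisfy all four axioms, in which case they are representable by an expected-utility function.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch of the von Neumann-Morgenstern axioms over lotteries L, M, N,
% stated only to make the "criterion of having goals" concrete.
\begin{align*}
&\text{Completeness:} && L \succeq M \ \text{or} \ M \succeq L\\
&\text{Transitivity:} && L \succeq M \ \text{and} \ M \succeq N
  \ \Rightarrow\ L \succeq N\\
&\text{Continuity:}   && L \succeq M \succeq N
  \ \Rightarrow\ \exists\, p \in [0,1] \ \text{s.t.}\ pL + (1-p)N \sim M\\
&\text{Independence:} && L \succeq M
  \ \Rightarrow\ pL + (1-p)N \succeq pM + (1-p)N \quad \text{for all } p \in (0,1]
\end{align*}
% Representation theorem: preferences satisfy these four axioms if and
% only if there is a utility function u such that L is preferred to M
% exactly when the expected utility of L exceeds that of M.
\end{document}
```

The usual empirical complaint is that human choices violate Independence (the Allais paradox is the classic demonstration), which is why one might say humans fail this particular criterion of having goals.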