You seem to be using the words “goal-directed” differently than the OP.
And in different ways throughout your comment.
Is a bottle cap goal directed? Sure, it was created to keep stuff in, and it keeps doing a fine job of that.
It is achieving a purpose. (Bringing about a state of the world.)
Conversely, am I goal directed? Maybe not: I just keep doing stuff and it’s only after the fact that I can construct a story that says I was aiming to some goal.
You seem to have a higher standard for people. I imagine you exhibit goal-directed behavior with the aim of maintaining certain equilibria/homeostasis—eating, sleeping, as well as more complicated behaviors that enable those. This is more than a bottle cap does, and a more difficult job than a thermostat performs.
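To make the comparison concrete, here is a minimal sketch (entirely hypothetical, not from the original comment) of the thermostat end of the spectrum: a single feedback loop that maintains one equilibrium by reacting to one measurement. Homeostasis in a person would involve many such loops plus planning on top of them.

```python
def thermostat_step(temperature, setpoint=20.0, band=1.0):
    """One tick of a thermostat-style feedback loop.

    It 'maintains an equilibrium' only in the thinnest sense:
    compare a reading to a setpoint and react. No model, no planning.
    """
    if temperature < setpoint - band:
        return "heat_on"
    if temperature > setpoint + band:
        return "heat_off"
    return "no_change"
```

Anything we would comfortably call goal-directed in a person does strictly more than this, which is the sense in which the thermostat sits between the bottle cap and the human.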
Is a paperclip maximizer goal directed? Maybe not: it just makes paperclips because it’s programmed to and has no idea that that’s what it’s doing, no more than the bottle cap knows it’s holding in liquid or the twitch robot knows it’s twitching.
This sounds like a machine that makes paperclips without optimizing, not a maximizer. (Unless the twitching robot counts as a maximizer.) “Opt” means “to make a choice (from a range of possibilities)”: you do this; the other things, not so much.
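The distinction being drawn can be sketched in code (a hypothetical illustration, with made-up names; nothing here is from the original comment): a fixed-behavior machine emits the same action regardless of circumstances, while something that “opts” evaluates a range of possibilities and selects among them.

```python
def paperclip_machine(observation):
    """Fixed behavior: the same output every time, no choice involved.

    Like the bottle cap or the twitch robot, it does not select
    among alternatives; it just does the thing it does.
    """
    return "make_paperclip"


def paperclip_optimizer(actions, expected_clips):
    """Chooses from a range of possibilities by evaluating each one.

    `actions` is a list of candidate actions; `expected_clips` maps
    each action to its predicted yield. Selecting the best is the
    minimal sense in which something 'opts'.
    """
    return max(actions, key=lambda a: expected_clips[a])
```

On this sketch the first function never optimizes no matter how many paperclips it produces, while the second does even if it produces few.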
goals are a feature of the map, not the unmapped territory.
You don’t think that a map of the world (including the agents in it) would include goals? (I can imagine a counterfactual where someone is put in different circumstances, but continues to pursue the same ends, at least at a basic level—eating, sleeping, etc.)
You seem to be using the words “goal-directed” differently than the OP.
And in different ways throughout your comment.
That’s a manifestation of my point: what it would mean for something to be a goal seems to shift depending on which features you take to be important in the thing that would have the goal.