An entity that didn’t care about goals would never do anything at all.
I agree with the rest of your comment, and, depending on how you define “goal”, with the quote as well. But what about entities driven only by heuristics? Those heuristics may have developed to pursue a goal, but not necessarily. Would you call an agent that is purely heuristics-driven goal-oriented? (I have in mind simple rules along the lines of “go left when there is a light on the right”; think Braitenberg vehicles minus the evolutionary aspect.)
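To make the heuristics-only case concrete, here is a toy sketch in Python (the names and rule are purely illustrative, not from any real implementation): the agent reacts to stimuli with a fixed rule and carries no explicit goal representation at all.

```python
# A purely reactive, Braitenberg-vehicle-style agent: a fixed
# stimulus-response rule, with no internal goal or evaluation of outcomes.

def step(light_on_right: bool, light_on_left: bool) -> str:
    """Fixed heuristic: turn away from whichever side the light is on."""
    if light_on_right:
        return "turn_left"
    if light_on_left:
        return "turn_right"
    return "go_straight"

# "Avoiding light" is only a description an observer might project onto
# the behavior; the agent itself never represents or pursues it.
print(step(light_on_right=True, light_on_left=False))  # -> turn_left
```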
Yes, I thought about that when writing the above, but I figured I’d fall back on the term “entity”. ;) An entity would be something that could have goals (sidestepping the hard work of specifying exactly what objects qualify).
See also
Hard to be original anymore. Which is a good sign!