If a creature engages in goal-directed activity, then I call it intelligent. If by “having said goal” you mean “consciously intends it”, then I regard the faculties for consciously intending things as a more sophisticated means of aiming at goals. If intercepting the ball is characterized (not defined) as “not intelligent”, that is true only relative to some other goal that supersedes it.
I’m basically asserting that the physical evolution of a system towards a goal, in the context of an environment, is what is meant when one distinguishes something that is “intelligent” from something (say, a bottle) that is not. Here, it is important to define “goal” and “environment” very broadly.
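To make that concrete, here is a minimal sketch of what I mean, using the ball-interception example. This is an illustration of my own, in Python, with invented names; it is not meant as a definition, just as the simplest system I can write down whose state physically evolves toward a goal within an environment:

```python
# A minimal sketch of "goal-directed physical evolution": the agent's state
# changes in response to environmental feedback so as to close the gap
# between where it is and where the ball is. Names are illustrative only.

def intercept(agent_pos: float, ball_pos: float,
              speed: float = 0.5, steps: int = 50) -> float:
    """Evolve the agent's position toward the goal (the ball's position)."""
    for _ in range(steps):
        error = ball_pos - agent_pos   # feedback from the environment
        agent_pos += speed * error     # move so as to reduce the error
    return agent_pos

# A bottle, by contrast, has no such feedback loop: its state never
# evolves toward anything, no matter where the ball is.
print(intercept(agent_pos=0.0, ball_pos=10.0))  # converges near 10.0
```

Nothing in that loop requires planning or consciousness; on my view, those would simply be more sophisticated ways of closing it.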
Of course, people constantly use the word “intelligence” to mean something more complicated and higher-level. So, someone might say that a human is definitely “intelligent”, and maybe a chimp, but definitely not a fly. Well, I think that usage is a mistake, because intelligence is a matter of degree. I’m saying that a fly has the “I” in “AI”, just to a lesser degree than a human. One might argue that the fly doesn’t make plans, or use tools, or any number of other accessories to intelligence, but I see those faculties as upgrades that raise the degree of intelligence, rather than defining it.
Before you start thinking about “minds” and “cognition”, you’ve got to think about machinery in general. When machinery acquires self-direction (implying something toward which it is directed), a qualitative line is crossed. When machinery acquires faculties or techniques that improve self-direction, I think that is more appropriately considered quantitative.