Do we have a clear idea of what we mean when we say "agent"?
Is a Roomba, the robot vacuum cleaner that adapts to walls, furniture, the rate at which the floor gets dirty, and other things, an agent? I don’t think so.
Is an air conditioner with a thermostat that tells it to cool the rooms to 22°C when people are present or likely to be present, but not when they are absent or likely to be absent, an agent? I think not.
Is a troubleshooting guide with lots of if-then-else branch points an agent? No.
Consider a tool that I write which will write a program to solve a problem I am interested in solving. Let's say I want to build a medical robot that will wander the world dispensing medical care to anyone who seeks its help. The code I write to implement this is recursive in the sense that it looks at the current patient's symptoms, writes treatment code based on its database and the symptoms it sees, and then modifies that treatment code based on the patient's reactions.
As long as this robot continues to treat humans medically, it does not seem at all agenty to me. If it started to develop nutrition programs and clean-water programs, it would seem somewhat agenty. Only if it switched professions, deciding to become a poet or a hermit or a barista, would I think of it as an agent.
As long as my tool is doing what I designed it to do, I don’t think it shows any signs of wanting anything.
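Concretely, here is a minimal sketch of the kind of loop I have in mind. It is purely illustrative: the database, the possible reactions, and all the function names are hypothetical, just to show the look-at-symptoms / write-treatment / revise-on-reaction cycle.

```python
# Hypothetical sketch of the medical tool's loop. Data and names are
# illustrative only, not a real treatment system.

TREATMENT_DB = {
    "fever": ["antipyretic", "fluids"],
    "cough": ["cough suppressant"],
    "infection": ["antibiotic"],
}

def generate_treatment(symptoms):
    """Write an initial treatment plan from the database and the observed symptoms."""
    plan = []
    for symptom in symptoms:
        plan.extend(TREATMENT_DB.get(symptom, []))
    return plan

def revise_treatment(plan, reaction):
    """Modify the treatment plan based on the patient's reaction."""
    if reaction == "adverse":
        # Drop the last step and fall back to something conservative.
        return plan[:-1] + ["observation"]
    if reaction == "no_improvement":
        return plan + ["specialist referral"]
    return plan  # "improving": keep the current plan

def treat(symptoms, observe_reaction):
    """Generate a plan, apply it, and keep revising until the patient is resolved."""
    plan = generate_treatment(symptoms)
    for _ in range(5):  # bounded number of revision cycles
        reaction = observe_reaction(plan)
        if reaction == "resolved":
            return plan
        plan = revise_treatment(plan, reaction)
    return plan

if __name__ == "__main__":
    # Fake patient that improves only after the plan gains a referral.
    reactions = iter(["no_improvement", "resolved"])
    print(treat(["fever", "cough"], lambda plan: next(reactions)))
```

The point is that everything the tool does, including the "rewriting" of its own treatment code, stays inside the loop I designed for it.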
Is a Roomba, the robot vacuum cleaner that adapts to walls, furniture, the rate at which the floor gets dirty, and other things, an agent?
It’s a textbook case of an agent in the AI field. (Really! IIRC AI: A Modern Approach uses Roomba in its introductory chapters as an example of an agent.)
We may need to taboo the word agent, since it has technical meanings here.
Hopefully where “taboo” means explain.
What if your robot searched the medical literature for improved treatments? What if it improved its ability to find treatments?