You are addressing the clothes and telling them they have no emperor. They can’t hear you.
But Dennett is sort of beside the point here. I can build a simple agent ecosystem in LISP, and nobody would suggest there is anything conscious there. “Agent” talk as applied to such a LISP program would just be a useful modeling technique. An “agent” could just be “something with a utility function that can act,” not a “conscious self.”
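To make the point concrete, here is a minimal sketch of such an agent ecosystem (in Python rather than LISP, for brevity; all names are illustrative). Each “agent” is literally just a utility function plus the ability to pick an action, and nothing more:

```python
# A minimal, illustrative "agent ecosystem": an agent is just a utility
# function plus the ability to act. Nothing here is conscious; "agent"
# is purely a modeling construct.

class Agent:
    def __init__(self, name, utility):
        self.name = name
        self.utility = utility  # maps a state to a numeric score

    def act(self, state, actions):
        # Pick whichever available action leads to the state this agent
        # scores highest -- that is all "acting" means here.
        return max(actions, key=lambda a: self.utility(a(state)))

# Two "drives" pulling a shared state in different directions.
hunger = Agent("hunger", lambda s: -s["hunger"])
rest   = Agent("rest",   lambda s: -s["fatigue"])

# Actions are pure functions from state to state.
eat   = lambda s: {**s, "hunger": s["hunger"] - 1, "fatigue": s["fatigue"] + 1}
sleep = lambda s: {**s, "fatigue": s["fatigue"] - 1}

state = {"hunger": 5, "fatigue": 5}
for agent in (hunger, rest):
    state = agent.act(state, [eat, sleep])(state)

print(state)  # → {'hunger': 4, 'fatigue': 5}
```

Calling these things “agents” is a perfectly useful way to model the system’s dynamics, and it clearly implies nothing about consciousness.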
In fact, in the kinds of dilemmas humans face that the OP discusses, some of the “agents” in question are often very old, pre-verbal, and (regardless of your stance on consciousness) not very conscious at all. This does not prevent them from leaving a large footprint on our mental landscape.