I don’t think a program has to be very sophisticated to feel pain. But it does have to exhibit some kind of learning. For example:
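Roughly this (a minimal sketch; the predicate X and the fifty-location space are arbitrary stand-ins):

```python
import random

# X is the hypothetical pain predicate: here, arbitrarily,
# every seventh location "hurts".
def X(location):
    return location % 7 == 0

def wander(steps=10_000, n_locations=50):
    avoid = set()    # locations the program has learned to avoid
    history = []
    for _ in range(steps):
        # Wander aimlessly among locations not yet learned to be painful.
        candidates = [l for l in range(n_locations) if l not in avoid]
        location = random.choice(candidates)
        if X(location):
            # Learning step: never return to a location that hurt.
            avoid.add(location)
        history.append(location)
    return history, avoid
```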
This program aimlessly wanders over a space of locations, but eventually tends to avoid locations where X has returned True at past times. It seems obvious to me that X is pain, and that this program experiences pain. You might say that the program experiences less pain than we do, because the pain response is so simple. Or you might argue that it experiences pain more intensely, because all it does is implement the pain response. Either position seems valid, but again it’s all academic to me, because I don’t believe pain or pleasure are good or bad things in themselves.
To answer your question, a thermostat that is blocked from changing the temperature is frustrated, not necessarily in pain. That said, changing the setting on a working thermostat may be pain, because it is a stimulus that causes a change in the persistent behavior of a system, directing it to extricate itself from its current situation.
(edit: had trouble with indentation.)