The description of our feelings is not fundamentally different from the description of any reinforcement learner. They both describe the same thing—physical reality—just with different language and precision.
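To make the comparison concrete, here is a minimal sketch of what "the description of a reinforcement learner" looks like. The function and variable names are my own illustration, not anything from this exchange: a tabular Q-learning update, in which a scalar reward drives value changes with no emotional vocabulary anywhere in the description.

```python
def q_update(q, state, action, reward, next_state, actions=(0, 1),
             alpha=0.1, gamma=0.9):
    """One Bellman-style update: move Q(state, action) toward
    reward + discounted best estimated future value."""
    best_next = max(q.get((next_state, a), 0.0) for a in actions)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
    return q

q = {}
q = q_update(q, state=0, action=1, reward=1.0, next_state=1)
# From an empty table: Q(0, 1) = 0 + 0.1 * (1.0 + 0.9 * 0 - 0) = 0.1
```

Whether this description and the description of a feeling pick out the same kind of physical process is, of course, exactly what is in dispute below.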
I see no reason to attribute emotional states to any of these things.
The reason is that they are abstractly analogous to emotional states in humans, just as the emotional state of one human may be abstractly analogous to that of another.
I cannot see "abstractly analogous" as sufficient grounds. Make the abstraction coarse enough and everything is "abstractly analogous" to everything else.