I have never thought about this, so this is a serious question. Why do you think evolution resulted in beings with emotions, and what makes you so confident that emotions are unnecessary for practical agents that you end up frustrated by depictions of emotional AIs, written by emotional beings, in SF stories?
From Wikipedia:
Emotion is often the driving force behind motivation, positive or negative. An alternative definition of emotion is a “positive or negative experience that is associated with a particular pattern of physiological activity.”
...cognition is an important aspect of emotion, particularly the interpretation of events.
Let’s say the AI in your story becomes aware of an imminent, unexpected threat and allocates most of its resources to dealing with it. That sounds like fear; the rest is semantics. Or how exactly would you tell that the AI is not afraid? I think we quickly run up against the hard problem of consciousness here, and the question of whether consciousness is an important feature for agents to possess. And I don’t think one can be confident enough about this issue to justify frustration with a science fiction author who uses emotional terminology to describe the AIs in their story (a world in which AIs have “emotions” is not too absurd).
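To make the point concrete, here is a toy sketch (my own illustration, not anything from the story or a claim about real AI architectures): an agent that shifts its compute budget toward threat handling when perceived danger crosses a threshold. The function name and parameters are invented for this example. From the outside, the behavior looks like fear, even though it is just arithmetic over priorities.

```python
def allocate_budget(tasks, threat_level, threat_threshold=0.7):
    """Split a unit compute budget across tasks by normalized priority.

    tasks: dict mapping task name -> baseline priority (non-negative).
    threat_level: perceived imminence of a threat, in [0, 1].

    When threat_level crosses the threshold, the budget is shifted
    heavily toward 'threat_response', mimicking a fear reaction.
    """
    priorities = dict(tasks)
    if threat_level >= threat_threshold:
        # "Fear": boost threat response in proportion to perceived danger.
        priorities["threat_response"] = (
            priorities.get("threat_response", 0.0) + 10.0 * threat_level
        )
    total = sum(priorities.values())
    return {name: p / total for name, p in priorities.items()}


calm = allocate_budget({"planning": 1.0, "maintenance": 1.0}, threat_level=0.1)
afraid = allocate_budget({"planning": 1.0, "maintenance": 1.0}, threat_level=0.9)
# In the 'afraid' allocation, threat_response dominates the budget.
```

Whether the agent in the second case is "in fear" or merely "reallocating resources" is exactly the semantic question at issue.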