I do sort of expect AI to have something like emotion, because I think humans in general tend to focus on the subjective aspect of emotions to the exclusion of their functional roles. I think of emotions as something like “brain modes” designed to reconfigure our limited mental resources to make certain behaviors more likely and other behaviors less likely. Given that AI will be similarly bounded, I would expect to see something analogous: mechanisms that allow the AI to alter its thinking in ways that would feel to the AI subjectively similar to our own experience of emotions, and that would carry out a similar role of temporarily reconfiguring the AI to be better optimized for particular scenarios.
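To make the “brain modes” picture a bit more concrete, here is a minimal toy sketch in Python. It is not a claim about how any real AI system works; the behaviors, modes, and weights are all hypothetical, chosen only to illustrate the idea of a global mode that re-weights how a bounded agent spends a fixed behavioral budget.

```python
import random

# Baseline propensities over a few hypothetical candidate behaviors.
BASELINE = {"explore": 0.4, "exploit": 0.4, "flee": 0.1, "rest": 0.1}

# Each "mode" (the emotion analogue) multiplies the baseline weights,
# temporarily making some behaviors more likely and others less likely.
# All numbers here are made up for illustration.
MODES = {
    "calm":    {"explore": 1.0, "exploit": 1.0, "flee": 1.0, "rest": 1.0},
    "fear":    {"explore": 0.2, "exploit": 0.5, "flee": 5.0, "rest": 0.1},
    "curious": {"explore": 3.0, "exploit": 0.8, "flee": 0.5, "rest": 0.5},
}

def behavior_distribution(mode: str) -> dict[str, float]:
    """Renormalize the baseline propensities under a mode's re-weighting."""
    weights = {b: BASELINE[b] * MODES[mode][b] for b in BASELINE}
    total = sum(weights.values())
    return {b: w / total for b, w in weights.items()}

def act(mode: str) -> str:
    """Sample one behavior from the mode-adjusted distribution."""
    dist = behavior_distribution(mode)
    return random.choices(list(dist), weights=list(dist.values()))[0]

if __name__ == "__main__":
    # In "fear" mode the same agent, with the same fixed repertoire,
    # becomes far more likely to flee and less likely to explore.
    for mode in MODES:
        print(mode, behavior_distribution(mode))
```

Note that a mode never adds capabilities; it only redistributes a fixed probability budget, which is the sense in which emotions here are reconfigurations of limited resources rather than new faculties.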
What those specific emotions would be seems like something you could speculate about, and it *might* be fun and interesting to try, but mostly because I think the answers would provide hermeneutical fodder for investigating our (possibly confused) beliefs about emotions and AI.