I think this post confuses qualia and emotions. We can have many different qualia: of colors, of sounds, maybe even of meanings. There is nothing emotion-specific about qualia.
When we experience an emotion we may have a separate quale for that emotion, but this emotion-quale is only a representation of some process for our consciousness, not the process itself. People often display an emotion without knowing that they have it; this is the case of so-called suppressed emotions.
Emotions themselves work as modes of behaviour that we signal to other members of the tribe: that is why we turn red when we are angry. Most emotional processing happens unconsciously: both generating our own emotions and reading others' emotions happen automatically.
In most cases AI doesn't need emotions, but if it needed them, it could be perfect at simulating, say, a smile of hate; there is nothing difficult in that.
The question of AI’s qualia is the difficult one.
I agree with you that this post seems to be suffering from some kind of confusion because it doesn’t clearly distinguish emotions from the qualia of emotions (or, for anyone allergic to talk of “qualia”, maybe we could just say “experience of emotions” or “thought about emotions”).
I do, however, suspect any sufficiently complex mind may benefit from having something like emotions, assuming it runs in finite hardware, because one of the functions of emotions seems to be to get the mind to operate in a different “mode” than what it otherwise does.
Consider the case of anger in humans as an example. Let's ignore the qualia of anger for now and just focus on the anger itself. What does it do? I'd summarize the effect as putting the brain in "angry mode" so that, whatever comes up, we respond in ways that are better optimized for protecting what we value through aggression. Anger, conceptualized as an emotion, is then a slightly altered way of being that puts the brain in a state better suited to this purpose of "protect with aggression" than its normal one.
This is necessary because the brain is only so big and can only be configured in so many ways at once, so it helps to have a shortcut that puts the brain in a state favoring a particular set of behaviors, to make sure those behaviors actually happen.
Thus if we build an AI and it is operating near the limits of its capabilities, it would benefit from something emotion-like (although we can debate all day whether or not to call it emotion), i.e. a system for putting itself into altered states of cognition that better serve some tasks than others, temporarily trading better performance in one domain/context for worse performance in others. I'm happy to call such a thing emotion, although whether the qualia of noticing such "emotion" from the inside would resemble the human experience of emotion is hard to know at this time.
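To make the trade-off concrete, here is a minimal toy sketch in Python (all names and numbers are hypothetical illustrations, not anyone's actual architecture): an "emotion" is modeled as a pre-packaged mode that reallocates a fixed compute budget across subsystems, so gaining performance in one domain necessarily costs performance in the others.

```python
from dataclasses import dataclass


@dataclass
class Mode:
    name: str
    # Share of a fixed attention/compute budget per subsystem; the shares
    # sum to 1.0, so boosting one domain necessarily starves the others.
    weights: dict[str, float]


# Hypothetical modes: "angry" shifts budget toward threat response.
NEUTRAL = Mode("neutral", {"threat_response": 0.2, "planning": 0.4, "social": 0.4})
ANGRY = Mode("angry", {"threat_response": 0.7, "planning": 0.2, "social": 0.1})


class Agent:
    def __init__(self) -> None:
        self.mode = NEUTRAL

    def trigger(self, stimulus: str) -> None:
        # The "shortcut": one cheap check flips the whole system into a
        # pre-packaged configuration instead of re-deriving priorities.
        if stimulus == "threat_to_valued_thing":
            self.mode = ANGRY

    def performance(self, task: str) -> float:
        # Performance on a task is proportional to the budget share its
        # subsystem receives under the currently active mode.
        return self.mode.weights.get(task, 0.0)


agent = Agent()
print(agent.performance("planning"))         # 0.4 in neutral mode
agent.trigger("threat_to_valued_thing")
print(agent.performance("threat_response"))  # 0.7: better at protection...
print(agent.performance("planning"))         # 0.2: ...at the cost of planning
```

The only point of the sketch is that switching modes is cheap relative to re-deriving priorities from scratch, which is the "shortcut" function described above.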
Agreed. I also thought about the case of "military posture" in nation states, which both prepares for war and signals to others that the state is ready to defend itself. However, it seems that emotions are more complex than that (I just downloaded a book called "History of emotions" but haven't read it yet).
One guess I have is that emotions are connected with the older "reptilian" brain, which can process information very quickly and effectively but is limited to local circumstances like the immediate surroundings (which is what a fight-or-flight reaction needs). However, another part of emotions is their social framing: we are trained to interpret raw brain signals as high-level emotions like love.