“Emotions are not tools of cognition”
-- Ayn Rand
I beg to differ. Or are you saying that, if Ayn Rand says it, it must be wrong? In which case, I still disagree.
How does the definition you link to contradict Rand’s statement? You can acknowledge emotions as real while denying their usefulness in your cognitive process.
The article I linked to wasn’t just saying that emotions exist. It was saying that they’re part of rationality.
If emotions didn’t make people behave rationally, then people wouldn’t evolve to have emotions.
Rand doesn’t deny that emotions are part of rationality; she denies that they are tools of rationality. It is rational to try to make yourself experience positive emotions, but to say “I have a good feeling about this” is not a rational statement, it’s an emotional statement. It isn’t something that should interfere with cognition.
As for emotions affecting human behavior: I think all mammals have emotions, so it’s not easy for humans to discard them over a few generations of technological evolution. Emotions were useful in the ancestral environment; they are no longer as useful as they once were.
If your hunches have a bad track record, then you should learn to ignore them; but if they do work, then ignoring them is irrational.
Even if emotions are suboptimal tools in virtually all cases (which I find unlikely), that doesn’t mean that ignoring them is a good idea. It’s like how getting rid of both overconfidence bias and risk aversion is good, but getting rid of overconfidence bias OR risk aversion alone is a terrible idea. Everything we’ve added since emotion was built around emotion. If emotion gives you an irrational bias, you’ll evolve a counter-bias elsewhere.
If your hunches have a good track record, I think you should explore that and come up with a rational explanation, and make sure it’s not just a coincidence. Additionally, while following your hunches isn’t inherently bad, rational people shouldn’t be convinced of an argument merely based on somebody else’s hunch.
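One way to sketch the “make sure it’s not just a coincidence” step (the sample numbers and the 50%-by-chance assumption here are mine, purely for illustration) is a one-sided binomial test on your recorded track record:

```python
from math import comb

def hunch_p_value(successes, trials, base_rate=0.5):
    """Chance of scoring at least `successes` out of `trials`
    if each hunch were only right with probability `base_rate`."""
    return sum(
        comb(trials, k) * base_rate**k * (1 - base_rate) ** (trials - k)
        for k in range(successes, trials + 1)
    )

# Hypothetical record: 16 correct hunches out of 20.
p = hunch_p_value(16, 20)
print(f"p = {p:.4f}")  # well under 0.05, so probably not coincidence
```

If the p-value is large, the track record is consistent with luck and the hunches haven’t earned any trust yet.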
Nobody is suggesting we ignore emotions, merely that we don’t let them interfere with rational thought (in practice this is very difficult).
I don’t follow this argument. Your biases can be evaluated absolutely, or relative to the general population. If everybody is biased toward underconfidence, then being biased toward overconfidence can be an advantage. There’s a similar argument for risk aversion.
I’m not sure I agree with this. Do you think that the Big Bang theory is based on emotion? You can draw a path from emotion to the people who came up with the Big Bang theory, but you can do that with things other than emotion as well.
My issue with emotions is only partly that they cause biases, it’s also that you can’t rely on other people having the same emotions as you. So you can use emotions to better understand your own goals. But you won’t be able to convince people who don’t know your emotions that your goals are worth achieving.
My explanation is that hunches are based on aggregate data that you are not capable of tracking explicitly.
Hunches aren’t scientific. They’re not good for social things. Anyone can claim to have a hunch. That being said, if you trust someone to be honest, and you know the track record of their hunches, there’s no less reason to trust their hunches than your own.
I mean ignore the emotion for the purposes of coming up with a solution.
Overconfidence bias causes you to take too many risks; risk aversion causes you to take too few. I doubt they cancel each other out that well, so it’s probably best to get rid of both. But I’d bet that getting rid of just one of them, causing you to consistently take either too many risks or too few, would be worse than keeping both.
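To make that concrete, here is a toy model (every number in it is invented for illustration): suppose the unbiased rule is to take a gamble whenever its true success probability exceeds 0.5, overconfidence inflates the perceived probability by 0.1, and risk aversion raises the required threshold to 0.6. With these particular numbers the two biases exactly cancel, while removing only one of them produces systematic mistakes:

```python
def takes_gamble(p, overconfident, risk_averse):
    """Decide whether to take a gamble with true success probability p."""
    perceived = p + (0.1 if overconfident else 0.0)  # overconfidence inflates the estimate
    threshold = 0.6 if risk_averse else 0.5          # risk aversion demands a safety margin
    return perceived > threshold

grid = [i / 100 for i in range(5, 100, 5)]           # true probabilities 0.05 .. 0.95

def mistakes(overconfident, risk_averse):
    """Count decisions that differ from the unbiased rule (take iff p > 0.5)."""
    return sum(
        takes_gamble(p, overconfident, risk_averse) != (p > 0.5)
        for p in grid
    )

print(mistakes(True, True))    # both biases: they offset, no mistakes
print(mistakes(True, False))   # overconfidence alone: takes too many risks
print(mistakes(False, True))   # risk aversion alone: takes too few risks
```

Real biases won’t line up this neatly, which is the point of the disagreement above; the model only shows that removing one half of an offsetting pair can be worse than removing neither.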
Emotions are more about considering theories than finding them. That being said, you don’t come up with theories all at once. Your emotions will be part of how you refine the theories, and they will be involved in training whatever heuristics you use.
I’m certainly not arguing that rationality is entirely about emotion. Anything with a significant effect on your cognition should be strongly considered for rationality before you reject it.
This looks like you’re talking about terminal values. The utility function is not up for grabs. You can’t convince a rational agent that your goals are worth achieving regardless of the method you use. Am I misunderstanding this comment?
The only part of what you wrote that I object to is that emotions shouldn’t interfere with cognition. I think they already are a part of cognition, so the claim is a bit like saying “quantum physics is weird”. Perhaps you meant “emotions shouldn’t interfere with rationality”, in which case I’ll observe that it doesn’t seem to be a popular view around LessWrong. I’ll also note that I used to believe that emotions should be ignored, but later came to the conclusion that this is far too heavy-handed a strategy for the modern world of complex systems. To conjecture further: cognitive psychologists tend to classify emotion, affect, and mood differently. AFAIK, the classification is based on temporal duration, from shortest to longest: emotion, mood, affect. My conjecture is that, during rational decision-making, emotions can and should be ignored, moods can be ignored (but not necessarily should be), and affect should not be ignored.
This is an ideal which Objectivists believe in, but it is difficult, if not impossible, to actually achieve. I’ve noticed that as I’ve gotten older, emotions interfere with my cognition less and less, and I am happy about that. You can define cognition however you wish, but given the number of people who see it as separate from emotion, it’s probably worth having a backup definition in case you want to talk to those people.
RE: emotions, affect, moods. I do think that emotions should be considered when making rational decisions, but they are not the tools by which we come to decisions. Here’s an example.
If you want to build a house to shelter your family, your emotional connection to your family is not a tool you will use to build the house. It’s important to have a strong motivation to do something, but that motivation is not a tool. You’ll still need hammers, drills, etc., to build the house.
I believe we can and should use drugs (I include naturally occurring hormones) to modify our emotions in order to better achieve our goals.
This seems to be in tension with what she has stated elsewhere. For instance:
-- Ayn Rand, Philosophy: Who Needs It?
Wouldn’t immediately available estimates be a good tool of cognition?
Very interesting… it would seem that Rand doesn’t actually define emotion consistently; that was not the definition I was using. But the Ayn Rand Lexicon has 11 different passages related to emotions.
http://aynrandlexicon.com/lexicon/emotions.html
More charitably, we could say her conception of emotions evolved over time. Thanks for the link; I actually found some of that insightful. Also, I had forgotten how blank-slatey her theory of mind was.