If your hunches have a good track record, I think you should explore that and come up with a rational explanation, and make sure it’s not just a coincidence.
My explanation is that hunches are based on aggregate data that you are not capable of tracking explicitly.
Additionally, while following your hunches isn’t inherently bad, rational people shouldn’t be convinced of an argument merely based on somebody else’s hunch.
Hunches aren’t scientific. They’re not good for social things. Anyone can claim to have a hunch. That being said, if you trust someone to be honest, and you know the track record of their hunches, there’s no less reason to trust their hunches than your own.
Nobody is suggesting we ignore emotions, merely that we don’t let them interfere with rational thought (in practice this is very difficult).
I mean ignore the emotion for the purposes of coming up with a solution.
I don’t follow this argument.
Overconfidence bias causes you to take too many risks. Risk aversion causes you to take too few. I doubt they cancel each other out that well. It’s probably best to get rid of both. But I’d bet that getting rid of just one of them, causing you to either consistently take too many risks or consistently take too few, would be worse than keeping both.
I’m not sure I agree with this. Do you think that the Big Bang Theory is based on emotion?
Emotions are more about considering theories than finding them. That being said, you don’t come up with theories all at once. Your emotions will be part of how you refine the theories, and they will be involved in training whatever heuristics you use.
You can draw a path from emotion to the people who came up with the Big Bang Theory, but you can do that with things other than emotion as well.
I’m certainly not arguing that rationality is entirely about emotion. Anything with a significant effect on your cognition should be strongly considered for rationality before you reject it.
So you can use emotions to better understand your own goals. But you won’t be able to convince people who don’t know your emotions that your goals are worth achieving.
This looks like you’re talking about terminal values. The utility function is not up for grabs. You can’t convince a rational agent that your goals are worth achieving regardless of the method you use. Am I misunderstanding this comment?