If the information that N chooses to remain ignorant of happens to be of little relevance to any decision N will take in the future, and if his self-deception allows him to be more confident than he would have been otherwise, and if this increased confidence grants him a significant advantage, then the right side of the equation will be bigger than the left side.
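As a rough sketch of what I mean, assuming the left side stands for the expected value of learning the information and the right side for the expected benefit of the extra confidence (the original equation may carve things up differently):

\[
\underbrace{\Pr(\text{the info would change N's decision}) \cdot \Delta U_{\text{better decision}}}_{\text{left side: value of knowing}}
\;<\;
\underbrace{\Pr(\text{ignorance preserves the confidence boost}) \cdot \Delta U_{\text{confidence advantage}}}_{\text{right side: value of staying ignorant}}
\]

Under the three conditions above, the first factor on the left is small and the product on the right is large, so the inequality holds.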
Not quite.
The information could be highly relevant, and yet happen not to change his mind.
He could be choosing among close alternatives, so switching to a slightly better alternative could be of limited value.
Remember also that failure to search for disconfirming evidence doesn’t necessarily constitute self-deception.
It is? Why do you think people are pleasantly surprised when they get lucky, if not because it’s a rare occurrence?
Sorry, I guess your definition of luck was reasonable. But in this case, it’s not necessarily true that the probability of the right side being greater is lower than 50%, in which case you wouldn’t always have to “get lucky”.
I’ve been thinking about this on and off for an hour, and I’ve come to the conclusion that you’re right.
My mistake comes from the fact that the examples I was using to think about this were all ones where one has low certainty about whether the information is irrelevant to one’s decision making. In that case, the odds are that being ignorant will yield a less than maximal chance of success. However, there are situations in which it’s possible to know with great certainty that some piece of information is irrelevant to one’s decision making, even without knowing what the information is. These are mostly situations that are limited in scope and involve a short-term goal, like giving a favorable first impression or making a good speech. For instance, you might suspect that your audience hates your guts, and knowing that this is in fact the case would make you less confident during your speech than merely suspecting it, so you’d be better off waiting until after the speech to find out about this particular fact.
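To make the speech case concrete, here’s a toy expected-value sketch (the numbers are invented; the point is just that checking can only hurt when the answer couldn’t change what you do):

```python
# A toy expected-value sketch of the speech example. All numbers are made
# up for illustration; the structure is the only thing that matters here.

p_hate = 0.3        # prior probability that the audience hates you
u_confident = 1.0   # utility of delivering the speech while confident
u_shaken = 0.4      # utility if confirmed hostility shakes your confidence

# Staying ignorant until after the speech: mere suspicion doesn't shake
# you, so you deliver the speech confidently either way.
ev_ignorant = u_confident

# Checking beforehand: you'd give the same speech regardless, so the
# information can't help, but with probability p_hate it costs you
# your confidence.
ev_informed = (1 - p_hate) * u_confident + p_hate * u_shaken

print(f"Expected value, staying ignorant: {ev_ignorant:.2f}")   # 1.00
print(f"Expected value, checking first:   {ev_informed:.2f}")   # 0.82
```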
Although, if I were in that situation, and they did hate my guts, I’d rather know about it and find a way to remain confident that doesn’t involve willful ignorance. That said, I have no difficulty imagining a person who is simply incapable of finding such a way.
I wonder, do all situations where instrumental rationality conflicts with epistemic rationality have to do with mental states over which we have no conscious control?
I’ve been thinking about this on and off for an hour, and I’ve come to the conclusion that you’re right.
Wow, this must be like the 3rd time that someone on the internet has said that to me! Thanks!
Although, if I were in that situation, and they did hate my guts, I’d rather know about it and find a way to remain confident that doesn’t involve willful ignorance.
If you think of a way, please tell me about it.
I wonder, do all situations where instrumental rationality conflicts with epistemic rationality have to do with mental states over which we have no conscious control?
Information you have to pay money for doesn’t fit into this category.