It’s not obvious that one is better off with the truth. Assume that for some desirable thing X:
P(X|I believe X will happen) = 49%
P(X|I believe X won’t happen) = 1%
It seems I can’t rationally believe that X will happen. Perhaps I would be better off being deluded about it.
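To make the tension concrete, here is a minimal Python sketch of the numbers above (the 0.5 calibration threshold and the ratio comparison are my gloss on the argument, not anything stated in the comment):

```python
# A minimal sketch of the numbers in the comment above.
# The point: the belief "X will happen" is self-undermining (even given
# that belief, X is more likely not to happen), yet holding it makes X
# far more likely than not holding it.

p_x_given_believe = 0.49      # P(X | I believe X will happen)
p_x_given_disbelieve = 0.01   # P(X | I believe X won't happen)

# Conditional on the optimistic belief, X is still less likely than not,
# so a calibrated agent can't hold that belief.
print(p_x_given_believe < 0.5)                    # True

# But the deluded believer is ~49x more likely to get X than the
# calibrated disbeliever, which is why the delusion looks tempting.
print(p_x_given_believe / p_x_given_disbelieve)   # 49.0
```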
Sorry, I don’t understand: why doesn’t the sum of the probabilities equal 100% in your example? I assume you dropped a “5” and meant “P(X|I believe X won’t happen) = 51%”.
But for what reason?
These probabilities are not required to sum to 1, because they are not mutually exclusive and exhaustive outcomes of a single experiment. A more obvious example to illustrate:
P(6-sided die coming up as 6 | today is Monday) = 1⁄6
P(6-sided die coming up as 6 | today is not Monday) = 1⁄6
1⁄6 + 1⁄6 != 1
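A quick simulation sketch in Python (arbitrary seed and sample size) of the die example, showing that each conditional probability sits near 1⁄6 and that their sum is nowhere near 1:

```python
import random

# Simulate the die example: condition on the day of the week and check
# the two conditional probabilities separately.
random.seed(0)
rolls = [(random.randint(1, 6), random.choice(range(7))) for _ in range(200_000)]

mondays = [r for r, day in rolls if day == 0]
other_days = [r for r, day in rolls if day != 0]

p_six_given_monday = sum(r == 6 for r in mondays) / len(mondays)
p_six_given_not_monday = sum(r == 6 for r in other_days) / len(other_days)

print(p_six_given_monday)                            # ~0.167
print(p_six_given_not_monday)                        # ~0.167
print(p_six_given_monday + p_six_given_not_monday)   # ~0.33, not 1
```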
I think your example is not suitable for the situation above: there I can see only two possible outcomes, X happens or X doesn’t happen. We don’t know anything more about X. And P(X|A) + P(X|~A) = 1, isn’t that so?
No. You may have confused it with P(X|A) + P(~X|A) = 1 (note the tilde). In my case, either the 6-sided die comes up as 6, or it doesn’t.
Yes, either X happens or X doesn’t happen. P(X) + P(~X) = 1, and likewise P(X | A) + P(~X | A) = 1. Both formulations state the probability of X, but the second conditions on A; so either X given A happens or X given A doesn’t happen (which is P(~X | A), not P(X | ~A)).
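A short Python check of the two identities on a made-up joint distribution over X and A (the joint probabilities are arbitrary, chosen only to show that P(X | A) + P(~X | A) is always 1 while P(X | A) + P(X | ~A) generally is not):

```python
# Joint distribution over (X, A); the numbers are made up for illustration.
joint = {
    (True, True): 0.10,    # P(X, A)
    (False, True): 0.30,   # P(~X, A)
    (True, False): 0.25,   # P(X, ~A)
    (False, False): 0.35,  # P(~X, ~A)
}

def cond(x_value, a_value):
    """P(X = x_value | A = a_value), computed from the joint distribution."""
    p_a = sum(p for (x, a), p in joint.items() if a == a_value)
    return joint[(x_value, a_value)] / p_a

print(cond(True, True) + cond(False, True))   # 1.0      -> P(X|A) + P(~X|A)
print(cond(True, True) + cond(True, False))   # ~0.667   -> P(X|A) + P(X|~A)
```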
When Pinker said “better off”, I assumed he included goal achievement. It’s plausible that people are more motivated to do something if they’re more certain than they should be based on the evidence. They might not try as hard otherwise, which will influence the probability that the goal is attained. I don’t really know if that’s true, though.
The thing may be worth doing even if the probability isn’t high that it will succeed, because the expected value could be high. But if one isn’t delusionally certain that one will be successful, it may no longer be worth doing because the probability that the attempt succeeds is lower. (That was the point of my first comment.)
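As a toy illustration of that point (all of the cost, payoff, and probability numbers below are made up), a modest drop in the success probability can flip the expected value of the attempt from positive to negative:

```python
# Toy expected-value numbers for the point above: an attempt with a fixed
# cost and a large payoff can be worth making at one success probability
# and not worth making at a slightly lower one.
cost = 10
payoff = 100

p_success_if_confident = 0.15   # assumed: full effort from (over)confidence
p_success_if_doubtful = 0.05    # assumed: less effort when less certain

ev_confident = p_success_if_confident * payoff - cost   # 5.0  -> worth trying
ev_doubtful = p_success_if_doubtful * payoff - cost      # -5.0 -> not worth trying

print(ev_confident, ev_doubtful)
```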
There could be other psychological effects of knowing certain things. For example, maybe it would be difficult to handle being completely objective about one’s own flaws and so on. Being objective about people you know may (conceivably) harm your relationships. Having to lie is uncomfortable. Knowing a completely useless but embarrassing fact about someone but pretending you don’t is uncomfortable, not simply a harmless, unimportant update of your map of the territory. Etc.
I’m not saying I know of any general way to avoid harmful knowledge, but that doesn’t mean it doesn’t exist.