And you seriously believe that, in all circumstances, anyone holding a false belief would be better off believing the truth instead? The obvious counterexample is the placebo effect, where a false belief is scientifically demonstrated to confer a benefit. The beneficial effects of false beliefs are so powerful that you can't conduct a pharmaceutical study without controlling for them. And you are no doubt familiar with that effect. Another example would be believing that you're never better off holding a false belief, because then you have more incentive to investigate suspicious beliefs.
The difficult epistemic state to get into is justifiably believing that you’re better off believing falsely about something without already, in some sense, knowing the truth about it.
It's actually very easy and common to believe that you're better off believing X, whether or not X is true, without knowing the truth of the matter. This is also well justified in decision theory, and by your own definition of rationality, if believing X helps you win. A common example is choosing to believe that your date has an "average romantic history" and declining to investigate.
If you think this is impossible, consider a simple math problem. Using a random number generator over all American citizens, I have selected Bob (but not identified him to you). If you can guess Bob's IQ to within +/- 5 points, you win a prize. Do you think Bob's IQ could turn out higher or lower than you expected? And if so, are you better off having no expectation at all rather than a potentially false one? Notice that as soon as the question is asked, you fill in the answer with [an expected range of probabilities] rather than [no data].
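The point can be made concrete with a quick Monte Carlo sketch. This assumes the standard psychometric model of IQ as normally distributed with mean 100 and standard deviation 15; the helper name `win_probability` is hypothetical, introduced here just for illustration:

```python
import random

def win_probability(guess, margin=5, trials=100_000, seed=0):
    """Estimate the chance that `guess` lands within `margin` IQ points
    of a randomly drawn citizen's IQ, modelling IQ as Normal(100, 15)."""
    rng = random.Random(seed)
    hits = sum(abs(rng.gauss(100, 15) - guess) <= margin
               for _ in range(trials))
    return hits / trials

# Guessing the population mean does far better than guessing in the tails,
# even though any single guess is very likely to be "false".
print(win_probability(100))  # roughly 0.26 under this model
print(win_probability(130))  # much lower, roughly 0.04
```

In other words, forming an expectation (centered on the prior mean) strictly beats having "no data": the possibly-false belief is still the winning move.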
It's much easier to believe something and not investigate than to investigate and then try to deceive yourself. And unless you add as an axiom "unlike all other humans, for me false beliefs are never beneficial" (which sounds like a severe case of irony), a rationalist must on occasion be in favor of such false beliefs. Just out of curiosity, why the switch from "rational" to "epistemic"?