It’s actually quite easy, and common, to believe that you’re better off believing X whether or not X is true, without ever learning the truth of the matter. This is also well justified in decision theory, and by your own definition of rationality, if believing X will help you win. A common example is choosing to believe that your date has an “average romantic history” and choosing not to investigate.
If you think you can’t do this, I propose a math problem. Using a random number generator over all American citizens, I have selected Bob (but not identified him to you). If you can guess Bob’s IQ within a margin of error of ±5 points, you get a prize. Do you think it’s possible for Bob’s IQ to be higher or lower than you expected? And if so, do you believe you’re better off having no expectation at all rather than a potentially false one? See, as soon as the question is asked, you fill in the answer with [an expected range of probabilities] rather than [no data].
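To make the point concrete, here is a minimal sketch of the win probabilities, assuming IQ scores are normed to roughly a normal distribution with mean 100 and standard deviation 15 (the function names are my own):

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """Cumulative probability that a N(mu, sigma) draw is <= x."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def p_win(guess, mu=100, sigma=15):
    """Probability that a random draw from N(mu, sigma) lands within
    +/- 5 points of the guess."""
    return normal_cdf(guess + 5, mu, sigma) - normal_cdf(guess - 5, mu, sigma)

print(round(p_win(100), 3))  # guess the population mean: ~0.261
print(round(p_win(130), 3))  # guess far from the mean:   ~0.038
```

Guessing the population mean gives roughly a 26% chance of the prize, versus under 4% for a guess two standard deviations out; the “expectation you already have” is doing real work.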
It’s much easier to believe something and not investigate than to investigate and then try to deceive yourself. And unless you adopt as an axiom “unlike all other humans, for me false beliefs are never beneficial” (which sounds like a severe case of irony), a rationalist must on occasion be in favor of such false beliefs. Just out of curiosity, why the switch from “rational” to “epistemic”?