This is a good point, and holds in the majority of cases, although there are other considerations which should also be mentioned.
Since all maps are ‘flawed’ by definition, an important question is whether the flaws in your map actually interact with your goals, and if they do, whether they are beneficial or harmful. It’s usually not a good use of your energy to fine-tune areas of your map that don’t have any impact on your life, and it’s actively wasteful to “fix” them in ways that make it harder to achieve your goals.
Incorrect beliefs can be useful in the aggregate even if they fail in certain situations, as long as those situations are rare or inconsequential enough. I can be utterly wrong in my belief that there are no tigers in New York City (there are several in the Bronx Zoo, not to mention that more might well be kept illegally as pets) but it’s completely orthogonal to my daily life and thus not important enough to spend effort investigating. And if I had a pathological fear of tigers, I would gain a pretty significant advantage from that same false belief; I would do well to maintain it even if presented with genuine counter-evidence.
I think that most religions are wrong to harmful degrees, but it’s not an ironclad rule of rationality that beliefs must be maximally accurate. A pessimist is actually more accurate in their assessments of people, but optimists are happier and more successful; if your rationality insists you cannot be optimistic, then it is not useful and should be ignored.