This is a subject that I always thought was underrepresented on OB. I’m glad to see it coming up on LW.
The need to budget limited mental resources provides another way in which rational people can hold beliefs that seem obviously irrational: a religious or political opinion may be deeply ingrained (therefore, hard to change without a lot of effort even if it’s very wrong) and have few consequences in everyday life (therefore, not pay off much for rethinking it even if it’s very wrong), so in principle at least someone could be rational in not reconsidering such beliefs even in the face of much troubling counter-evidence. I wonder how much that actually happens; most people think, or at least think they think, or at least say they think, that their religious and political opinions are important and consequential.
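To make the trade-off concrete, here is a toy expected-value calculation (a minimal sketch; all numbers and the function itself are made up for illustration): a belief that is quite possibly wrong but nearly consequence-free may still not repay the large effort of dislodging it, while a consequential belief does.

```python
# Illustrative sketch (hypothetical numbers): when is it worth re-examining
# a deeply ingrained belief that has few everyday consequences?

def expected_gain_from_rethinking(p_belief_is_wrong: float,
                                  cost_per_day_if_wrong: float,
                                  days_affected: float,
                                  cost_of_rethinking: float) -> float:
    """Expected net benefit of re-examining a belief.

    p_belief_is_wrong: subjective probability the belief is badly mistaken
    cost_per_day_if_wrong: daily cost of acting on the mistaken belief
    days_affected: horizon over which the belief matters
    cost_of_rethinking: effort needed to dislodge a deeply ingrained belief
    """
    expected_benefit = p_belief_is_wrong * cost_per_day_if_wrong * days_affected
    return expected_benefit - cost_of_rethinking

# Possibly wrong, but nearly consequence-free: not worth the effort.
print(expected_gain_from_rethinking(0.5, 0.01, 1000, 50))  # -45.0
# Equally possibly wrong, but consequential: well worth rethinking.
print(expected_gain_from_rethinking(0.5, 5.0, 1000, 50))   # 2450.0
```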
Deciding how much thinking to allocate to any given question is, itself, something that takes up time and mental effort. Perhaps we should be budgeting for that too. (There would be an infinite regress here, but in practice we’re generally happy to truncate it rather early and just do what feels best. This is a Good Thing, though we should probably watch ourselves for a while every now and then to see whether we’re choosing sensible truncation points.)
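One way to picture the truncation (again, a toy sketch with made-up numbers, not a claim about how anyone actually deliberates): if each extra meta-level of "deciding how much to think" costs some effort and improves the allocation a bit less than the level before, the sensible cutoff comes wherever the marginal improvement stops covering the cost, which with steeply diminishing returns is very early.

```python
# Illustrative sketch (hypothetical numbers): truncating the regress of
# deciding how much to think about how much to think. Each level costs
# effort and yields diminishing improvements; stop when the marginal
# improvement no longer covers the cost.

def depth_to_deliberate(improvement_at_level_1: float,
                        decay_per_level: float,
                        cost_per_level: float,
                        max_depth: int = 10) -> int:
    depth = 0
    marginal_improvement = improvement_at_level_1
    while depth < max_depth and marginal_improvement > cost_per_level:
        depth += 1
        marginal_improvement *= decay_per_level  # diminishing returns
    return depth

# With steeply diminishing returns, the truncation point comes early:
print(depth_to_deliberate(improvement_at_level_1=10.0,
                          decay_per_level=0.2,
                          cost_per_level=1.0))  # -> 2
```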
a religious or political opinion may be deeply ingrained (therefore, hard to change without a lot of effort even if it’s very wrong) and have few consequences in everyday life (therefore, not pay off much for rethinking it even if it’s very wrong)
I think this is a very insightful point. It illustrates a situation in which, in order to be rational (win more while spending fewer resources), you need to be irrational (hold a map that does not reflect the territory).
I wish I could “save” comments in the same way I can save posts, because I’ll want to re-read yours many times over the next few months.
[I]n principle at least someone could be rational in not reconsidering [irrational] beliefs even in the face of much troubling counter-evidence. I wonder how much that actually happens; most people think, or at least think they think, or at least say they think, that their religious and political opinions are important and consequential.
The other side of this would be people who hold beliefs they fear may be irrational but really do not want to examine them, and who convince themselves that it does not matter because they would probably act the same way anyway.
Since I tried to argue something similar, but didn’t argue it nearly so well, I hope you don’t mind that I linked your comment in my post here.