That makes more sense. Broadly, I agree with Jacobian here, but there are a few points I’d like to add.
First, it seems to me that there aren’t many situations in which this is actually the case. If you treat people decently (regardless of their religion or lack thereof), you are unlikely to lose friends for being an atheist (especially if you don’t talk about it). Sure, don’t be a jerk and inappropriately impose your views on others, and don’t break it to your fundamentalist parents that you think religion is a sham. But the situations where it would be instrumentally rational to hold false beliefs about important things, the situations in which there really would be an expected net benefit even after factoring in the knock-on effects of making your epistemological slope just that bit more slippery, seem constrained to “there’s an ASI who will torture me forever if I don’t consistently system-2 convince myself that god exists”. At worst, if you really can’t find other ways of socializing, keep going to church while internally keeping an accurate epistemology.
Second, I think you’re underestimating how quickly beliefs can grow their roots. For example, after reading Nate’s Dark Arts of Rationality, I made a carefully-weighed decision to adopt certain beliefs on a local level, even though I don’t believe them globally: “I can understand literally anything if I put my mind to it for enough time”, “I work twice as well while wearing shoes”, “I work twice as well while not wearing shoes” (the internal dialogue for adopting this one was pretty amusing), etc. After creating the local “shoe” belief and intensely locally-believing it, I zoomed out and focused on labelling it as globally false. I was met with harsh resistance from thoughts already springing up to rationalize why my shoes actually could make me work harder. I had only believed this ridiculous thing for a few seconds, and my subconscious was already rushing to its defense. For this reason, I decided against globally believing anything I know to be false, even though it may be “instrumentally rational” for me to always study as if I believe AGI is a mere two decades away. I am not yet strong enough to do this safely.
Third, I think this point of view underestimates the knock-on effects I mentioned earlier. Once you’ve crossed that bright line, once “instrumental rationality let me be Christian” is established, what else is left? Where is the Schelling fence for beliefs? I don’t know, but I think it’s better to be safe than sorry, especially in light of the first two points.