Welcome!

Not sure how relevant my advice can be, because I was never in your position. I was never religious. I grew up in a communist country, which is kinda similar to growing up in a cult, but I wasn’t a true believer of that either.
My prediction is that in the process of your change, you will fail to update on some points and overcompensate on others. Which is okay, because growing up happens in multiple iterations. What you do wrong in the first step, you can fix in the second one. As long as you keep some basic humility and admit that you may still be wrong, even after you got rid of your previous wrong ideas. Your current position is the next step in your personal evolution; it does not have to be the final step.
Here are some potential mistakes to avoid:
package fallacy: “either the Christianity I grew up in is 100% correct, or the rationalism as I understand it today is 100% correct”, or “either everything I believed in the past was 100% correct, or everything I believed in the past was 100% wrong”. Belief packages are collections of statements, some of them dependent on each other, but most of them independent. There is nothing wrong with choosing A and rejecting B from package 1, and choosing X and rejecting Y from package 2. Each statement is true or false individually. You can apply this to religious beliefs, political beliefs, beliefs of rationalists, etc. (This does not imply the fallacy of grey; some packages contain more true statements than others. You can still sometimes find a gem in a 90% wrong package, though.)
losing your cool. What is true is already true; and it all adds up to normality. Don’t kill yourself after reading about quantum immortality, don’t freak out after reading about the basilisk, don’t do anything crazy just because the latest article on LW or someone identifying as a rationalist told you so. Don’t burn bridges. Do reductionism properly: after learning that the apple is actually composed of atoms, you can still eat it and enjoy its taste. Evolution is a fact, but the goals of evolution are not your goals (for example, evolution doesn’t give a fuck about your suffering).
identification and tribalism. “Rationalists” are a tribe; rationality is not. Rationality does not depend on what rationalists believe; the entire point of rationality is doing things the other way round: changing your beliefs to fit the facts, not ignoring facts to fit in better with the tribe. What is true is true regardless of what rationalists believe.
There’s also a larger meta-issue here. I have a lifelong wholeness project of fighting perfectionism. It’s so ingrained in me that I’m pretty confident that fight will last my whole life. In that vein, this whole exercise could be seen as just another attempt to Do It Right The First Time™ and Never Make a Mistake®. So I do need to give myself a little freedom to screw this up, or I will really screw it up the way that I screwed up every relationship I never had before this. (Yes, I actually never dated anyone before this. I blame it on fear, shame & perfectionism + Evangelical sexual ethics taken a bit too far.)
Go one step more meta, and realize that perfectionism itself is imperfect (i.e. it does not lead to optimal outcomes in life). Drawing conclusions before gathering data is a mistake. It is okay to do the right thing, as long as it is actually the right thing instead of something that merely feels right (such as following the perfectionist rituals even when they lead to suboptimal outcomes). Relax (relaxation improves your life, how dare you ignore that).
Copying your partner’s opinions feels wrong, but hey, what can I do here? Offer you my opinion to copy instead? Heh.
If it’s an issue that I don’t have strong priors on and is not likely to significantly influence any major decisions I make with regard to her, I might as well just go with the flow and not complicate things unnecessarily.
You might also adopt the position “I don’t know”. It is a valid position if you really don’t know. Also, the point of having opinions is that they may influence your decisions. If something is too abstract to have an impact on anything, ignoring it may be the smart choice.
Thanks for your thoughts; they’re all good ones! I’ve actually already engaged with the Rationality literature enough to have encountered most of them (I’m about 2⁄3 through The Sequences at the moment).
I think after reading people’s responses to this post, I realize that the scenario I outlined here is even less likely than I originally thought. There are wrong ways to apply rationality, it’s true. But those are the failure modes @LeBleu alluded to. For everyone else, Rationality isn’t a destination, it’s a path. The updating is continuous. What happened for me is that I came from a different epistemological tradition and jumped ship to this one. Bushwhacking across terrain to get from one path to another is no fun. But now that I’m on a path, I’m not going to get into that kind of trouble again unless I leave the Rationality path entirely. So then the only question I need to be this worried about is whether the Rationality path is correct, and I’m pretty well convinced of that… but still willing to update, I suppose.
Go one step more meta, and realize that perfectionism itself is imperfect
The point about perfectionism is a good one. I’ve already recognized that perfectionism is not rational, though; it’s more of a compulsive behavior / default mental state: inherently assuming that information is free and being down on myself for not already knowing it and executing perfectly on it. Perhaps I actually can fully overcome that, but I’m not expecting to (which would be the perfectionist thing to do anyway ;)