Hello everyone, first post. My education level is Associate’s. My special skills include mathematics and reading comprehension.
I come to this website because, as I look at the rationalist techniques, I can't help thinking, "This is a skill that would be beneficial to learn." I have done some preliminary reading of some of the posts here and find that while a lot of it is rather chewy (that is, it takes extra time to process mentally), it is genuinely enjoyable to peruse and be made to think.
I have a question. Considering that I am religious, and I fully intend to stay that way despite any evidence that might suggest otherwise, how much will any rationalist skills I build up be hampered? I don't want to summon a religious discussion, so if it seems that I might be, please just think of it as Fixed Belief X. I understand that the ability to update beliefs is central to rationality, but one such belief doesn't seem crushing.
I ask because I want to make sure that I am actually obtaining value from my time. I don't want to discover, at some arbitrary point down the road, that my efforts have borne no fruit and that it was impossible from the beginning.
Well, the meta-level of what you said is "Updating beliefs when evidence is against them is not always beneficial." I think there are articles here that challenge this kind of meta-belief, although I cannot point to them; I am fairly new here. But I still see an issue: how exactly do you decide, by what algorithm, which of your other beliefs you want to update when evidence is against them and which not? It seems you will have two competing motives, to execute the truth-seeking algorithm and the belief-defending one, and they may weaken each other. Still, I think that with some compartmentalization it can work, but it may be difficult.
To put it in different words: you can simply put a taboo on full-on truth-seeking with regard to religion and let the truth-seeking algorithm run elsewhere. But you have a reason, an algorithm, for that taboo, maybe not a fully conscious one, and it may conflict with your truth-seeking algorithm in other fields in more subtle ways: perhaps not by handing out a clear, obvious taboo, but by biasing results. Or to be blunt: non-rationality has its reasons and methods too, and thus it leaks out of its compartments and contaminates.
Just my 2 cents, I am also a beginning learner here.
Depends on how much effect your religion has on you. I doubt you'll be any less rational if you go to church every day, although you may end up loathing it one day.
If anybody has a link to the post where Eliezer told a story about being told to "pray and (literally) stfu," you'll have a good example of how religion can screw up reasoning. You can still reason effectively within a religion regardless of how true it is, but you're probably going to encounter something that makes you say "this doesn't make sense," and one day you will encounter someone who WILL do something entirely paradoxical while wearing their chosen religious headwear.
To be fair, this kind of example is a bit extreme. I used to read edwardfeser.blogspot.com, and he fails at being an empiricist, but he does not fail at logical reasoning. His only, albeit catastrophic, failure is "X follows from the premises we accepted as true, hence reality works like X". Map versus terrain... However, even Feser could not make a useful rationalist because of this failure at empiricism: an unwillingness to step over the map-terrain gap, the language-reality gap.
Really, the primary problem of Feser-type smart theists is not that they cannot reason; it is that they believe too much in language. Theism almost follows from that failure mode: language is a mind-product, so when they believe reality works in such a way that arguments expressed in words, which tend to convince human minds, also happen to be true out there in reality, they almost assume there is a human-like mind behind the universe. Proper atheism starts with accepting that the universe does not give half a shit about our logic, reasoning, and intellectuality: we can find ideas perfectly convincing, and still they aren't true out there. But that is really hard, as it means throwing out much of our intellectual history and tradition. It is an incredibly huge gap for a culture shaped by e.g. Plato to say, and we MUST say this, "Your ideas convinced me perfectly. They are still not true."
No, it’s a great example of EVERYTHING (not just religion) going to shit because it basically says “don’t think, do”.
It's not any less harmful even if we remove religion from it. It can apply to... practically everything. I think it's a sound personal philosophy to know what the fuck you're actually doing. Hell, it's probably the first step in making a plan, and it's a step in every process of it.