I think your personal beliefs do matter. From my perspective, there is a big difference between “I believe that Jesus Christ lived on Earth and died for my sins and God really listens to my prayers”, “I believe that some entity exists in the universe with power greater than we can imagine”, “the entire universe is God” or “God is love.”
I’d add that how much rationality I ascribe to someone with a particular religious outlook has quite little correlation with our agreement on object-level beliefs. That is, I find a dogmatic Calvinist to be more likely to think rationally than a person with some vague hodgepodge of beliefs, although the latter will be more likely to agree with me on evolution and on social issues, because the former’s beliefs are (to some extent) optimized for consistency while the latter’s are generally not.
Are you saying that the difference between your examples is enough to include me or exclude me from LessWrong? Or is the difference in how you in particular relate to me here? What actions revolve around the differences you see in those examples?
I don’t think we would exclude someone solely on the basis of belief, as one of the goals here is to educate.
I’m not sure there is much action involved, but people might treat you differently if you admitted to being an evangelical Christian than if you said you believe because you are uncomfortable giving in to the nihilism of non-belief.
Edit: After rereading your post, yes, there are rational religious people. I have a few friends like that, and I think the most important part of being a rational religious person is admitting that the belief is irrational, rooted in culture or feelings of helplessness rather than convincing evidence. It’s a slippery slope though: if you keep thinking about it, you may find it hard to hold onto your belief.
Maybe in a few days you should make a top-level post about your beliefs, and we can try to examine the reasons why you believe the way you do and understand why you are comfortable with conflicting beliefs. No pitchforks, I promise; you seem to know the linguistic patterns to use here, so no one will pounce on you.
It’s a slippery slope though: if you keep thinking about it, you may find it hard to hold onto your belief.
If I cannot hold onto a belief it isn’t worth holding on to.
Maybe in a few days you should make a top-level post about your beliefs, and we can try to examine the reasons why you believe the way you do and understand why you are comfortable with conflicting beliefs. No pitchforks, I promise; you seem to know the linguistic patterns to use here, so no one will pounce on you.
My current plan is to inch into the heavy topics with a few basic posts about belief, doubt, and self-delusion. But I know some of these things are discussed elsewhere because I remember someone at OB talking about the plausibility of self-delusion.
In any case, I am still working through the Sequences. I expect a lot of my questions are answered there.
I agree with Kevin that belief is insufficient for exclusion/rejection. Best I can tell, it’s not so much what you believe that matters here as what you say and do: if you sincerely seek to improve yourself and make this clear without hostility, you will be accepted no matter the gap (as you have found with this post and previous comments).
The difference between the beliefs Kevin cited lies in the effect they may have on the perspective from which you can contribute ideas. Jefferson’s deism had essentially no effect on his political and moral philosophizing (at least, his work could easily have been produced by an atheist). Pat Robertson’s religiosity has a great deal of effect on what he says and does, and that would cause a problem.
The fact that you wrote this post suggests you are in the former category, and I for one am glad you’re here.
Best I can tell, it’s not so much what you believe that matters here as what you say and do
I agree with the rest of your comment, but this seems very wrong to me. I’d say rather that the unity we (should) look for on LW is usually more meta-level than object-level, more about pursuing correct processes of changing belief than about holding the right conclusions. Object-level understanding, if not agreement, will usually emerge on its own if the meta-level is in good shape.
Indeed, I agree; I meant that what matters is less which conclusions you hold than how you interact with people as you search for them.