Reading this, I wonder why an LDS missionary got interested in a rationalist community which is generally hostile to religion. I would appreciate some explanation of the author’s motivations.
Because there are things I can learn here. I can handle the hostility to religion. But if you don’t cross-pollinate, you become a hick.
It doesn’t seem to me to be possible to hold both rationality and religion in one’s head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.
I can actually quite easily accept that it could be a good idea for rationalists to adopt some of the community-building practices of religious groups, but I also think that rationality is necessarily corrosive to religion.
If you’ve squared that circle, I’d be interested to hear how. Being somewhat religious for the social bit but having excised the supernaturalism is the only stable state I can think of.
compartmentalization, which is one of the things rationality seeks to destroy
Yes, in the ideal limit, rationalists don’t compartmentalize. But decompartmentalizing too early, before you learn the skill of sanely resolving conflicts between beliefs and values in different compartments (rather than, for example, going with the one you feel more strongly), is a way to find the deepest, darkest crevices in the Valley of Rationality.
I also think that rationality is necessarily corrosive to religion.
I would agree that there is a level of rationality at which religious beliefs become impossible, though to the extent that a religious person takes their religion seriously, and expects effective rationality to produce accurate beliefs, they should not expect this until it actually happens to them. Though it does occur to me that helping rationalists to establish the level of social support available within the Mormon community (as calcsam is doing) is an effective way of establishing a line of retreat for a religious person who is uncertain about retaining their religious beliefs.
Simple question, but what exactly is meant by “religion” when you say “there is a level of rationality at which religious beliefs become impossible”? I’ve been wondering about this for a while, and find it unclear whether my spiritual side is simply “not actually religion” or if there’s just some huge chunk of rationality that I’m missing. Thus far, the two have felt entirely compatible for me.
I’ve been wondering about this for a while, and find it unclear whether my spiritual side is simply “not actually religion”.
I wonder if you could clarify for me what you mean by “spiritual” in “spiritual side”? I was raised as a Roman Catholic, and to me ‘spiritual’ means the other side of Descartes’s dualism—the non-physical side. So, for example, I learned that the Deity and angels are purely spiritual. But being human, my spiritual side is my immortal soul—which pretty much includes my mind.
I’m pretty sure you (and millions of other people who talk about spirituality) mean something different from this, but I have never been able to figure out what you all mean.
A definition of ‘spiritual’ is preferred, but failing that, could you taboo ‘spiritual’ and say what you meant by ‘spiritual side’ without using the word?
More or less, it’s schizophrenic/delusional episodes, with an awareness that this is in fact what they are. Mostly what I use ‘spiritual’ to refer to is that, during these episodes, I tend to pick up a strong sense of ‘purpose’: high-level goals end up developed. I have no clue how I develop these top-level goals, and I’ve never found a way to do it via rationality. Rationality can help me mediate conflicts between goals, conflicts between goals and reality, and help me achieve goals, but it doesn’t seem able to set those top-level priorities.
About the closest I’ve come to doing it rationally is to realise that I’m craving purpose, and do various activities that tend to induce this state. Guided meditation is ideal, since it seems to produce more ‘productive’ episodes. It varies heavily whether I will get any particularly useful purpose out of one of these episodes; many episodes are drifting and purposeless, and others result in either impossible goals or ‘applause light’ goals that have no actual substance attached.
Ostensibly I could try to infer my goals from my emotional preferences, which I’ve been slowly working on as an alternative. Being bipolar and having a number of other neurological instabilities makes it very difficult to get any sort of coherent mapping there, beyond very basic elements like ‘will to live’. Even those basics can be unstable: for about a year I had no real preference on my own survival due to a particularly bad schizophrenic episode.
I’d actually be rather curious how others handle the formation of top-level goals :)
I do also notice certain skills that I’m much more adept at when I’m having such an episode. I’ve observed this empirically, and can come up with rational explanations for it. I’m pretty certain the same results could be replicated rationally, either by studying the skills or by figuring out what I’m doing differently during the schizophrenic episodes. I don’t feel that ‘spiritual’ is necessarily a good label for this aspect; “intuition” or simply “changing my perceptual lens on reality” seem more accurate. I mention it here simply because it happens to stem from the same source (schizophrenic episodes).
I’d actually be rather curious how others handle the formation of top-level goals :)
I find I have very little emotion attached to my highest-level goals. I’m not sure but I think I derive them by abstracting from my lower-level goals, which are based more on habit and emotion, and from ideas I absorb from books, etc. I then use them to try and make my lower-level goals less contradictory.
Yeah, this does not seem to have much to do with what we are usually talking about when discussing religion, supernaturalism, etc.
FWIW, I typically use the term in a secular sense to refer to those with interests in items from this list:
meditation, religious experiences, drugs, altered states, yoga, chanting, buddhism, taoism, other eastern mysticism, martial arts and self-improvement.
One reason amongst many: inasmuch as your religion includes unquestionable dogma, it is anathema to rationality. (It is for this reason that, being a philosopher, I am non-religious for methodological reasons: dead dogma is not allowed.) Having a belief that you cannot question is effectively giving it a probability of 1, which will distort the rest of your Bayesian network in terrible ways. See Infinite Certainty.
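(To make that last point concrete, here is a minimal numerical sketch in Python, with invented likelihoods: under Bayes’ rule, even a 0.999 prior can be argued down by contrary evidence, but a prior of exactly 1 cannot be moved by any evidence at all.)

```python
# Bayes' rule: P(H|E) = P(H)P(E|H) / (P(H)P(E|H) + P(~H)P(E|~H)).
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of the hypothesis after seeing the evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Evidence that is 100x more likely if the belief is false:
for prior in (0.999, 1.0):
    print(prior, "->", update(prior, 0.001, 0.1))
# 0.999 -> ~0.909 : a near-certain belief, argued down a little
# 1.0   -> 1.0    : a probability-1 belief is immune to all evidence
```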
Having been raised Unitarian Universalist, I always find it very odd that “religion” is conflated with “unquestionable dogma”. I don’t think Unitarians have that any more than LessWrong does.
That said, if “religion” is being used as a shorthand for “unquestionable dogma”, then the comments about religion make significantly more sense :)
Having been raised Unitarian Universalist, I always find it very odd that “religion” is conflated with “unquestionable dogma”. I don’t think Unitarians have that any more than LessWrong does.
I highly doubt that. For one, a glance at a typical Unitarian web page will show a comprehensive and consistent list of left-wing ideological positions. Are you really claiming that if one were to express deep disagreement with those among the Unitarians, the reactions would be equally dispassionate, upfront, and open to argument as they usually are when the prevailing opinion is challenged on LW? (Not that LW is perfect in this regard either, but compared to nearly any other place, credit must be given where it’s due.)
Of course, some would claim that old-fashioned religious dogma is somehow incomparably worse and more irrational than modern ideological dogma, so much that the UU stuff doesn’t even deserve that designation. However, I don’t think this position is defensible, unless we insist on a rather tortured definition of “dogma.”
Are you really claiming that if one were to express deep disagreement with those among the Unitarians, the reactions would be equally dispassionate, upfront, and open to argument as they usually are when the prevailing opinion is challenged on LW?
The more I think about it, the more I find it difficult to answer this question. The main obstacle I’m running up against is that the two have very different communication styles, so the answer varies heavily depending on which communication style you’re seeking.
In my experience, LessWrong takes a very blunt, geeky approach to communication. It is also post-based, and thus neither real-time nor face-to-face. It’s very good at problem solving and science. People are likely to try and refute my stance, or treat it as a factual matter to be empirically tested.
Unitarian Universalist churches, by contrast, have been very polite and mainstream in their approach to communication. It’s also in-person, real-time interaction. They’re very good at making people feel welcome and accepted. People are likely to simply accept that I happen to believe differently than them. People are likely to treat strong assertions as an article of faith, and therefore not particularly worth challenging.
I can’t really find a way to translate between these two, so I can’t really compare them.
Viewed through a mainstream, polite filter, I see LessWrong as a place that is actively hateful of religion, and extremely intolerant towards it, to the point of being willing to reject perfectly useful ideas simply because they happen to come from a religious organization.
Viewed through the blunt, geeky filter, I see UUs as blindly accepting and unwilling to actually challenge and dig into an idea; I feel like I can have a very interesting discussion, but in many respects I’m a lot less likely to change someone’s mind (although, in other respects, I’d have a lot more luck using Dark Arts to manipulate a church-goer).
*the inherent worth and dignity of every person;
*justice, equity and compassion in human relations;
*world peace, liberty and justice for all;
*respect for the interdependent web of all existence;
*acceptance of one another and encouragement to spiritual growth in our congregations;
*a free and responsible search for truth and meaning; and
*the right of conscience and the use of the democratic process within our congregation and in society at large.
I would consider the first four to be values that are roughly shared with LessWrong, although there are definitely some differences in perspective. For the fifth, UUs focus on spiritual growth while LW focuses on growing rationality. The sixth principle is again shared. The seventh seems implemented in the LessWrong karma system, and I’d actually say LW does better here than the UUs.
It’s also worth noting that these are explicitly “shared values”, and not a creed. The general attitude I have seen is that one should show respect and tolerance even to people who don’t share these values.
LessWrong is a place for rationalists to meet and discuss rationality. UU Churches are a place for UUs to meet and discuss their shared values. It doesn’t serve LessWrong to have it dominated by “religion vs rationality” posts, nor posts trying to sell Christianity or de-convert rationalists. It doesn’t serve the UUs to have church dominated by challenges to those values.
This is a list of applause lights, not a statement of concrete values, beliefs, and goals. To find out the real UU values, beliefs, and goals, one must ask what exact arrangements constitute “liberty,” “justice,” etc., and what exact practical actions will, according to them, further these goals in practice. On these questions, there is nothing like consensus on LW, whereas judging by the uniformity of ideological positions espoused on the Unitarian/UU websites, there does seem to be a strong and apparently unchallenged consensus among them.
(To be precise, the applause lights list does include a few not completely vague goals, like e.g. “world peace,” but again, this says next to nothing without a clear position on what is likely to advance peace in practice and what to do when trade-offs are involved. There also seems to be one concrete political position on the list, namely democratism. However, judging by the responses seen when democracy is questioned on LW, there doesn’t seem to be a LW consensus on that either, and at any rate, even the notion of “democracy” is rather vague and weasely. I’m sure that the UU folks would be horrified by many things that have, or have historically had, firm democratic support in various places.)
judging by the uniformity of ideological positions espoused on the Unitarian/UU websites, there does seem to be a strong and apparently unchallenged consensus among them.
The core theme I’ve seen repeated across congregations is the “seven core principles” that I posted above. I’ve seen some degree of ideological consistency across those, but I’ve attended quite a few sermons discussing various perspectives on the seven core principles. It seems like a fairly common tradition to even invite speakers from other religions or affiliations to come and share their own thoughts.
Certainly there’s a bias towards those who are “compatible” with the group consensus, and some degree of “groupthink”. LessWrong has this as well, though: there’s a strong thread of anti-religion bias, and I’d say there’s a moderate pro-cryonics/singularity bias. I don’t see a lot of posts about how SIAI is a waste of time and money, or how Christianity is really misunderstood and we should come to embrace our Lord and Saviour, Jesus Christ.
Can you point to something specific in the UU literature that makes you feel that they’re less tolerant of dissent than LessWrong?
Can you point to something specific in the UU literature that makes you feel that they’re less tolerant of dissent than LessWrong?
Before I even click on a link to a Unitarian Universalist website, I know with very high probability that there is going to be a “social justice” section espousing ideological positions on a number of issues. And for any such section, I can predict with almost full certainty what precisely these positions will be before I even read any of it.
Now, the UU folks would probably claim that such agreement exists simply because these positions are correct. However, even if I agreed that all these positions are correct, given the public controversy over many of these issues, it would still seem highly implausible that such ideological uniformity could be maintained in practice in a group highly tolerant of dissent. In contrast, I see nothing comparable on LW.
You say:
LessWrong has this as well, though: there’s a strong thread of anti-religion bias, and I’d say there’s a moderate pro-cryonics/singularity bias. I don’t see a lot of posts about how SIAI is a waste of time and money, or how Christianity is really misunderstood and we should come to embrace our Lord and Saviour, Jesus Christ.
Actually, in my opinion, LW does have its collective quirks and blind spots, but you’re nowhere close to pinpointing them.
Regarding SIAI being a waste of time and money, I’ve seen such opinions raised in several threads without getting downvoted or otherwise creating any drama. (I can dig up some links if you insist.) As long as you make a polite and coherent argument, you won’t elicit any hostility by criticizing SIAI.
Regarding religious proselytism, that is generally considered impolite anywhere. On the other hand, I actually do believe that there is a lot of misunderstanding of religion on LW, in the sense of many people having a “reversed stupidity” attitude towards various religious teachings and beliefs, developing “applause lights” reactions to various loudmouth atheists who bash traditional religion but believe far crazier stuff instead, etc., etc. I have made arguments along these lines on occasions, and I’ve never encountered any hostility in response, just reasonable counterarguments.
Regarding cryonics, it may well be that the average opinion on LW is heavily biased in favor of it. But again, if you want to argue that cryonics is bunk, you’ll be welcome to do so as long as you have something new, intelligent, and well-informed to say about it. (In fact, I remember posts from people who solicited for anti-cryonics arguments.)
In contrast to these topics, one area that usually destroys the quality of discourse on LW is gender issues. This really is a recurring problem, but then, I seriously doubt that a diversity of views on these issues is welcome among UUs. Another problem is certain topics whose understanding requires familiarity with some peculiar theories that are discussed on LW occasionally, where certain (seemingly) very theoretical and far-fetched speculations are apparently taken seriously enough by some of the prominent people here that discussing them can lead to bizarre drama. None of this, however, comes anywhere close to the ideological uniformity that I observe among the Unitarian Universalists, at least judging from their internet presence.
Before I even click on a link to a Unitarian Universalist website, I know with very high probability that there is going to be a “social justice” section espousing ideological positions on a number of issues.
I suppose I should reiterate this, as it seems to be unclear: My point was not that UUs don’t have a degree of “group consensus.” My point was that they do not treat it as an unquestionable dogma.
That they generally have a “social values” page does not seem at all contradictory to this—the issue is whether they’re willing to entertain discussion from opposing views.
In my (anecdotal) experience as someone who has actually attended UU churches, the answer has been very strongly yes. If you have actual experiences to the contrary, or have seen websites from them that seem to make it vividly clear that dissent is not tolerated, I’d be genuinely curious to see this. It’s entirely possible that my experiences aren’t typical, but I haven’t seen any evidence to support that theory.
Tangentially: The discussion of actual issues and biases on LessWrong is appreciated. I’ve only been here briefly, so I haven’t really gotten to know the community that well yet.
This was sadly not clear in my original post, but my goal was to compare “looking at a public website” to “reading top-level posts”. I’ve never seen a top-level post supporting Christianity or condemning the SIAI here. On an individual level, I’m sure there are people that hold those stances, just as there are individual UU members who don’t agree with the values you’re seeing on the UU websites.
My point was simply “when you look at the ‘public face’ of an organisation, you’re going to see some degree of consensus, because that’s just how human organisations work”
The worldwide rationalist community has, for more than a century now, come to the conclusion that there is almost certainly no God. We consider the non-existence of God as usually defined (i.e. a sentient being who created the universe with intent, is still active in the universe, is omnipotent, omniscient, and omnibenevolent, and hears and sometimes answers prayers), to be so conclusively proven that there is usually no further need to discuss it.
...
We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all.
You don’t see a lot of posts about how gravity doesn’t really exist and it’s just the Flying Spaghetti Monster pushing us down with his tentacles, either.
Note the previous part of the sentence by Vladimir_M that you quoted: (emphasis added)
On these questions, there is nothing like consensus on LW
There’s a difference between consensus on empirical questions where the evidence falls overwhelmingly on one side, and consensus on higher-level ideological questions with a much less clear distribution of both evidence and arguments.
“You don’t see a lot of posts about how gravity doesn’t really exist and it’s just the Flying Spaghetti Monster pushing us down with his tentacles, either.”
And my original post:
I always find it very odd that “religion” is conflated with “unquestionable dogma”. I don’t think Unitarians have that any more than LessWrong does.
I’m not sure how pointing out that LessWrong explicitly has unquestionable dogma disproves my point… That LessWrong’s dogma is primarily about scientific/empirical/factual matters is simply a function of its focus: LessWrong is about that sort of thing, whereas Unitarian Universalism is about social justice, community, and spirituality.
So, when you put it that way, I’d actually say the UUs have vastly less questionable dogma.
I’m not sure how pointing out that LessWrong explicitly has unquestionable dogma disproves my point....
Nope. There’s a big difference between “settled issues where questioning is a waste of time and effort” and “arbitrary positions where questioning is declared heretical by some authority (either a person or social mores).”
LessWrong is about that sort of thing, whereas Unitarian Universalism is about social justice, community, and spirituality.
Well, yes. You’re defining this yourself: LessWrong is about “settled issues” of science, and therefore it’s okay to dismiss debate as a “waste of time and effort”. Unitarian Universalists are about significantly more arbitrary positions, and therefore there’s a lot more room for discussion, because people have different starting assumptions and/or goals.
Nope. There’s a big difference between “settled issues where questioning is a waste of time and effort” and “arbitrary positions where questioning is declared heretical by some authority (either a person or social mores).”
Science does have the advantage that, more or less, everyone is willing to accept the same starting assumptions. Social justice and morality do not run into that.
If you take the starting assumptions of the UUs as a given, then most of their stances are settled issues where questioning is a waste of time and effort. You can still have some really interesting discussions on corner cases and implementations, since the world is very chaotic and no one has yet managed to arrange a control group for controlled study :)
Of course, the UU stated stances are still fairly vague, so even within those, there’s the question of whether violence is ever okay, etc.
All this really boils down to the question:
“arbitrary positions where questioning is declared heretical by some authority (either a person or social mores).”
What evidence, exactly, do you have that Unitarian Universalists declare things ‘heretical’ significantly more often than LessWrong does?
Well, yes. You’re defining this yourself: LessWrong is about “settled issues” of science, and therefore it’s okay to dismiss debate as a “waste of time and effort”. Unitarian Universalists are about significantly more arbitrary positions, and therefore there’s a lot more room for discussion, because people have different starting assumptions and/or goals.
No, Less Wrong isn’t about settled issues, but they do come up fairly often in the course of relevant discussions. Separate magisteria arguments fail because they imply that consensus can be found based on different standards of evidence for different areas of discussion. Every area needs to be held to the same standard.
If you take the starting assumptions of the UUs as a given, then most of their stances are settled issues where questioning is a waste of time and effort. You can still have some really interesting discussions on corner cases and implementations, since the world is very chaotic and no one has yet managed to arrange a control group for controlled study :)
I’m not sure what the UU starting assumptions are. However, it seems unlikely that they are only terminal values, so standards of evidence should apply.
What evidence, exactly, do you have that Unitarian Universalists declare things ‘heretical’ significantly more often than LessWrong does?
The point of the first post that I made in this chain is that coming to a consensus based on overwhelming evidence is not the same as declaring something heretical.
You seem to be pursuing two lines of argument. In some places you’re just asserting that UU does not have dogmatic elements, in contradiction to Vladimir_M’s observations. That’s a separate conversation, and not really my concern.
In other places, though, you’re asserting that LW does have dogmatic elements. I have two problems with this. First, it’s not accurate, as I’ve explained. Second, taking the two lines of argument together, it sounds like you’re saying “UU doesn’t have dogma… and anyway, LW does too!” The two clearly aren’t consistent, so which is it?
Just to be clear, my main point is that LW doesn’t have dogma or declare things heretical, not that UU does (although I think it might approach those things in some areas). For that point, I’m providing examples and descriptions of the difference between consensus based on overwhelming evidence and arbitrary dogma. Dogma is arbitrarily absolute; it’s something to be questioned under no circumstances. Consensus based on evidence is a matter of Bayesian updating.
The two clearly aren’t consistent, so which is it?
Different definitions of dogma. The easiest translation would be “based on this usage of the word dogma, neither the UUs nor LW have it. Based on this other usage of the word dogma, both the UUs and LW seem to have it about equally. I can’t see any evidence that either definition results in the UUs having more dogma, and I can’t think of a third definition that makes sense, so I’m not sure why you’re insisting that the UUs are more dogmatic”.
English sucks for handling different definitions of the same word, and my brain does a wonderful job of not noticing when I’ve done this ^^;
Just to be clear, my main point is that LW doesn’t have dogma or declare things heretical, not that UU does
Ahh, okay. Then I think we’re actually on the same page. I was reading your “arbitrary absolutes” as being a reference to the UUs specifically. This makes much more sense now :)
An unchallenged consensus on positions of social policy, which are complicated and generally do not have conclusive evidence on one side of an argument, indicates the existence of some reinforcing social mores.
Edit: the comment at which this reply was directed was significantly altered after I typed this reply. Please hold on while I attempt to catch up.
I think we might have ended up off-track, so let me try to sum up my stance:
1) Unitarian Universalists, by default, must have “arbitrary positions” because they are not discussing settled matters. Therefore, the fact that they have arbitrary positions in and of itself is simply a function of their focus; all social justice groups will run into this issue, whether they are religious or not.
2) Unitarian Universalists do not demonstrate any particular tendency towards an environment where “questioning is declared heretical by some authority”. Unitarians are “dispassionate, upfront, and open to argument” on roughly the same level as LessWrong.
What I would be interested in hearing is actual evidence that I could use to update either of these.
To the previous evidence offered: I do not understand how having a consistent stance on an organisational level is evidence that they are close-minded or otherwise less open to discussing and debating opposing viewpoints.
If your thought process consists entirely of “having a consistent organisational stance means you have unquestionable dogma”, then I think we are either running into a definitions issue, or will have to agree to disagree. Otherwise I’d be curious if you can elaborate on the missing pieces.
I think we might have ended up off-track, so let me try to sum up my stance:
I did the same in my new reply to your previous post. Let me just address one side point:
Unitarian Universalists, by default, must have “arbitrary positions” because they are not discussing settled matters. Therefore, the fact that they have arbitrary positions in and of itself is simply a function of their focus; all social justice groups will run into this issue, whether they are religious or not.
The best method of operation for a social justice group which wishes to find optimal conclusions may be to hold off on proposing solutions. Getting stuck in a position that’s incorrect or not useful seems like a serious concern. There shouldn’t necessarily be a consensus position on a given issue, regardless of the goal of the group.
The best method of operation for a social justice group which wishes to find optimal conclusions may be to hold off on proposing solutions.
Mmm, my gut response is that there are not a lot of solved social issues, so this wouldn’t be very useful for a social justice group that actually wanted to get things done. The UUs have been fairly politically active in spreading their values for a while, and I haven’t seen any evidence that their politics is particularly ineffective for their values.
For clarity: How do you think the members of your local UU congregation would react if one of their members turned up one day and said something along the lines of “you know, I’ve been thinking about it and doing the math, and it looks to me like war is actually pretty useful, instrumentally—it seems like it saves more lives than it takes, and at least in places with recruitment methods like ours, people who choose to be soldiers seem to get a fairly good deal out of it on average”?
Or did you mean the kind of policies that count as “left wing” in the US, and liberal/moderate/centre-left everywhere else?
“Everywhere else”? I hate to break the news, but there are other places under the Sun besides the Anglosphere and Western Europe! In most of the world, both by population and surface area, and including some quite prosperous and civilized places, many UU positions would be seen as unimaginably extremist. (Try arguing their favored immigration policies to the Japanese, for example.)
You are however correct that in other Western/Anglospheric countries, the level of ideological uniformity in the political mainstream is far higher than in the U.S., and their mainstream is roughly similar to the UU doctrine on many issues, though not all. (Among their intellectual elites, on the other hand, Unitarian Universalism might as well be the established religion.)
In any case, I didn’t say that the UUs had the most extreme left-wing positions on everything. On the contrary, what they espouse is roughly somewhere on the left fringe of the mainstream, and more radical leftist positions are certainly conceivable (and held by some small numbers of people). What is significant for the purposes of this discussion is the apparent ideological uniformity, not the content of their doctrine. My points would hold even if their positions were anywhere to the left or right of the present ones, as long as they were equally uniform.
Point taken, and thanks for the interesting link. Googling around a bit more, it seems like there are a few groups like these, but they are small and extreme outliers without influence and status. Before writing my above comments, I checked out the links on the first few search pages that come up when you google “Unitarian Universalist,” and I definitely encountered perfectly predictable and uniform positions advocated on those.
Yes, I have rummaged around his website already. There is some interesting stuff there.
Interestingly, in the “Market for Sanctimony” article, he confirms my impressions about Unitarian Universalism, contrary to the claims of User:handoflixue:
Officially, UU does not have a creed. A consequence of this is that any psychological needs that depend on getting together with co-believers are likely to be frustrated at a UU church. This in turn leads people to promote hard left-wing politics as an unofficial creed. [...] Thus a church that prides itself on not asking people to check their minds at the door ends up doing it anyway, just in a different fashion.
he confirms my impressions about Unitarian Universalism, contrary to the claims of User:handoflixue:
My claim was about unquestionable dogma, and the UUs as a whole. I’m not sure how we can still be having this debate after someone else provided you links to UUs who question the dogma...
My concern is about using the term “left wing” in contexts that have nothing to do with socialism. Being pro immigration is also a policy of some libertarians, so that doesn’t qualify.
Having been raised Unitarian Universalist, I always find it very odd that “religion” is conflated with “unquestionable dogma”. I don’t think Unitarians have that any more than LessWrong does.
I was raised a Unitarian Universalist too, by agnostic parents. It probably has a lot to do with my generally positive attitude towards religion. (I now sing in a High Anglican church choir and attend services regularly mostly because I find it benefits my mental health.)
I’d be happy to answer that. But for purposes of keeping the thread more on the community organization topic, I wanted to channel discussion of my religious beliefs over on this discussion thread. Would you like to repost your comment over there?
Hmm? Thomas Bayes was a Presbyterian minister, C. S. Peirce was Episcopalian and Newton was an unorthodox Christian described as “highly religious”. I’d be more interested in seeing a list of esteemed rationalists who were not religious compared to such a list that were religious. In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.
While I think there exists a level at which mainstream religious faith is inimical to epistemic rationality, I also think it’s most likely a pretty advanced level, higher than most if not all of the regulars here have attained. (Note however that people can and do give up religion on grounds of rationality before hitting that level.) It’s certainly possible to make substantial contributions to the advancement of human rationality in its present state while also being a theist, and that was still truer a few hundred years ago when the foundations of the art were being laid.
That being said, there’s also a distinction to be made between esteemed rationalists and esteemed scientists or mathematicians whose work contributed indirectly to LW-method rationality. Of the people that worked on the early foundations of statistics, Laplace is the only one I can think of offhand that strikes me as having had strong public commitments to rationality in this site’s usual sense.
In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.
More generally, “In any case, it is pretty clear that it is possible to hold rationality and irrationality in your head at the same time. This is basically how most people operate.” I’m no more surprised to hear about a religious rationalist than I am when I notice yet another of my own irrational beliefs or practices.
He must be talking about LW-style rationality or X-rationality, as distinguished from traditional rationality. And learning about X-rationality has been known to deconvert people on whom the traditional-rationality-based arguments of Dawkins and skeptics didn’t work. And then there are additional arguments for why X-rationality is the real thing and deserves to be called just rationality.
It seems that there are significant numbers of Jews, Anglicans and Shintoists that don’t believe in the theology and the supernatural stuff, but still identify as members of the religion and follow the traditions (I don’t know if there are any of those among Mormons, though after hearing calcsam I’d increase my expectation).
It seems that there are significant numbers of Jews, Anglicans and Shintoists that don’t believe in the theology and the supernatural stuff, but still identify as members of the religion and follow the traditions
Culture is pretty strong. My girlfriend oscillates between Christianity and Paganism and is an active member of the local Church of England. They’re currently trying to draft her as a volunteer for all sorts of things (the standard punishment for public display of sanity and competence). I’m a sceptical atheist and I’m on the fringes of, but still pretty much part of, said church community. Mind you, the C of E is hardly that unfriendly to atheists … and Richard Dawkins still visits his, and neither it nor he catches fire as he walks in …
Or: Yes, religion is supposedly about the silly ideas and mad old books, but only works if it’s a community, and these will often include people who expressly repudiate the silly ideas and mad old books.
“If the Church of England relied on Christians, it’d be sharing a room with the Flat Earth Society”—Shelley (TV show), quoted from memory.
I can think of at least two other stable states—in one, you’ve had an experience that has acted as strong Bayesian evidence for you of the existence of $DEITY, but which is either a purely subjective experience or which is non-repeatable. As an example of this class of event, if I were to pray “Oh Lord, give me enough money to never have to work again” and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.
Another stable state might be someone who has been convinced by Frank Tipler’s Omega Point hypothesis. Tipler himself is now clearly extremely irrational, but the hypothesis itself is taken seriously enough by people like David Deutsch (who is one of the less obviously-egregiously-stupid public intellectuals) that it’s not obviously dismissable out-of-hand.
I’m sure there are others, too.
EDIT—when I said “in the next five years” I meant to type “the next five minutes”, which would of course be much stronger evidence.
As an example of this class of event, if I were to pray “Oh Lord, give me enough money to never have to work again” and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.
Do you really think that would be enough? Even if you don’t think that the God hypothesis has a truly massive prior probability to overcome, you’d still have to reconcile this with the fact that most prayers for improbable things go unanswered, to the point that nobody has ever provided a convincing statistical demonstration that it has any effect except on people who know that prayers have been made.
Taking this as sufficient Bayesian evidence to hold a belief in God seems like believing that a die is weighted because your roll came up a six, when you know that it’s produced an even distribution of numbers in all its rolls together.
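(A toy version of that calculation in Python; every number here is invented for illustration, not an estimate of anything. The point is that a single striking hit has to fight both the low prior and the base rate of unanswered prayers.)

```python
# Toy numbers only: how much does one dramatic "answered prayer" move you?
prior = 1e-6              # invented prior for the God hypothesis
p_hit_given_god = 0.01    # even a prayer-answering God ignores most requests
p_hit_given_luck = 1e-4   # sudden-bestseller luck strikes somebody regardless

posterior_odds = (prior / (1 - prior)) * (p_hit_given_god / p_hit_given_luck)
posterior = posterior_odds / (1 + posterior_odds)
print(f"posterior = {posterior:.6f}")  # ~0.0001: a 100x update, still tiny

# The die analogy: one six tells you almost nothing when the die's
# overall record is a uniform distribution of faces.
```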
It doesn’t seem to me to be possible to hold both rationality and religion in one’s head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.
As an example of this class of event, if I were to pray “Oh Lord, give me enough money to never have to work again” and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.
The reason why rationality destroys religion is precisely because there is no evidence of this kind. It’s not a priori impossible to hold rationality and religion decompartmentalised in one’s head, but it is impossible in this universe.
Even in that case, a very powerful being messing with you is more likely than an uncaused, ontologically fundamental very powerful being (and not just because of the conjunction—a caused, reducible very powerful being is far more likely). Or did you just mean that this point was less obvious, so it would be harder for someone to realize that they were wrong?
Tipler himself is now clearly extremely irrational
Was he more rational before? (I did read two of his books a while back, and I remember being very excited beforehand and very disappointed afterwards, but I can’t remember enough specifics to say why.)
I believe so. His career path seems to go:
70s—studies with John Wheeler, makes some small but clever contributions to cosmology and relativistic physics.
80s—Co-writes widely praised book The Anthropic Cosmological Principle with John Barrow, first suggests Omega Point hypothesis
90s—Writes The Physics Of Immortality, laying out the Omega Point hypothesis in much more detail and explicitly identifying the Omega Point with God. People think this is clever but going a little far. Tipler’s contract for a textbook on gravitation gets cancelled and the university at which he has tenure stops giving him pay-rises.
2000s—Writes The Physics Of Christianity, in which he suggests cloning Jesus from the Turin Shroud so we can learn how he annihilated baryons, becomes a referee for creationist journals and an occasional right-wing commentator, and argues that Barack Obama is evil because the luminiferous aether is real and because of a bit of the film Starship Troopers.
The criticism of Obama was slightly more coherent than that. The Tribe paper in question really was an example of the common attempt by people to take ideas from math and physics and apply them as strong metaphors in other areas, in ways that are really unhelpful and at best silly. In that regard, most of Tipler’s criticism was spot on.
he suggests cloning Jesus from the Turin Shroud so we can learn how he annihilated baryons, becomes a referee for creationist journals and an occasional right-wing commentator, and argues that Barack Obama is evil because the luminiferous aether is real and because of a bit of the film Starship Troopers.
Ok that’s really...random. (Overused and underdefined word but that was the response my brain gave me).
But then he’ll come out with a piece of utterly lucid reasoning on applying Bayes’ theorem to the Born probabilities like http://arxiv.org/abs/quant-ph/0611245 . Very, very strange man.
Wow. While I’m unsurprised that Tipler would take issue with yet another poetical injection of something that superficially looks like quantum physics into yet another unrelated subject area, I’m more surprised that he’d express it in such a bizarre manner. There’s a whole paragraph where he name-drops his academic genealogy. And then he acts like Obama is making these claims, when at best he contributed “analytic and research assistance”, whatever that means.
I read The Physics of Immortality as an undergrad in ’04 and was skeptical of his major claims. I’m disappointed by his downward spiral into crackpot territory.
Speaking solely for myself, I’ve found that my spiritual / religious side helps me to set goals and to communicate with my intuitions. Rationality is simply a tool for implementing those goals, and processing/evaluating that intuitive data.
I’ve honestly found the hostility towards “spirituality writ large” here rather confusing, as the majority of the arguments seem to focus on a fairly narrow subset of religious beliefs, primarily Christian. I tend to write it off as a rather understandable bias caused by generalizing from “mainstream Christianity”, though, so it doesn’t really bother me. When people present actual arguments, I do try and listen in case I’ve missed something.
Or, put another way: Rationality is for falsifiable aspects of my life, and spirituality is for the non-falsifiable aspects of my life. I can’t have “incorrect” goals or emotions, but I can certainly fail to handle them effectively.
If ‘spirituality’ helps you to handle these things effectively, that is empirically testable. It is not part of the ‘non-falsifiable’ stuff. In fact, whatever you find useful about ‘spirituality’ is necessarily empirical in nature and thus subject to the same rules as everything else.
Most of the distaste for ‘spirituality’ here comes from a lack of belief in spirits, for which good arguments can be provided if you don’t have one handy. If your ‘spirituality’ has nothing to do with spirits, it should probably be called something else.
Hmmmmm, I’d never considered the idea of trying to falsify my goals and emotions before. Now that the idea has been presented, I’m seeing how I can further integrate my magical and rational thinking, and move to a significantly more effective and rational standpoint.
There are stats on the effects of religion on a population that practices said religion. This should give some indication of the usefulness of any spirituality.
You can have goals that presuppose false beliefs. If I want to get to Heaven, and in fact there is no such place, my goal of getting to Heaven at least closely resembles an “incorrect goal”.
This raises an interesting question—if a Friendly AI or altruistic human wants to help me, and I want to go to Heaven, and the helper does not believe in Heaven, what should it do? So far as I can tell, it should help me get what I would want if I had what the helper considers to be true beliefs.
In a more mundane context, if I want to go north to get groceries, and the only grocery store is to the south, you aren’t helping me by driving me north. If getting groceries is a concern that overrides others, and you can’t communicate with me, you should drive me south to the grocery store even if I claim to want to go north. (If we can exchange evidence about the location of the grocery store, or if I value having true knowledge of what you find if you drive north, things are more complicated, but let’s assume for the purposes of argument that neither of those hold.)
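(A sketch of that helper rule in Python; the names and the toy map are hypothetical, just to pin down the decision procedure: optimize the stated goal under the helper’s world-model, not the helpee’s possibly mistaken one.)

```python
# Toy decision rule: help someone get what they would want
# if they had what the helper considers to be true beliefs.
# All names and map contents are hypothetical.

helper_world_model = {"grocery_store": "south"}  # what the helper believes
helpee_world_model = {"grocery_store": "north"}  # the mistaken belief
stated_goal = "grocery_store"

def helpful_action(goal, world_model):
    """Drive toward where the goal is located, per the given world-model."""
    return "drive " + world_model[goal]

print(helpful_action(stated_goal, helper_world_model))  # drive south
# Driving north would honor the helpee's words but frustrate their goal.
```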
This leads to the practical experiment of asking religious people what they would do differently if their God spoke to them and said “I quit. From now on, the materialists are right, your mind is in your brain, there is no soul, no afterlife, no reincarnation, no heaven, and no hell. If your brain is destroyed before you can copy the information out, you’re gone.” If a religious person says they’d do something ridiculous if God quit, we have a problem when implementing an FAI, since the FAI would either believe in Heaven or be inclined to help religious people do something ridiculous.
So far, I’ve had one Jehovah’s Witness say he couldn’t imagine God quitting. Everyone else said they wouldn’t do much different if God quit.
If you do this experiment, please report back.
It would be a problem if there were many religious people who would apparently want to commit suicide if their God quit: the FAI convinces itself that there is no God, so it helpfully goes and kills them.
Erm, that’s supposing the religious person would actually want to suicide or do the ridiculous thing, rather than this itself being an expression of belief, affirmation, and argument of the religion. (I.e., as appeal to consequences, or saying negative things about the negation.)
Erm, that’s supposing the religious person would actually want to suicide or do the ridiculous thing, rather than this itself being an expression of belief, affirmation, and argument of the religion. (I.e., as appeal to consequences, or saying negative things about the negation.)
The most reasonable interpretation I can find for your statement is that you’re responding to this:
If a religious person says they’d do something ridiculous if God quit, we have a problem when implementing an FAI, since the FAI would either believe in Heaven or be inclined to help religious people do something ridiculous.
I agree, the goal would be to figure out what they would want if their beliefs were revised, and revising their circumstances so that God puts Himself out of the picture isn’t quite the same as that.
The experiment also has other weaknesses:
eBay bidding shows that many people can’t correctly answer hypothetical questions. Perhaps people will accidentally give false information when I ask.
The question is obviously connected with a project related to atheism. Perhaps some religious people will give false answers deliberately because they don’t want projects related to atheism to succeed.
The relevant question is what the FAI thinks they would want if there were no God, not what they think they would want. A decent FAI would be able to do evolutionary psychology and many people can’t, especially religious people who don’t think evolution happened.
It’s not a real experiment. I’m not systematically finding these people, I’m just occasionally asking religious people what they think. There could easily be a selection effect since I’m not asking this question of random religious people.
We are at high risk of arguing about words, and I don’t wish to do that.
Describe specifically what you do when you’re using your spiritual side. Assign it a label other than “spirituality” or “religious”. Then I can give you my opinion. As stated your comment is noise.
You can have incorrect subgoals in that they fail to help you achieve the goals towards which they are supposed to aim.
According to one popular view, you can have incorrect emotions—and this is important, as our emotions have a great deal to do with our ability to be rational. To quote:
Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts. If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, and it is hot, the Way opposes your calm. Evaluate your beliefs first and then arrive at your emotions. Let yourself say: “If the iron is hot, I desire to believe it is hot, and if it is cool, I desire to believe it is cool.” Beware lest you become attached to beliefs you may not want.
I can’t have “incorrect” goals or emotions, but I can certainly fail to handle them effectively.
Maybe you disagree, but from what I’ve seen, a large subset of the LW population thinks that both goals and emotions can and should be modified if they are sub-optimal.
I can see handoflixue’s logic, and your appeal to popularity does not defeat it. It makes LW seem to be irrational. To directly answer the logic, remind handoflixue that goals form a hierarchy of goals and subgoals, and a subgoal can be incorrect relative to a goal. Similarly, emotions can be subservient to goals. For example, anger can serve the goal of self-protection. A specific feeling of anger can then be judged as correct or incorrect depending on whether it serves this goal.
Finally, all of our conscious goals can be judged from the standpoint of natural selection. And conversely, a person may judge natural selection from the point of view of his conscious goals.
To directly answer the logic, remind handoflixue that goals form a hierarchy of goals and subgoals, and a subgoal can be incorrect relative to a goal.
That...seems true. I guess I’ve never divided my goals into a hierarchy, and I often find my emotions annoying and un-useful. I think my comment holds more true for emotions than for goals, anyway. I’ll have to think about this for a while. It’s true that although I have tried to modify my top-level goals in the past, I don’t necessarily do it because of rationality.
If you present your best case for LDS, and no one here considers it persuasive enough to convert to LDS, will you take this as strong evidence that you are mistaken? Enough to relinquish LDS?
if you present your best case for LDS, and no one here considers it persuasive enough to convert to LDS,
Other people understand this differently (e.g., I cannot speak for calcsam), but as far as I am concerned the goal should never be to persuade anyone to be LDS, but to present the message, let them know that you know it is true, and invite them to find out for themselves if it is true, then answer any questions they may have. Then it is to let the Spirit do whatever else it will to bring about a conversion.
Because there are things I can learn here. I can handle the hostility to religion. But if you don’t cross-pollinate, you become a hick.
It doesn’t seem to me to be possible to hold both rationality and religion in one’s head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.
I can actually quite easily accept that it could be a good idea for rationalists to adopt some of the community-building practices of religious groups, but I also think that rationality is necessarily corrosive to religion.
If you’ve squared that circle, I’d be interested to hear how. Being somewhat religious for the social bit but having excised the supernaturalism is the only stable state I can think of.
Yes, in the ideal limit, rationalists don’t compartmentalize. But decompartmentalizing too early, before you learn the skill of sanely resolving conflicts between beliefs and values in different compartments (rather than, for example, going with the one you feel more strongly), is a way to find the deepest, darkest crevices in the Valley of Rationality.
I would agree that there is a level of rationality at which religious beliefs become impossible, though to the extent that a religious person takes their religion seriously, and expects effective rationality to produce accurate beliefs, they should not expect this until it actually happens to them. Though it does occur to me that helping rationalists to establish the level of social support available within the Mormon community (as calcsam is doing) is an effective way of establishing a line of retreat for a religious person who is uncertain about retaining their religious beliefs.
Simple question, but what exactly is meant by “religion” when you say “there is a level of rationality at which religious beliefs become impossible”? I’ve been wondering about this for a while, and find it unclear whether my spiritual side is simply “not actually religion” or if there’s just some huge chunk of rationality that I’m missing. Thus far, the two have felt entirely compatible for me.
I wonder if you could clarify for me what you mean by “spiritual” in “spiritual side”? I was raised as a Roman Catholic, and to me ‘spiritual’ means the other side of Descartes’s dualism—the non-physical side. So, for example, I learned that the Deity and angels are purely spiritual. But being human, my spiritual side is my immortal soul—which pretty much includes my mind.
I’m pretty sure you (and millions of other people who talk about spirituality) mean something different from this, but I have never been able to figure out what you all mean.
A definition of ‘spiritual’ is preferred, but failing that, could you taboo ‘spiritual’ and say what you meant by ‘spiritual side’ without using the word?
More or less, it’s schizophrenic/delusional episodes, with an awareness that this is in fact what they are. Mostly what I use ‘spiritual’ to refer to is that, during these episodes, I tend to pick up a strong sense of ‘purpose’ - high level goals end up developed. I have no clue how I develop these top-level goals, and I’ve never found a way to do it via rationality. Rationality can help me mediate conflicts between goals, conflicts between goals and reality, and help me achieve goals, but it doesn’t seem able to set those top-level priorities.
About the closest I’ve come to doing it rationally is to realise that I’m craving purpose, and do various activities that tend to induce this state. Guided meditation is ideal, since it seems to produce more ‘productive’ episodes. It varies heavily whether I will get any particularly useful purpose out of one of these episodes; many episodes are drifting and purposeless, and others result in either impossible goals or ‘applause light’ goals that have no actual substance attached.
Ostensibly I could try to infer my goals from my emotional preferences, which I’ve been slowly working on as an alternative. Being bi-polar and having a number of other neurological instabilities makes it very difficult to get any sort of coherent mapping there, beyond very basic elements like ‘will to live’. Even those basics can be unstable: For about a year I had no real preference on my own survival due to a particularly bad schizophrenic episode.
I’d actually be rather curious how others handle the formation of top-level goals :)
I do also notice certain skills that I’m much more adept at when I’m having such an episode. I’ve observed this empirically, and can come up with rational explanations for it. I’m pretty certain the same results could be replicated rationally, either by studying the skills or by figuring out what I’m doing different during the schizophrenic episodes. I don’t feel that ‘spiritual’ is necessarily a good label for this aspect; “intuition” or simply “changing my perceptual lens on reality” seem more accurate. I mention it here simply because it happens to stem from the same source (schizophrenic episodes)
I find I have very little emotion attached to my highest-level goals. I’m not sure but I think I derive them by abstracting from my lower-level goals, which are based more on habit and emotion, and from ideas I absorb from books, etc. I then use them to try and make my lower-level goals less contradictory.
Yeah, this does not seem to have much to do with what we are usually talking about when discussing religion, supernaturalism, etc.
FWIW, I typically use the term in a secular sense to refer to those with interests in items from this list:
meditation, religious experiences, drugs, altered states, yoga, chanting, buddhism, taoism, other eastern mysticism, martial arts and self-improvement.
One reason amongst many: inasmuch as your religion includes unquestionable dogma, it is anathema to rationality. (It is for this reason that, being a philosopher, I am non-religious on methodological grounds; dead dogma is not allowed.) Having a belief that you cannot question is effectively giving it a probability of 1, which will distort the rest of your Bayesian network in terrible ways. See Infinite Certainty.
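To make the probability-1 point concrete, here is a minimal sketch in Python (the likelihoods 0.001 and 0.1 are made-up numbers, purely for illustration): under Bayes’ rule, a prior of exactly 1 returns a posterior of 1 no matter how lopsided the evidence against the belief, while a merely confident prior still updates.

```python
# Minimal sketch, hypothetical numbers: Bayes' rule with an
# "unquestionable" prior of 1.0 versus a merely confident prior.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) via Bayes' rule."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Evidence that is 100x more likely if the belief is false:
for prior in (1.0, 0.99):
    print(prior, "->", round(posterior(prior, 0.001, 0.1), 3))
# 1.0  -> 1.0    (no evidence can ever move it)
# 0.99 -> 0.497  (a questionable belief updates as it should)
```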
Having been raised Unitarian Universalist, I always find it very odd that “religion” is conflated with “unquestionable dogma”. I don’t think Unitarians have that any more than LessWrong does.
That said, if “religion” is being used as a shorthand for “unquestionable dogma”, then the comments about religion make significantly more sense :)
I highly doubt that. For one, a glance at a typical Unitarian web page will show a comprehensive and consistent list of left-wing ideological positions. Are you really claiming that if one were to express deep disagreement with those among the Unitarians, the reactions would be as dispassionate, upfront, and open to argument as they usually are when the prevailing opinion is challenged on LW? (Not that LW is perfect in this regard either, but compared to nearly any other place, credit must be given where it’s due.)
Of course, some would claim that old-fashioned religious dogma is somehow incomparably worse and more irrational than modern ideological dogma, so much that the UU stuff doesn’t even deserve that designation. However, I don’t think this position is defensible, unless we insist on a rather tortured definition of “dogma.”
The more I think about it, the more I find it difficult to answer this question. The main obstacle I’m running up against is that the two have very different communication styles, so the answer varies heavily depending on which communication style you’re seeking.
In my experience, LessWrong takes a very blunt, geeky approach to communication. It is also post-based, and thus neither real-time nor face-to-face. It’s very good at problem solving and science. People are likely to try to refute my stance, or treat it as a factual matter to be empirically tested.
Unitarian Universalist churches, by contrast, have been very polite and mainstream in their approach to communication. It’s also in-person, real-time interaction. They’re very good at making people feel welcome and accepted. People are likely to simply accept that I happen to believe differently than they do, and to treat strong assertions as articles of faith, and therefore not particularly worth challenging.
I can’t really find a way to translate between these two, so I can’t really compare them.
Viewed through a mainstream, polite filter, I see LessWrong as a place that is actively hateful of religion, and extremely intolerant towards it, to the point of being willing to reject perfectly useful ideas simply because they happen to come from a religious organization.
Viewed through the blunt, geeky filter, I see UUs as blindly accepting and unwilling to actually challenge and dig into an idea; I feel like I can have a very interesting discussion, but in many respects I’m a lot less likely to change someone’s mind (although, in other respects, I’d have a lot more luck using Dark Arts to manipulate a church-goer).
Well, there are seven formal UU values:
* The inherent worth and dignity of every person;
* justice, equity and compassion in human relations;
* world peace, liberty and justice for all;
* respect for the interdependent web of all existence;
* acceptance of one another and encouragement to spiritual growth in our congregations;
* a free and responsible search for truth and meaning; and
* the right of conscience and the use of the democratic process within our congregation and in society at large.
I would consider the first four to be values that are roughly shared with LessWrong, although there are definitely some differences in perspective. For the fifth, UUs focus on spiritual growth where LW focuses on growing rationality. The sixth principle is again shared. The seventh seems implemented in the LessWrong karma system, and I’d actually say LW does better here than the UUs.
It’s also worth noting that these are explicitly “shared values”, and not a creed. The general attitude I have seen is that one should show respect and tolerance even to people who don’t share these values.
LessWrong is a place for rationalists to meet and discuss rationality. UU Churches are a place for UUs to meet and discuss their shared values. It doesn’t serve LessWrong to have it dominated by “religion vs rationality” posts, nor posts trying to sell Christianity or de-convert rationalists. It doesn’t serve the UUs to have church dominated by challenges to those values.
This is a list of applause lights, not a statement of concrete values, beliefs, and goals. To find out the real UU values, beliefs, and goals, one must ask what exact arrangements constitute “liberty,” “justice,” etc., and what exact practical actions will, according to them, further these goals in practice. On these questions, there is nothing like consensus on LW, whereas judging by the uniformity of ideological positions espoused on the Unitarian/UU websites, there does seem to be a strong and apparently unchallenged consensus among them.
(To be precise, the applause lights list does include a few not completely vague goals, like e.g. “world peace,” but again, this says next to nothing without a clear position on what is likely to advance peace in practice and what to do when trade-offs are involved. There also seems to be one concrete political position on the list, namely democratism. However, judging by the responses seen when democracy is questioned on LW, there doesn’t seem to be a LW consensus on that either, and at any rate, even the notion of “democracy” is rather vague and weasely. I’m sure that the UU folks would be horrified by many things that have, or have historically had, firm democratic support in various places.)
The core theme I’ve seen repeated across congregations is the “seven core principles” that I posted above. I’ve seen some degree of ideological consistency across those, but I’ve attended quite a few sermons discussing various perspectives on the seven core principles. It seems like a fairly common tradition to even invite speakers from other religions or affiliations to come and share their own thoughts.
Certainly there is a bias towards those who are “compatible” with the group consensus, and some degree of “group think”. LessWrong has this as well, though: there’s a strong thread of anti-religion bias, and I’d say there’s a moderate pro-cryonics/singularity bias. I don’t see a lot of posts about how SIAI is a waste of time and money, or how Christianity is really misunderstood and we should come to embrace our Lord and Saviour, Jesus Christ.
Can you point to something specific in the UU literature that makes you feel that they’re less tolerant of dissent than LessWrong?
Before I even click at a link to a Unitarian Universalist website, I know with very high probability that there is going to be a “social justice” section espousing ideological positions on a number of issues. And for any such section, I can predict with almost full certainty what precisely these positions will be before I even read any of it.
Now, the UU folks would probably claim that such agreement exists simply because these positions are correct. However, even if I agreed that all these positions are correct, given the public controversy over many of these issues, it would still seem highly implausible that such ideological uniformity could be maintained in practice in a group highly tolerant of dissent. In contrast, I see nothing comparable on LW.
You say:
Actually, in my opinion, LW does have its collective quirks and blind spots, but you’re nowhere close to pinpointing them.
Regarding SIAI being a waste of time and money, I’ve seen such opinions raised in several threads without getting downvoted or otherwise creating any drama. (I can dig up some links if you insist.) As long as you make a polite and coherent argument, you won’t elicit any hostility by criticizing SIAI.
Regarding religious proselytism, that is generally considered impolite anywhere. On the other hand, I actually do believe that there is a lot of misunderstanding of religion on LW, in the sense of many people having a “reversed stupidity” attitude towards various religious teachings and beliefs, developing “applause lights” reactions to various loudmouth atheists who bash traditional religion but believe far crazier stuff instead, etc., etc. I have made arguments along these lines on occasions, and I’ve never encountered any hostility in response, just reasonable counterarguments.
Regarding cryonics, it may well be that the average opinion on LW is heavily biased in favor of it. But again, if you want to argue that cryonics is bunk, you’ll be welcome to do so as long as you have something new, intelligent, and well-informed to say about it. (In fact, I remember posts from people who solicited for anti-cryonics arguments.)
In contrast to these topics, one that usually destroys the quality of discourse on LW is gender issues. This really is a recurring problem, but then, I seriously doubt that a diversity of views on these issues is welcome among UUs. Another problem is certain topics whose understanding requires familiarity with some peculiar theories that are discussed on LW occasionally, where certain (seemingly) very theoretical and far-fetched speculations are apparently taken seriously enough by some of the prominent people here that discussing them can lead to bizarre drama. None of this, however, comes anywhere close to the ideological uniformity that I observe among the Unitarian Universalists, at least judging from their internet presence.
I suppose I should reiterate this, as it seems to be unclear: My point was not that UUs don’t have a degree of “group consensus.” My point was that they do not treat it as an unquestionable dogma.
That they generally have a “social values” page does not seem at all contradictory to this—the issue is whether they’re willing to entertain discussion from opposing views.
In my (anecdotal) experience as someone who has actually attended UU churches, the answer has been very strongly yes. If you have actual experiences to the contrary, or have seen websites from them that seem to make it vividly clear that dissent is not tolerated, I’d be genuinely curious to see this. It’s entirely possible that my experiences aren’t typical, but I haven’t seen any evidence to support that theory.
Tangentially: The discussion of actual issues and biases on LessWrong is appreciated. I’ve only been here briefly, so I haven’t really gotten to know the community that well yet.
This was sadly not clear in my original post, but my goal was to compare “looking at a public website” to “reading top-level posts”. I’ve never seen a top-level post supporting Christianity or condemning the SIAI here. On an individual level, I’m sure there are people that hold those stances, just as there are individual UU members who don’t agree with the values you’re seeing on the UU websites.
My point was simply “when you look at the ‘public face’ of an organisation, you’re going to see some degree of consensus, because that’s just how human organisations work”
LessWrong FAQ:
You don’t see a lot of posts about how gravity doesn’t really exist and it’s just the Flying Spaghetti Monster pushing us down with his tentacles, either.
Note the previous part of the sentence by Vladimir_M that you quoted: (emphasis added)
There’s a difference between consensus on empirical questions where the evidence falls overwhelmingly on one side, and consensus on higher-level ideological questions with a much less clear distribution of both evidence and arguments.
And my original post:
I’m not sure how pointing out that LessWrong explicitly has unquestionable dogma disproves my point… That LessWrong’s dogma is primarily about scientific/empirical/factual matters is simply a function of its focus: LessWrong is about that sort of thing, whereas Unitarian Universalism is about social justice, community, and spirituality.
So, when you put it that way, I’d actually say the UUs have vastly less questionable dogma.
Nope. There’s a big difference between “settled issues where questioning is a waste of time and effort” and “arbitrary positions where questioning is declared heretical by some authority (either a person or social mores).”
This sounds like a separate magisteria argument.
Well, yes. You’re defining this yourself: LessWrong is about “settled issues” of science, and therefore it’s okay to dismiss debate as a “waste of time and effort”. Unitarian Universalists are about significantly more arbitrary positions, and therefore there’s a lot more room for discussion, because people have different starting assumptions and/or goals.
Science does have the advantage that, more or less, everyone is willing to accept the same starting assumptions. Social justice and morality do not have that luxury.
If you take the starting assumptions of the UUs as a given, then most of their stances are settled issues where questioning is a waste of time and effort. You can still have some really interesting discussions on corner cases and implementations, since the world is very chaotic and no one has yet managed to arrange a control group for controlled study :)
Of course, the UU stated stances are still fairly vague, so even within those, there’s the question of whether violence is ever okay, etc.
All this really boils down to the question:
What evidence, exactly, do you have that Unitarian Universalists declare things ‘heretical’ significantly more often than LessWrong does?
No, LessWrong isn’t about settled issues, but they do come up fairly often in the course of relevant discussions. Separate magisteria arguments fail because they imply that consensus can be found based on different standards of evidence for different areas of discussion. Every area needs to be held to the same standard.
I’m not sure what the UU starting assumptions are. However, it seems unlikely that they are only terminal values, so standards of evidence should apply.
The point of the first post that I made in this chain is that coming to a consensus based on overwhelming evidence is not the same as declaring something heretical.
You seem to be pursuing two lines of argument. In some places you’re just asserting that UU does not have dogmatic elements, in contradiction to Vladimir_M’s observations. That’s a separate conversation, and not really my concern.
In other places, though, you’re asserting that LW does have dogmatic elements. I have two problems with this. First, it’s not accurate, as I’ve explained. Second, taking the two lines of argument together, it sounds like you’re saying “UU doesn’t have dogma… and anyway, LW does too!” The two clearly aren’t consistent, so which is it?
Just to be clear, my main point is that LW doesn’t have dogma or declare things heretical, not that UU does (although I think it might approach those things in some areas). For that point, I’m providing examples and descriptions of the difference between consensus based on overwhelming evidence and arbitrary dogma. Dogma is arbitrarily absolute; it’s something to be questioned under no circumstances. Consensus based on evidence is a matter of Bayesian updating.
Different definitions of dogma. The easiest translation would be “based on this usage of the word dogma, neither the UUs nor LW have it. Based on this other usage of the word dogma, both the UUs and LW seem to have it about equally. I can’t see any evidence that either definition results in the UUs having more dogma, and I can’t think of a third definition that makes sense, so I’m not sure why you’re insisting that the UUs are more dogmatic”.
English sucks for handling different definitions of the same word, and my brain does a wonderful job of not noticing when I’ve done this ^^;
Ahh, okay. Then I think we’re actually on the same page. I was reading your “arbitrary absolutes” as being a reference to the UUs specifically. This makes much more sense now :)
An unchallenged consensus on positions of social policy, which are complicated and generally do not have conclusive evidence on one side of an argument, indicates the existence of some reinforcing social mores.
Edit: the comment at which this reply was directed was significantly altered after I typed this reply. Please hold on while I attempt to catch up.
I think we can both agree that even LessWrong has social mores. The topic is “unquestionable dogma.”
Having been to a UU church and attended UU sermons, I cannot understand how you could possibly portray it as an “unchallenged consensus”.
Edit: Sorry about the edit, and completely understood :)
I think we might have ended up off-track, so let me try to sum up my stance:
1) Unitarian Universalists, by default, must have “arbitrary positions” because they are not discussing settled matters. Therefore, the fact that they have arbitrary positions is, in and of itself, simply a function of their focus; all social justice groups will run into this issue, whether they are religious or not.
2) Unitarian Universalists do not demonstrate any particular tendency towards an environment where “questioning is declared heretical by some authority”. Unitarians are “dispassionate, upfront, and open to argument” on roughly the same level as LessWrong.
What I would be interested in hearing is actual evidence that I could use to update either of these.
To the previous evidence offered: I do not understand how having a consistent stance on an organisational level is evidence that they are closed-minded or otherwise less open to discussing and debating opposing viewpoints.
If your thought process consists entirely of “having a consistent organisational stance means you have unquestionable dogma”, then I think we are either running into a definitions issue, or will have to agree to disagree. Otherwise I’d be curious if you can elaborate on the missing pieces.
I did the same in my new reply to your previous post. Let me just address one side point:
The best method of operation for a social justice group which wishes to find optimal conclusions may be to hold off on proposing solutions. Getting stuck in a position that’s incorrect or not useful seems like a serious concern. There shouldn’t necessarily be a consensus position on a given issue, regardless of the goal of the group.
Mmm, my gut response is that there are not a lot of solved social issues, so this wouldn’t be very useful for a social justice group that actually wanted to get things done. The UUs have been fairly politically active in spreading their values for a while, and I haven’t seen any evidence that their politics is particularly ineffective for their values.
Nevertheless there are some from time to time, as well as comments to that effect, and many more that are ambivalent.
For clarity: How do you think the members of your local UU congregation would react if one of their members turned up one day and said something along the lines of “you know, I’ve been thinking about it and doing the math, and it looks to me like war is actually pretty useful, instrumentally—it seems like it saves more lives than it takes, and at least in places with recruitment methods like ours, people who choose to be soldiers seem to get a fairly good deal out of it on average”?
I’ve been to sermons on exactly that topic, so I’d have to argue that in my experience they take it very well.
Dictatorship of the Proletariat? Class struggle? Ownership of the means of Production? Universal Free Healthcare, even?
Or did you mean the kind of policies that count as “left wing” in the US, and liberal/moderate/centre-left everywhere else?
“Everywhere else”? I hate to break the news, but there are other places under the Sun besides the Anglosphere and Western Europe! In most of the world, both by population and surface area, and including some quite prosperous and civilized places, many UU positions would be seen as unimaginably extremist. (Try arguing their favored immigration policies to the Japanese, for example.)
You are however correct that in other Western/Anglospheric countries, the level of ideological uniformity in the political mainstream is far higher than in the U.S., and their mainstream is roughly similar to the UU doctrine on many issues, though not all. (Among their intellectual elites, on the other hand, Unitarian Universalism might as well be the established religion.)
In any case, I didn’t say that the UUs had the most extreme left-wing positions on everything. On the contrary, what they espouse is roughly somewhere on the left fringe of the mainstream, and more radical leftist positions are certainly conceivable (and held by some small numbers of people). What is significant for the purposes of this discussion is the apparent ideological uniformity, not the content of their doctrine. My points would hold even if their positions were anywhere to the left or right of the present ones, as long as they were equally uniform.
There are some conservative Unitarian Universalists, which seems to indicate that there isn’t complete ideological uniformity.
Point taken, and thanks for the interesting link. Googling around a bit more, it seems like there are a few groups like these, but they are small and extreme outliers without influence and status. Before writing my above comments, I checked out the links on the first few search pages that come up when you google “Unitarian Universalist,” and I definitely encountered perfectly predictable and uniform positions advocated on those.
In case you haven’t encountered him before, Peter A. Taylor, the author of that FAQ, has some interesting articles on religion and politics: Rational Religion, The Market for Sanctimony, or Yet Another Space Alien Cult, What Does “Morality” Mean?, etc. - he is apparently a reader of LessWrong.
Yes, I have rummaged around his website already. There is some interesting stuff there.
Interestingly, in the “Market for Sanctimony” article, he confirms my impressions about Unitarian Universalism, contrary to the claims of User:handoflixue:
My claim was about unquestionable dogma, and the UUs as a whole. I’m not sure how we can still be having this debate after someone else provided you links to UUs who question the dogma...
My concern is about using the term “left wing” in contexts that have nothing to do with socialism. Being pro immigration is also a policy of some libertarians, so that doesn’t qualify.
I was raised a Unitarian Universalist too, by agnostic parents. It probably has a lot to do with my generally positive attitude towards religion. (I now sing in a High Anglican church choir and attend services regularly mostly because I find it benefits my mental health.)
Given that Unitarianism was originally Christian and yet some UUs have collectively embraced atheism, you are probably right about that.
I.e., they won’t burn you at the stake, and they won’t stick around to be questioned when they can find someone else to talk to who’ll agree with them.
I’m curious if you’re being sarcastic or serious. It’s hard to tell online :)
I’d be happy to answer that. But for purposes of keeping the thread more on the community organization topic, I wanted to channel discussion of my religious beliefs over on this discussion thread. Would you like to repost your comment over there?
Hmm? Thomas Bayes was a Presbyterian minister, C. S. Peirce was Catholic, and Newton was an unorthodox Christian described as “highly religious”. I’d be more interested in comparing a list of esteemed rationalists who were not religious with a list of those who were. In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.
While I think there exists a level at which mainstream religious faith is inimical to epistemic rationality, I also think it’s most likely a pretty advanced level, higher than most if not all of the regulars here have attained. (Note however that people can and do give up religion on grounds of rationality before hitting that level.) It’s certainly possible to make substantial contributions to the advancement of human rationality in its present state while also being a theist, and that was still truer a few hundred years ago when the foundations of the art were being laid.
That being said, there’s also a distinction to be made between esteemed rationalists and esteemed scientists or mathematicians whose work contributed indirectly to LW-method rationality. Of the people that worked on the early foundations of statistics, Laplace is the only one I can think of offhand that strikes me as having had strong public commitments to rationality in this site’s usual sense.
People who solved math problems useful for rationality but espoused false beliefs would not qualify as “esteemed rationalists” in my book.
(Robert Aumann belongs on this list, by the way.)
More generally, “In any case, it is pretty clear that it is possible to hold rationality and irrationality in your head at the same time. This is basically how most people operate.” I’m no more surprised to hear about a religious rationalist than I am when I notice yet another of my own irrational beliefs or practices.
He must be talking about LW-style rationality, or X-rationality, as distinguished from traditional rationality. And learning about X-rationality has been known to deconvert people on whom the traditional-rationality-based arguments of Dawkins and skeptics didn’t work. And then there are additional arguments for why X-rationality is the real thing and deserves to be called simply rationality.
Yes.
Nick Szabo has an interpretation of religious traditions that makes sense, though I’m not sure I fully agree.
(Edit) This essay: Is Rational Religion possible? might be of interest too.
It seems that there are significant numbers of Jews, Anglicans and Shintoists who don’t believe in the theology and the supernatural stuff, but still identify as members of the religion and follow the traditions. (I don’t know if there are any of those among Mormons, though after hearing calcsam I’d increase my expectation.)
Culture is pretty strong. My girlfriend oscillates between Christianity and Paganism and is an active member of the local Church of England. They’re currently trying to draft her as a volunteer for all sorts of things (the standard punishment for public display of sanity and competence). I’m a sceptical atheist, and I’m on the fringes of, but still pretty much part of, said church community. Mind you, the C of E is hardly that unfriendly to atheists … and Richard Dawkins still visits his, and neither it nor he catches fire as he walks in …
Or: Yes, religion is supposedly about the silly ideas and mad old books, but only works if it’s a community, and these will often include people who expressly repudiate the silly ideas and mad old books.
“If the Church of England relied on Christians, it’d be sharing a room with the Flat Earth Society”—Shelley (TV show), quoted from memory.
I can think of at least two other stable states—in one, you’ve had an experience that has acted as strong Bayesian evidence for you of the existence of $DEITY, but which is either a purely subjective experience or which is non-repeatable. As an example of this class of event, if I were to pray “Oh Lord, give me enough money to never have to work again” and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.
Another stable state might be someone who has been convinced by Frank Tipler’s Omega Point hypothesis. Tipler himself is now clearly extremely irrational, but the hypothesis itself is taken seriously enough by people like David Deutsch (who is one of the less obviously-egregiously-stupid public intellectuals) that it’s not obviously dismissable out-of-hand.
I’m sure there are others, too.
EDIT—when I said “in the next five years” I meant to type “the next five minutes”, which would of course be much stronger evidence.
Do you really think that would be enough? Even if you don’t think that the God hypothesis has a truly massive prior probability to overcome, you’d still have to reconcile this with the fact that most prayers for improbable things go unanswered, to the point that nobody has ever provided a convincing statistical demonstration that it has any effect except on people who know that prayers have been made.
Taking this as sufficient Bayesian evidence to hold a belief in God seems like believing that a die is weighted because your roll came up a six, when you know that it’s produced an even distribution of numbers in all its rolls together.
The reason why rationality destroys religion is precisely because there is no evidence of this kind. It’s not a priori impossible to hold rationality and religion decompartmentalised in one’s head, but it is impossible in this universe.
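To put rough numbers on the weighted-die analogy above (all of them invented for illustration, including the assumed 1/2 bias): a single six shifts the odds a little toward “weighted,” but the die’s long, even track record shifts them far more the other way, just as one striking answered prayer has to be weighed against the base rate of unanswered ones.

```python
# Minimal sketch, hypothetical numbers: one six versus the die's whole
# track record, compared via log-likelihood ratios (weighted vs. fair).
from math import log

p_six_fair = 1 / 6
p_six_weighted = 1 / 2                        # assumed bias, for illustration
p_other_weighted = (1 - p_six_weighted) / 5   # each non-six face, if weighted

one_six = log(p_six_weighted / p_six_fair)      # ~ +1.10, favors "weighted"
one_other = log(p_other_weighted / p_six_fair)  # ~ -0.51, favors "fair"

# A record of 60 rolls, evenly distributed as a fair die predicts:
total = 10 * one_six + 50 * one_other
print(round(total, 1))  # ~ -14.6: the full record overwhelmingly favors "fair"
```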
Even in that case, a very powerful being messing with you is more likely than an uncaused, ontologically fundamental very powerful being (and not just because of the conjunction—a caused, reducible very powerful being is far more likely). Or did you just mean that this point was less obvious, so it would be harder for someone to realize that they were wrong?
Was he more rational before? (I did read two of his books a while back, and I remember being very excited beforehand and very disappointed afterwards, but I can’t remember enough specifics to say why.)
I believe so. His career path seems to go: 70s—studies with John Wheeler, makes some small but clever contributions to cosmology and relativistic physics.
80s—Co-writes widely praised book The Anthropic Cosmological Principle with John Barrow, first suggests Omega Point hypothesis
90s—Writes The Physics Of Immortality, laying out Omega Point hypothesis in much more detail and explicitly identifying Omega Point with God. People think this is clever but going a little far. Tipler’s contract for a textbook on gravitation gets cancelled and the university at which he has tenure stop giving him pay-rises.
2000s—Writes The Physics Of Christianity, in which he suggests cloning Jesus from the Turin Shroud so we can learn how he annihilated baryons, becomes referee for creationist journals and occasional right-wing commentator, argues that Barack Obama is evil because the luminiferous aether is real and because of a bit of the film Starship Troopers.
The criticism of Obama was slightly more coherent than that. The Tribe paper in question really was an example of the common attempt to take ideas from math and physics and apply them as strong metaphors in other areas, in ways that are really unhelpful and at best silly. In that regard, most of Tipler’s criticism was straight on.
Yeah, except for two facts: Obama had no actual input into Tribe’s paper, and Tipler’s physics in his paper is even less coherent than Tribe’s.
Ok that’s really...random. (Overused and underdefined word but that was the response my brain gave me).
The Tipler/Obama/aether connection seemed bizarre enough that I looked it up:
http://pajamasmedia.com/blog/obama-vs-einstein/
Some quotes:
Einstein’s general relativity is just a special case of Newtonian gravity theory incorporating the ether
Hamilton-Jacobi theory is deterministic, hence quantum mechanics is equally deterministic
There was absolutely nothing revolutionary about twentieth century physics.
I agree on the “random” part.
That’s nothing. Read the full paper—http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1271310 . Forty-five pages of the most gloriously wrong thinking you’ll ever come across in your life.
But then he’ll come out with a piece of utterly lucid reasoning on applying Bayes’ theorem to the Born probabilities like http://arxiv.org/abs/quant-ph/0611245 . Very, very strange man.
I think this is a relevant rationality quote: http://lesswrong.com/lw/2ev/rationality_quotes_july_2010/28nw
Wow. While I’m unsurprised that Tipler would take issue with yet another poetical injection of something that superficially looks like quantum physics into yet another unrelated subject area, I’m more surprised that he’d express it in such a bizarre manner. There’s a whole paragraph where he name-drops his academic genealogy. And then he acts like Obama is making these claims, when at best he contributed “analytic and research assistance”, whatever that means.
I read The Physics of Immortality as an undergrad in ’04 and was skeptical of his major claims. I’m disappointed by his downward spiral into crackpot territory.
Speaking solely for myself, I’ve found that my spiritual / religious side helps me to set goals and to communicate with my intuitions. Rationality is simply a tool for implementing those goals, and processing/evaluating that intuitive data.
I’ve honestly found the hostility towards “spirituality writ large” here rather confusing, as the majority of the arguments seem to focus on a fairly narrow subset of religious beliefs, primarily Christian. I tend to write it off as a rather understandable bias caused by generalizing from “mainstream Christianity”, though, so it doesn’t really bother me. When people present actual arguments, I do try and listen in case I’ve missed something.
Or, put another way: Rationality is for falsifiable aspects of my life, and spirituality is for the non-falsifiable aspects of my life. I can’t have “incorrect” goals or emotions, but I can certainly fail to handle them effectively.
If ‘spirituality’ helps you to handle these things effectively, that is empirically testable. It is not part of the ‘non-falsifiable’ stuff. In fact, whatever you find useful about ‘spirituality’ is necessarily empirical in nature and thus subject to the same rules as everything else.
Most of the distaste for ‘spirituality’ here comes from a lack of belief in spirits, for which good arguments can be provided if you don’t have one handy. If your ‘spirituality’ has nothing to do with spirits, it should probably be called something else.
Hmmmmm, I’d never considered the idea of trying to falsify my goals and emotions before. Now that the idea has been presented, I’m seeing how I can further integrate my magical and rational thinking, and move to a significantly more effective and rational standpoint.
Thank you!
Glad to be of help :)
There are stats on the effects of religion on a population that practices said religion. This should give some indication of the usefulness of any spirituality.
You can have goals that presuppose false beliefs. If I want to get to Heaven, and in fact there is no such place, my goal of getting to Heaven at least closely resembles an “incorrect goal”.
This raises an interesting question—if a Friendly AI or altruistic human wants to help me, and I want to go to Heaven, and the helper does not believe in Heaven, what should it do? So far as I can tell, it should help me get what I would want if I had what the helper considers to be true beliefs.
In a more mundane context, if I want to go north to get groceries, and the only grocery store is to the south, you aren’t helping me by driving me north. If getting groceries is a concern that overrides others, and you can’t communicate with me, you should drive me south to the grocery store even if I claim to want to go north. (If we can exchange evidence about the location of the grocery store, or if I value having true knowledge of what you find if you drive north, things are more complicated, but let’s assume for the purposes of argument that neither of those hold.)
This leads to the practical experiment of asking religious people what they would do differently if their God spoke to them and said “I quit. From now on, the materialists are right, your mind is in your brain, there is no soul, no afterlife, no reincarnation, no heaven, and no hell. If your brain is destroyed before you can copy the information out, you’re gone.” If a religious person says they’d do something ridiculous if God quit, we have a problem when implementing an FAI, since the FAI would either believe in Heaven or be inclined to help religious people do something ridiculous.
So far, I’ve had one Jehovah’s Witness say he couldn’t imagine God quitting. Everyone else said they wouldn’t do much different if God quit.
If you do this experiment, please report back.
It would be a problem if there are many religious people who would apparently want to commit suicide if their God quit: the FAI convinces itself that there is no God, so it helpfully goes and kills them.
Erm, that’s supposing the religious person would actually want to suicide or do the ridiculous thing, rather than this itself being an expression of belief, affirmation, and argument of the religion. (I.e., as appeal to consequences, or saying negative things about the negation.)
The most reasonable interpretation I can find for your statement is that you’re responding to this:
I agree, the goal would be to figure out what they would want if their beliefs were revised, and revising their circumstances so that God puts Himself out of the picture isn’t quite the same as that.
The experiment also has other weaknesses:
eBay bidding shows that many people can’t correctly answer hypothetical questions. Perhaps people will accidentally give false information when I ask.
The question is obviously connected with a project related to atheism. Perhaps some religious people will give false answers deliberately because they don’t want projects related to atheism to succeed.
The relevant question is what the FAI thinks they would want if there were no God, not what they think they would want. A decent FAI would be able to do evolutionary psychology and many people can’t, especially religious people who don’t think evolution happened.
It’s not a real experiment. I’m not systematically finding these people, I’m just occasionally asking religious people what they think. There could easily be a selection effect since I’m not asking this question of random religious people.
We are at high risk of arguing about words, and I don’t wish to do that.
Describe specifically what you do when you’re using your spiritual side. Assign it a label other than “spirituality” or “religious”. Then I can give you my opinion. As stated your comment is noise.
You can have incorrect subgoals in that they fail to help you achieve the goals towards which they are supposed to aim.
According to one popular view, you can have incorrect emotions—and this is important, as our emotions have a great deal to do with our ability to be rational. To quote:
This comment was also quite helpful :)
Maybe you disagree, but from what I’ve seen, a large subset of the LW population thinks that both goals and emotions can and should be modified if they are sub-optimal.
I can see handoflixue’s logic, and your appeal to popularity does not defeat it. It makes LW seem to be irrational. To directly answer the logic, remind handoflixue that goals form a hierarchy of goals and subgoals, and a subgoal can be incorrect relative to a goal. Similarly, emotions can be subservient to goals. For example, anger can serve the goal of self-protection. A specific feeling of anger can then be judged as correct or incorrect depending on whether it serves this goal.
Finally, all of our conscious goals can be judged from the standpoint of natural selection. And conversely, a person may judge natural selection from the point of view of his conscious goals.
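A minimal sketch of that hierarchy point, with purely illustrative names and structure (the `Goal` class and `serves_parent` flag are my own invention, not anyone’s stated model): goals form a tree, and a subgoal is judged correct or incorrect by whether it serves the goal above it, as in the anger example.

```python
# Minimal sketch, illustrative names: goals form a tree, and a subgoal
# counts as "incorrect" relative to the goal it is supposed to serve.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    serves_parent: bool = True          # judged empirically, case by case
    subgoals: list = field(default_factory=list)

def incorrect_subgoals(goal):
    """Collect subgoals anywhere in the tree that fail their parent goal."""
    bad = [g for g in goal.subgoals if not g.serves_parent]
    for g in goal.subgoals:
        bad.extend(incorrect_subgoals(g))
    return bad

self_protection = Goal("self-protection", subgoals=[
    Goal("anger at a real threat"),
    Goal("anger at a harmless remark", serves_parent=False),
])
print([g.name for g in incorrect_subgoals(self_protection)])
# -> ['anger at a harmless remark']
```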
That...seems true. I guess I’ve never divided my goals into a hierarchy, and I often find my emotions annoying and un-useful. I think my comment holds more true for emotions than for goals, anyway. I’ll have to think about this for a while. It’s true that although I have tried to modify my top-level goals in the past, I don’t necessarily do it because of rationality.
If you present your best case for LDS, and no one here considers it persuasive enough to convert to LDS, will you take this as strong evidence that you are mistaken? Enough to relinquish LDS?
If not, why not?
Other people understand this differently (e.g., I cannot speak for calcsam), but as far as I am concerned the goal should never be to persuade anyone to be LDS, but to present the message, let them know that you know it is true, invite them to find out for themselves if it is true, and then answer any questions they may have. Then it is to let the Spirit do whatever else it will to bring about a conversion.
I think that is an excellent attitude to take.
+1