“The most dangerous religious fundamentalisms lead people to do things such as blowing up buildings, committing mass murders, jailing and torturing people for apostasy, and throwing acid in the faces of schoolchildren. This occurs both when dangerous religious fundamentalists occupy positions of formal political power (governments), and when they do not (terrorist groups, militias, abortion-clinic bombers).”
Point taken. The phrase “most dangerous” is hyperbolic. No, so far I don’t see any Less Wrongers blowing up buildings or committing mass murders. But, what is it that drives people to do such things? Is it as simple as, “God told me to do this?” I don’t think it’s usually that simple. I’m not sure what drives it, but I think that part of it is a basic human tendency to divide people up into groups of “we” and “they.” Most of us construct this kind of division to some degree, whether we realize it or not, but fundamentalists take it to the extreme. On LW I encounter this division quite often (sometimes in the tone of posts more than the content). I probably notice it so strongly because, as Manfred comments, I feel myself to be among the “them” (and my natural reaction is to make the same sort of division in my own mind). While this division is nowhere near the extreme in the rationalist communities, I can definitely imagine it becoming so, particularly if technology advances in the ways that many Less Wrongers predict it will.
Some Less Wrongers appear to express the viewpoint that the world would be a better and happier place if all of us were to become rationalists, and I think that this is the attitude that I had in mind when I let the phrase “most dangerous fundamentalists” slip out. Medieval Catholics (and some contemporary ones) wanted to make the whole world Catholic. Stalinists wanted to make the whole world Stalinist. In either case, I think the world would have turned out a much worse place had either one succeeded. To you, rationalism, empiricism and positivism might seem to exist in a different category, but to me any ideology or thought system that gets universalized will probably turn into More’s Utopia or Plato’s Republic. And, while interesting for a while, such places hardly seem very habitable in the long term.
“Is it just because we are totalizing and dogmatic about making people happy instead of about hating and killing them? (I am reminded of a Barry Goldwater quote about extremism and moderation.)”
I’d be interested in seeing that Goldwater quote. But, if Less Wrongers are totalizing and dogmatic about making people happy, then why on earth would you want to deconvert people from religion? Religious beliefs, practices, rituals, spiritualities, aesthetics, values, and communities bring vast amounts of happiness to people all over the world, every day. No, it’s not for everyone, but why try and take it away from the people who find so much happiness in it?
(A meta remark: The usual way to quote another person’s post here is to prefix lines with the > character, not to use quotation marks.)
Point taken. The phrase “most dangerous” is hyperbolic. No, so far I don’t see any Less Wrongers blowing up buildings or committing mass murders.
Of which I am very glad.
But, what is it that drives people to do such things? Is it as simple as, “God told me to do this?” I don’t think it’s usually that simple. I’m not sure what drives it, but I think that part of it is a basic human tendency to divide people up into groups of “we” and “they.”
Tribalism is powerful and problematic indeed. But I’m not convinced that tribalism alone is sufficient to create eliminationism — here borrowing Daniel Goldhagen’s term for the belief that it is morally right and necessary to exterminate the Other. There are lots of places in the world where distinct tribes coexist, maintaining us/them distinctions, without massacring each other constantly.
So there must be something else involved.
Most of us construct this kind of division to some degree, whether we realize it or not, but fundamentalists take it to the extreme.
It isn’t really clear to me that all the things that we might label “fundamentalism” are really the same social phenomenon. Sociologically, there may be different things going on in Fundamentalist Protestantism (the trope namer); in theocratic regimes such as Iranian Shia or Saudi Wahhabism; in medieval Catholicism in its persecution of the Cathars, Albigensians, and conversos — and for that matter in the Stalinist purges or other “secular” “fundamentalisms”.
Tribalism may be part of it; but doctrinal intolerance — the notion that people who believe differently should get a bullet — seems to be another; and authoritarian loyalty seems to be another still.
We could talk about intolerance in general, rather than “fundamentalism”; but even this raises the difficulty that some people take peaceful disagreement with their beliefs to be a form of “intolerance”. There’s not a word for this idea that isn’t fraught with political conflict.
While this division is nowhere near the extreme in the rationalist communities, I can definitely imagine it becoming so, particularly if technology advances in the ways that many Less Wrongers predict it will.
This is actually an area where I suspect the LW-cluster is much more universalist than most religionists expect secularists to be. The whole concept of “the coherent extrapolated volition of mankind” explicitly takes in all human experience as significant — thus including religious experience. Religious claims don’t have to be true in order for religious experience to be significant as an element of human value; after all, Hamlet isn’t true either ….
(Mind you, I also think that most secularists are more universalist than religionists expect secularists to be.)
Some Less Wrongers appear to express the viewpoint that the world would be a better and happier place if all of us were to become rationalists, and I think that this is the attitude that I had in mind when I let the phrase “most dangerous fundamentalists” slip out.
Here I wonder if we (by which I mean the LW-cluster) have been failing to communicate what we mean by “rationalist” and “rationality”. One iteration of our Litany of Tarski goes as follows:
If there is a God who loves me,
I desire to believe there is a God who loves me.
If there is not a God who loves me,
I desire to believe there is not a God who loves me.
Let me not become attached to beliefs I may not want.
This is a position of profound submission to the universe. When we say “rationalist” here, we primarily don’t mean someone who has a commitment to a particular set of beliefs. We mean someone who wants their beliefs to be caused by the facts of the universe, whatever those might turn out to be.
Medieval Catholics (and some contemporary ones) wanted to make the whole world Catholic. Stalinists wanted to make the whole world Stalinist. In either case, I think the world would have turned out a much worse place had either one succeeded. To you, rationalism, empiricism and positivism might seem to exist in a different category, but to me any ideology or thought system that gets universalized will probably turn into More’s Utopia or Plato’s Republic. And, while interesting for a while, such places hardly seem very habitable in the long term.
One might then ask, what sort of world is most likely to cultivate and promote the kind of diversity you’re advocating here?
But, if Less Wrongers are totalizing and dogmatic about making people happy, then why on earth would you want to deconvert people from religion? Religious beliefs, practices, rituals, spiritualities, aesthetics, values, and communities bring vast amounts of happiness to people all over the world, every day. No, it’s not for everyone, but why try and take it away from the people who find so much happiness in it?
I, personally, don’t spend any particular effort on deconverting anyone. Not much point: anyone who can be deconverted by less than sufficient evidence can probably be reconverted by less than sufficient evidence.
I would like, however, to find ways to offer more comfort to people who have deconverted and lost their religious social support structure, e.g. been rejected by family. That sort of thing strikes me as acutely unfortunate. But then, my own Christian family didn’t give me any particularly acute trouble through my migration from Christian to pagan to atheist.
This is a position of profound submission to the universe. When we say “rationalist” here, we primarily don’t mean someone who has a commitment to a particular set of beliefs. We mean someone who wants their beliefs to be caused by the facts of the universe, whatever those might turn out to be.
Thank you for re-clarifying this (yes, I was aware that this was the LW position). But, do most LW’ers think that it should be everyone’s position?
Medieval Catholics (and some contemporary ones) wanted to make the whole world Catholic. Stalinists wanted to make the whole world Stalinist. In either case, I think the world would have turned out a much worse place had either one succeeded. To you, rationalism, empiricism and positivism might seem to exist in a different category, but to me any ideology or thought system that gets universalized will probably turn into More’s Utopia or Plato’s Republic. And, while interesting for a while, such places hardly seem very habitable in the long term.
One might then ask, what sort of world is most likely to cultivate and promote the kind of diversity you’re advocating here?
Heh, now there’s a question! I personally don’t believe in utopias, but I do believe in making the world better. The difficulty is that “better” means different things to different people, and this is something we can’t ever forget. To answer your question, I think that a society based on moderation and mutual respect/tolerance for different beliefs is the best one. Canada’s multiculturalism policy comes to mind. There are many flaws with multiculturalism, as it certainly doesn’t guarantee that all social groups are treated fairly by those in power. However, having lived in Canada for some years, I find that this attempt at creating a multicultural society (where people are encouraged to maintain their cultural heritage and language) leads to a more diverse and interesting society than does the assimilationist attitude of the US (my home country), where there is greater pressure to give up old identities/values in order to fit in.
But, do most LW’ers think that it should be everyone’s position?
I won’t presume to speak for most LWers. Speaking for myself, I think we would all be better off if more people’s beliefs were more contingent on mutually observable events. So, yeah. I could be wrong, but I’d love to see the experiment done.
I don’t really think it would be possible to do an experiment here because the very definition of “better” is a question of values, and different people have different values.
And yet, there are many situations in which an observer does in fact look at two groups of people and claim that group A is better off than group B. On your view, are all such observers unjustified in all such claims, or are some of them sometimes justified? (And, if the latter, is there any reason we can’t affect the world so as to create such a situation, wherein we are justified in claiming that people are better off after our intervention than they were before?)
Well, there’s the anthropological concept of the psychic unity of humankind — we may have different values, but our ways of thinking (including our values) are not wholly alien from one another, but have a lot in common.
And there are also things we can say about human values that descend from cultural evolution: we would not expect, for instance, that any culture would exist that did not value its own replication into the next generation. So we would expect that people would want to teach their ideas to their children (or converts), merely because societies that don’t do that would tend to die out and we wouldn’t get to observe them.
But, do most LW’ers think that it should be everyone’s position?
Good question. I haven’t conducted a poll. But more problematically, what does that “should” mean?
It could mean:
“Would everyone be better off if they were more rationalist?” — I think yes, they would, because they would be better equipped to change the world in directions positive for themselves, and for humanity in general. And I think that this notion is pretty strong in the LW-community. Aside from problems such as becoming a clever arguer, we expect that greater rationality should generally help people.
“Is it worthwhile for me to try to get everyone to be more rationalist?” — It isn’t clear to me how much influence over other people’s rationality I can directly have; although I haven’t really tried outside of the LW-community and my (already rather rationalist-friendly) workplace yet. I intend to support CFAR’s educational program, though.
“Would I benefit from treating people as more virtuous, trustworthy, or righteous if they agree with my position regarding rationality than if they don’t?” — No, I don’t really think so. Doing that sort of thing seems more likely to lead to Blue/Green political nonsense than to any beneficial result. (Although it sure is nice to hang out with / be friends with / date people who share some metaphysics and reference points ….)
If none of these, what did you mean by “should” there?
The difficulty is that “better” means different things to different people, and this is something we can’t ever forget.
Sure; however, some of those different things are compatible and others aren’t. Politics shows up when we have to deal with the incompatible ones.
I’m predisposed to like multiculturalism in a lot of ways; it’s pretty, interesting, and yields a wide range of social forms — and cultural products such as food, music, and philosophy. It does pose some serious problems, though, when different cultures have incompatible views of things such as human rights, human dignity, or dispute resolution; or when it’s used as an excuse to restrain people from leaving their local culture in favor of another of their own choosing; or when politically well-established cultures are valued more highly than less well-connected ones.
Rationality in the LW-sense doesn’t presume to tell anyone what their values should be, except insofar as they shouldn’t be self-contradictory. We have a strong notion of the complexity of human value and a healthy suspicion of anyone who tries to simplify it. (A fellow came to my local LW meetup recently and tried to argue that the only value worth speaking of is personal survival. I think “wide-eyed horror” would fairly describe the general reaction that idea received.)
But there’s a large gap between complexity and irreducibility.
“I’d be interested in seeing that Goldwater quote.”
I suspect the quote in question is “I would remind you that extremism in the defense of liberty is no vice.”
“… and moderation in the pursuit of justice is no virtue.”