This is a position of profound submission to the universe. When we say “rationalist” here, we primarily don’t mean someone who has a commitment to a particular set of beliefs. We mean someone who wants their beliefs to be caused by the facts of the universe, whatever those might turn out to be.
Thank you for re-clarifying this (yes, I was aware that this was the LW position). But, do most LW’ers think that it should be everyone’s position?
Medieval Catholics (and some contemporary ones) wanted to make the whole world Catholic. Stalinists wanted to make the whole world Stalinist. In either case, I think the world would have turned out a much worse place had either one succeeded. To you, rationalism, empiricism and positivism might seem to exist in a different category, but to me any ideology or thought system that gets universalized will probably turn into More’s Utopia or Plato’s Republic. And, while interesting for a while, such places hardly seem very habitable in the long term.
One might then ask, what sort of world is most likely to cultivate and promote the kind of diversity you’re advocating here?
Heh, now there’s a question! I personally don’t believe in utopias, but I do believe in making the world better. The difficulty is that “better” means different things to different people, and this is something we can’t ever forget. To answer your question, I think that a society based on moderation and mutual respect/tolerance for different beliefs is the best one. Canada’s multiculturalism policy comes to mind. There are many flaws with multiculturalism, as it certainly doesn’t guarantee that all social groups are treated fairly by those in power. However, having lived in Canada for some years, I find that this attempt at creating a multicultural society (where people are encouraged to maintain their cultural heritage and language) leads to a more diverse and interesting society than does the assimilationist attitude of the US (my home country), where there is greater pressure to give up old identities/values in order to fit in.
But, do most LW’ers think that it should be everyone’s position?
I won’t presume to speak for most LWers. Speaking for myself, I think we would all be better off if more people’s beliefs were more contingent on mutually observable events. So, yeah. I could be wrong, but I’d love to see the experiment done.
I don’t really think it would be possible to do an experiment here because the very definition of “better” is a question of values, and different people have different values.
And yet, there are many situations in which an observer does in fact look at two groups of people and claim that group A is better off than group B. On your view, are all such observers unjustified in all such claims, or are some of them sometimes justified? (And, if the latter, is there any reason we can’t affect the world so as to create such a situation, wherein we are justified in claiming that people are better off after our intervention than they were before?)
Well, there’s the anthropological concept of the psychic unity of humankind — we may have different values, but our ways of thinking (including our values) are not wholly alien from one another, but have a lot in common.
And there are also things we can say about human values that descend from cultural evolution: we would not expect, for instance, that any culture would exist that did not value its own replication into the next generation. So we would expect that people would want to teach their ideas to their children (or converts), merely because societies that don’t do that would tend to die out and we wouldn’t get to observe them.
But, do most LW’ers think that it should be everyone’s position?
Good question. I haven’t conducted a poll. But more problematically, what does that “should” mean?
It could mean:
“Would everyone be better off if they were more rationalist?” — I think yes, they would, because they would be better equipped to change the world in directions positive for themselves, and for humanity in general. And I think that this notion is pretty strong in the LW-community. Aside from problems such as becoming a clever arguer, we expect that greater rationality should generally help people.
“Is it worthwhile for me to try to get everyone to be more rationalist?” — It isn’t clear to me how much influence over other people’s rationality I can directly have, although I haven’t really tried outside of the LW-community and my (already rather rationalist-friendly) workplace yet. I intend to support CFAR’s educational program, though.
“Would I benefit from treating people as more virtuous, trustworthy, or righteous if they agree with my position regarding rationality than if they don’t?” — No, I don’t really think so. Doing that sort of thing seems more likely to lead to Blue/Green political nonsense than to any beneficial result. (Although it sure is nice to hang out with / be friends with / date people who share some metaphysics and reference points ….)
If none of these, what did you mean by “should” there?
The difficulty is that “better” means different things to different people, and this is something we can’t ever forget.
Sure; however, some of those different things are compatible and others aren’t. Politics shows up when we have to deal with the incompatible ones.
I’m predisposed to like multiculturalism in a lot of ways; it’s pretty, interesting, and yields a wide range of social forms — and cultural products such as food, music, and philosophy. It does pose some serious problems, though, when different cultures have incompatible views of things such as human rights, human dignity, or dispute resolution; or when it’s used as an excuse to restrain people from leaving their local culture in favor of one of their choosing; or when politically well-established cultures are valued highly above less well-connected ones.
Rationality in the LW-sense doesn’t presume to tell anyone what their values should be, except insofar as they shouldn’t be self-contradictory. We have a strong notion of the complexity of human value and a healthy suspicion of anyone who tries to simplify it. (A fellow came to my local LW meetup recently and tried to argue that the only value worth speaking of is personal survival. I think “wide-eyed horror” would fairly describe the general reaction that idea received.)
But there’s a large gap between complexity and irreducibility.