But, do most LW’ers think that it should be everyone’s position?
Good question. I haven’t conducted a poll. But more problematically, what does that “should” mean?
It could mean:
“Would everyone be better off if they were more rationalist?” — I think yes, they would, because they would be better equipped to change the world in directions positive for themselves, and for humanity in general. And I think that this notion is pretty strong in the LW-community. Aside from problems such as becoming a clever arguer, we expect that greater rationality should generally help people.
“Is it worthwhile for me to try to get everyone to be more rationalist?” — It isn’t clear to me how much influence over other people’s rationality I can directly have; although I haven’t really tried outside of the LW-community and my (already rather rationalist-friendly) workplace yet. I intend to support CFAR’s educational program, though.
“Would I benefit from treating people as more virtuous, trustworthy, or righteous if they agree with my position regarding rationality than if they don’t?” — No, I don’t really think so. Doing that sort of thing seems more likely to lead to Blue/Green political nonsense than to any beneficial result. (Although it sure is nice to hang out with / be friends with / date people who share some metaphysics and reference points ….)
If none of these, what did you mean by “should” there?
The difficulty is that “better” means different things to different people, and this is something we can’t ever forget.
Sure; however, some of those different things are compatible and others aren’t. Politics shows up when we have to deal with the incompatible ones.
I’m predisposed to like multiculturalism in a lot of ways: it’s pretty, it’s interesting, and it yields a wide range of social forms and cultural products such as food, music, and philosophy. It does pose serious problems, though, when different cultures hold incompatible views of things such as human rights, human dignity, or dispute resolution; when it’s used as an excuse to restrain people from leaving their local culture for another of their choosing; or when politically well-established cultures are valued far above less well-connected ones.
Rationality in the LW-sense doesn’t presume to tell anyone what their values should be, except insofar as they shouldn’t be self-contradictory. We have a strong notion of the complexity of human value and a healthy suspicion of anyone who tries to simplify it. (A fellow came to my local LW meetup recently and tried to argue that the only value worth speaking of is personal survival. I think “wide-eyed horror” would fairly describe the general reaction that idea received.)
But there’s a large gap between complexity and irreducibility.