Great post. This sort of perspective is something that I’d definitely like to see more of on LessWrong.
Thanks. You may be interested to know that I originally considered titling this post “Being Told You’re Wrong Is the Mind-Killer.”
Personally, I’m glad you decided not to.
I agree, mind-killer is too much of an applause light (“applause light” is an applause light) these days.
Actually, the reason for that title was a point I decided to leave out of the post, but I may as well spell it out here: “Deciding to talk about politics, even though this may cause you to lose some of your audience” and “Deciding to tell people they’re wrong, even though this may cause you to lose some of your audience” are both tradeoffs, and it’s odd that LessWrong community norms go so far in one direction on one tradeoff and so far in the other direction on the other (at least with regard to certain subjects).
I suspect the reason for this mostly has to do with Eliezer thinking politics are not very important, but also thinking that, say, telling certain people their AI projects are dangerously stupid is very important. But not everyone agrees, and the anti-politics norm is itself a barrier to talking about how important politics are. (Personally, I suspect government action will be important for the future of AI in large part because I expect large organizations in general to be important for the future of AI.)
Yeah, I saw the parallel there. I more or less think that both talking about politics and explicitly telling people that they’re wrong are usually undesirable and that LessWrong should do neither.
I also agree with you that government action could be important for the future of AI.
Telling people they are wrong is almost explicitly what rationality is about, but we should definitely think about how to do it. If I’m wrong, I want to know that, and there’s a clear benefit in people telling me.
I don’t see any clear benefit in discussing politics here, so I’m not even sure what the tradeoff is. It’s not that politics are not important, but that there’s not much we can do about them.
I’d be very interested in a post explaining why discussing politics is more important than other things for this rather small rationalist community, not just why politics is important in general.
I’m not sure he has bluntly told anyone to their face that their AI project is dangerously stupid. I think he’s saying these things to educate his audience, not to change his opponents’ minds.
That government action could be important for the future of AI, I might agree with, but it doesn’t justify talking about other political topics. This particular topic also wouldn’t be a mind-killer, since it’s not controversial here and any policies regarding it are still distant hypotheticals.
Well...
I see. I rather suspect that person wasn’t all that important, nor was the audience at that dinner party, but maybe that’s just wishful thinking. I also suspect he’s learned some social skills over the years.
In the comments, he makes it clear that he held the “losing an argument is a good thing; it’s on you if you fail to take advantage of it” position. He may no longer feel that way.
I see :)
I have had experience as a moderator at a science forum, and I can tell you that almost all of our moderating involved either A) the politics subforum or B) indirect religious arguments, especially concerning evolution (the religion subforum was banned before my time due to an impossibly high need for moderation). The rest was mostly the better trolls, and people getting frustrated when someone wouldn’t change their mind on an obvious thing.
However, I must say I don’t see how people can discuss rationality and how people fail at it without someone telling someone else that they’re wrong. After all, the major aspect of rationality is distinguishing correct from incorrect.
Incidentally, I’ve been really impressed by the quality of comments and users on this site. Consider what this user has observed about LW before you complain about how politics is not allowed.
I think you accidentally went up one meta level.
If you read the “I agree” as sarcastic, then it looks like the right meta level. (I’m not sure it’s a good thing I thought that was more plausible than the accident hypothesis when I first parsed that sentence.)
Not sarcasm (although now that you mention it, I can definitely see it), just well-intentioned humour. See the other comment :)
As I was writing the comment, I realized “applause light” is an applause light too, so I decided to make fun of that.