Saying that people should not care about social dynamics and only about object-level arguments is a failure of world-modelling. People do care about social dynamics; if you want to win, you need to take that into account. If you think that people should act differently, well, you are right, but the people who count are the real ones, not those who live in your head.
Incentives matter. On today's LessWrong, the threshold of quality for having your ideas heard (rather than everybody ganging up on you to explain how wrong you are) is much higher for people who disagree with Eliezer than for people who agree with him. Unsurprisingly, that means people filter what they say at a higher rate if they disagree with Eliezer (or, honestly, any other famous user, including you).
I wondered whether people would take away the message that “The social dynamics aren’t important.” I should have edited to clarify, so thanks for bringing this up.
Here was my intended message: The social dynamics are important, and it’s important to not let yourself be bullied around, and it’s important to make spaces where people aren’t pressured into conformity. But I find it productive to approach this situation with a mindset of “OK, whatever, this Eliezer guy made these claims, who cares what he thinks of me, are his claims actually correct?” This tactic doesn’t solve the social dynamics issues on LessWrong. This tactic just helps me think for myself.
So, to be clear, I agree that incentives matter, and I agree that incentives are, in one way or another, bad around disagreeing with Eliezer (and, to lesser extents, with other prominent users). I infer that these bad incentives spring both from Eliezer's condescension and rudeness and from a range of other failures.
For example, if many people aren't just doing their best to explain why, on their best guess of the facts, they agree with Eliezer, and are instead "ganging up" and rederiving the bottom line of "Eliezer has to be right," then those people are failing at rationality.
"...or, honestly, any other famous user, including you."
For the record, I welcome disagreement from any thoughtful commenter, for whatever small amount that reduces the anti-disagreement social pressure. I don't negatively judge people who make good-faith efforts to disagree with me, even if I think their points are totally mistaken.