For the last several years, I’ve known people who’ve submitted articles to the EA Forum or LessWrong and found the culture on these sites to be pretty hostile to the kinds of views they were presenting (different people have different opinions about the patterns of hostility toward different views on the EA Forum and LW, respectively). What the particular views are doesn’t matter, because it’s been all kinds of views. What’s common between them is a perception that an article they wrote, which they believe to be quite good, was “poorly received” on the site, despite having a positive and non-trivial number of upvotes. Now, none of the articles I’m thinking of had what I would call a high number of upvotes, but it was enough that at least several people had read the article, and a majority of those who voted had upvoted it. As a proportion of those who voted, that means a very significant minority, typically between 30% and 45%, disagreed with or disliked the article enough to downvote it (e.g., a post sitting at +3 karma from 8 upvotes and 5 downvotes means roughly 38% of voters downvoted it).
So, at worst, these are articles that were not very well received. Unless I’m missing something, an article with a positive number of upvotes should be interpreted as being at least somewhat well received by its readers. If someone thinks an article on the EA Forum or LW has received too many downvotes, or too few upvotes, because there is something wrong with the general culture of the respective membership bases of these sites, that is one argument. Yet it is a different argument from the one I typically see made by those who complain about the ratio of upvotes to downvotes they receive on the EA Forum or LessWrong. They say that, because their articles didn’t receive the strong, consistent, vocal support they would have liked, the reception was overwhelmingly and uniformly negative. This is in spite of the fact that they received more positive than negative feedback, even if only on the measure of karma. In other words, it’s my observation that a lot of people who complain as if they’ve gotten uniformly, overwhelmingly, and inappropriately negative feedback on their articles have false impressions of how those articles were actually received.
The common tendency I see in articles like this is that there is often a high proportion of comments that disagree with or criticize the OP, and these comments often receive more upvotes than the OP itself. So, it seems that what people are really upset about, when their articles on the EA Forum or LW receive merely a lukewarm reception rather than an overwhelmingly negative one, is that while a majority of the community at least weakly supports them, a significant minority is more willing to strongly and vocally disagree with, criticize, or oppose them.
It seems to me one solution, to move to a better equilibrium of discussion, would be for users who agree with an original article, but can make the arguments for its thesis better than the original author, to write their own comments and articles doing so. There is a fear of politics among some rationalists, which stigmatizes discussions that might appear to heighten tensions between different groups within effective altruism and/or the rationality community. It’s my impression that in these communities there is also a culture of letting an individual’s words and ideas stand on their own, and of not attributing the arguments of one individual on one side of a debate to everyone on that side. So, it strikes me as unlikely that the vast majority of effective altruists or LessWrongers would care enough to join an interpersonal or internecine online disagreement to the point where the community at large would need to concern itself with quelling it.
Of course, one of the reasons LW and the EA Forum use karma scores as they do is so that the discourse on these fora can be shaped in a way satisfying to most users, without us all getting bogged down in endless debates about the state of the discourse itself. At least, that is what I’ve taken the karma systems of LW and the EA Forum to be largely about, and why I have what I believe is an appropriate level of respect for them. They’re certainly not everything one should take into account when evaluating the quality of an article on LW or the EA Forum. Yet I don’t think they should count for little, or nothing, which is sometimes the reaction I see from EA Forum or LW members who aren’t satisfied with the proportion and kind of positive feedback their article receives, especially relative to the negative feedback it receives.
These are my impressions as a long-time user of both the EA Forum and LessWrong. I was just wondering what other people’s thoughts on the subject are; whether they’re similar to, or different than, mine; and why.
For myself, I find there is something really disheartening about presenting ideas in a post, seeing it downvoted, and then learning in the comments that it was downvoted because someone disagreed with you: they didn’t like your conclusions, or what they thought you wrote, or what they feared was implied by what you wrote, rather than because you did something that worked against the broader conversations we are trying to have around here. It’s very hard to think things through for yourself and say things that might be wrong, and when voting makes people feel punished for doing that, I think it creates incentives that reduce exploration of ideas in favor of refinement of existing ideas (basically a move toward scholasticism).
As you note, it’s especially frustrating when a comment on your post gets more votes than your post did, and that comment is a poorly reasoned objection to your post or a refutation of a strawman version of what you said. For me this often happens when I attempt to lay out some complex, nuanced idea that was going to be difficult to explain no matter how I did it, and then someone objects to what I perceive to be a simplified, rough version of it. And when that objection gets a lot of upvotes, it feels like a strong signal that all attempts to say things that aren’t simple extensions of what people already believe will be rejected (I especially feel this way because my recollection is that the worst of these cases usually involve someone hitting rationalist applause lights and then getting a lot of “applause” for doing so without actually saying much of anything).
None of this is to deny that I could be a better writer or have better ideas, or to say that I don’t want critical engagement with my writing; only to say that it stings when you see voting patterns that feel more like boos and yays than like some recognition of what is most worth engaging with on the site. I would be pretty happy if someone commented on my posts raising well-thought-out objections or asking clarifying questions that led me to realize I was mistaken, and that comment got a lot of upvotes, rather than seeing the upvotes go to something that feels like it’s just scoring points against me and doesn’t really try to engage with me or my ideas. I suspect that it’s only thanks to my now strong psychological resilience that I keep on posting on LW, and I worry about who else is being silenced because they don’t want to subject themselves to the harsh judgement of the crowd.
If you never get downvoted, you’re not being contrarian enough.
I sort of agree, but this holds the voting system constant and assumes you’re making tradeoffs along the efficiency frontier. There are probably ways to pull the voting system sideways so that you can optimize for more of what you care about, which is currently only captured by this notion of “contrarianism” that exists as a result of compressing ourselves down into a simple, generalized up/down vote system.
This is still good advice, though, with respect to the current system.
Yeah. Upvotes and downvotes act as reward and punishment, respectively. So the problem with voting to express agreement or disagreement is that you are rewarding people for expressing common views and punishing them for expressing uncommon views, which can lead to an echo chamber.
But it’s still valuable to know whether people agree or disagree! So I suspect the ideal voting system would separate out the “more of this”/”less of this” axis from the “agree”/”disagree” axis. You could have people fill out text boxes anonymously to explain their “more of this”/”less of this” votes, then do text clustering once you had enough filled-out text boxes, then figure out the top 10 reasons people choose “more of this”/”less of this” and replace the text boxes with dropdowns. To guard against misuse, you could weight dropdown selections from users who tend to agree with trusted moderators more heavily.
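To make that concrete, here is a minimal sketch of what recording and tallying the two axes separately, with trust-weighted votes, might look like. Everything here is hypothetical (the `Vote` record, the `tally` function, the trust weights); it is not how LW or the EA Forum actually implement karma, just one way the idea could be expressed in code.

```python
from dataclasses import dataclass
from collections import Counter

# Hypothetical two-axis vote: a quality signal ("more of this" / "less of
# this") recorded separately from an agreement signal, plus an optional
# free-text reason that could later be clustered into dropdown options.

@dataclass
class Vote:
    voter: str
    quality: int      # +1 = "more of this", -1 = "less of this"
    agreement: int    # +1 = agree, -1 = disagree, 0 = no opinion
    reason: str = ""  # anonymous explanation

def tally(votes, trust_weights=None):
    """Tally the two axes separately.

    trust_weights maps a voter to a weight, e.g. derived from how often
    their past votes matched trusted moderators'. Defaults to 1.0 each.
    """
    trust_weights = trust_weights or {}
    quality_score = sum(v.quality * trust_weights.get(v.voter, 1.0) for v in votes)
    agreement_score = sum(v.agreement * trust_weights.get(v.voter, 1.0) for v in votes)
    top_reasons = Counter(v.reason for v in votes if v.reason).most_common(10)
    return quality_score, agreement_score, top_reasons

# Example: a post most voters want more of, even though opinion on its
# thesis is split -- the case a single karma number collapses.
votes = [
    Vote("alice", +1, +1, "novel argument"),
    Vote("bob",   +1, -1, "clear, but I think the conclusion is wrong"),
    Vote("carol", +1, -1, "well written"),
    Vote("dave",  -1, -1, "rehashes old points"),
]
print(tally(votes, trust_weights={"dave": 0.5}))
```

The point of the example is that the same set of votes can come out positive on the “more of this” axis while being negative on the “agree” axis, which is exactly the distinction a single up/down score can’t express.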
For me, accumulated karma is mostly an indicator of how long someone’s been here and how much they’ve participated. Common use seems to be mostly upvote; downvotes aren’t rare, but a pretty neutral comment is likely to get 2-10 karma, and only a pretty bad one gets into the negative range. And posters who routinely get downvoted (for whatever reasons) likely either change or leave, so there’s a strong selection toward an expectation of more upvotes than downvotes.
I find the karma changes on my comments somewhat useful: mostly they indicate how popular the post I’m commenting on is, but secondarily they give me a sense of whether I’m commenting on the points that most readers find salient in the post.
I’ll admit that votes carry more emotional weight than I want them to—I know they’re meaningless internet points, and a rather noisy signal of popularity, but it still feels nice when something gets more upvotes than normal, and hurts a bit when I’m downvoted.
You make a good point I forgot to add: karma on an article or comment also serves to provide information to other users, not just to the user who submitted it. That’s something people should keep in mind.