But Eliezer has me swayed on that one. Now I’ll downvote, even though I am, indeed, very uncertain of my own ability to correctly judge whether a post deserves to be downvoted or not.
I disagree, I don’t think you should downvote what you don’t understand. This will only pull the discussion to the level of the least competent people.
If people downvote what they don’t understand, and it’s a good comment, then it should still end up with more upvotes than downvotes, provided most people understand it. If it has more downvotes than upvotes in that scenario, then it was not explained well enough for the majority of readers.
These are generalizations, of course, and depend largely on actual voting habits. But so was the note that it will pull the discussion to the level of the ‘least competent people’ - possibly the same observation could be stated as pulling the discussion to the level of the majority of the readership.
That was my first idea. But I am not the only player here. I know I overcompensate for my uncertainty, and so I tend never to downvote anything. Other people may not share that attitude, for either down- or upvoting. Who are they? Are their opinions more educated than mine? If all of us whose opinions are actually valuable are too scrupulous to vote, then our occasional votes may end up drowned in a sea of poorly decided, hastily cast ones.
Besides, I am still only going to downvote if I can think of a good reason to do so. Sometimes I have a good reason to downvote, but still no good way, or even no time, to reply to every idea I think needs a fix, or that is simply irrelevant to the current debate.
You are trying to fight fools with your intuition. How much confidence do you place in it? Is your intuition more informed than the decisions of average voters? Hard to say; I wouldn’t be so sure about that compound statement. It only becomes clear where you know yourself to be competent or ignorant, above or below the “average voter”. At least abstaining from voting has clear semantics: you don’t introduce your judgment at all. On the other hand, in many cases it should be easy to recognize poor quality.
I don’t place any confidence in my intuition as a general, indiscriminately good-for-everything tool. I try to have confidence only on a case-by-case basis. I try to pay attention to every potential bias that could skew my opinion, like anchoring, and try not to pay attention to who wrote what I’m voting upon. Then I have to have a counterargument. Even if I don’t elaborate it, even if I don’t write it down, I have to know that, given the time or motivation, I could reply and say what was wrong or right in that post.
My decisions and arguments may or may not be more informed than those of the average voter. But if I add my own to the pool of votes, we get a new average, which will be only slightly worse or slightly better. Could we try to adapt something from decision markets here? The way they’re supposed to self-correct, under the right conditions, makes me wonder if we could find a solution in them.
And maybe someone could create an article collecting everything that could help people cast more informed votes on LW; that’d help too. Things like the biases they’d have to take into account, tools like the antikibitzer, or links to articles such as the one about Aumann voting, or this very one.