I almost don’t dare to say this, but I think the downvoting is flawed because there is nothing to prevent politically incorrect comments from being downvoted, regardless of their validity. To give one concrete example, consider this comment:
http://lesswrong.com/lw/1ww/undiscriminating_skepticism/1r5v
All it provides is a link to a video with eyewitness testimony against a claim made in the corresponding post, and it is relentlessly downvoted (−17 as of the last count). On a website devoted to rationality, shouldn’t it be a rule that you can’t downvote evidence?
That comment was downvoted for engaging in a tired debate which had already been resolved to everyone else’s satisfaction. The prior for “The U.S. government will intentionally murder thousands of its own citizens in a fake terrorist attack” is very low; the abundant evidence (including hundreds of eyewitnesses) that, y’know, planes crashed into the towers makes it even more unlikely. One piece of cherry-picked evidence against the obvious facts was obviously insufficient to change anyone’s opinion (furthermore, it wasn’t even the point of the post), so it was downvoted.
For the same reason, I downvote “evidence” presented for various religions. The issue has been decided; no given piece of evidence is going to change anyone’s opinion, and rehashing the issue is a waste of everyone’s time. Political incorrectness has nothing to do with it.
No.
A comment might, for example, present evidence in a misleading way (e.g. by being selective in presenting evidence or by describing evidence in a way that nudges readers toward a particular false conclusion). I’d be against a rule against downvoting comments like that.
You raise a good point. But then who is to judge when evidence should be downvoted and when it shouldn’t? There is probably no easy answer to that. My point is: the voting system is flawed, and maybe we should rethink it or at least put certain safeguards in place.
How about the people doing the downvotes?
Brilliant comment.
This is quite vague. Do you have any specific safeguards or other suggestions in mind?
I doubt that being politically incorrect was the main reason that comment got downvoted.
For example, people might have believed for their own reasons that your comment was invalid. Or maybe they thought that, since the topic often leads to dead ends, the comment deserved a downvote, as though you knew this and headed into the dead end anyway. People also may have interpreted you as trying to actually change people’s minds, and thus felt disapproving or even insulted when you expected their minds to be changed by a small amount of evidence compared to what was already available.
And what proportion of the downvotes came in after you added, essentially, “I’m so disappointed in you all for downvoting this”? 50%?
Believe me or not, I’m trying to work on debiasing this site. I don’t know how I can draw attention to the issue better than by pointing to one specific example where a lot of people here, including top posters, have gone terribly wrong.
In all this discussion it might help to ask yourself: “There are a lot of smart people who disagree with me. Maybe they haven’t really gone terribly wrong. Maybe I’m wrong.”
The notion that top posters could be extremely wrong is not by itself implausible. I can without much effort think of multiple points where I strongly disagree with a variety of top contributors. Moreover, I can point out examples where top contributors apparently don’t agree.
But that’s not the same situation as this. This is every single major contributor, and a lot of other people besides, disagreeing with you.
There are occasions when arrogance in the face of popular opinion is healthy. But whenever one is in a situation that seems like one of those, one should ask whether one really is in that sort of situation. Take the outside view for a minute. Even if, after thinking about it, one decides “nope, they really are that wrong,” it might make sense, simply as a matter of rhetoric/Dark Arts/getting-people-to-maybe-listen, not to act so sure of oneself.
It does seem that your repeated activity on this matter on LW is not being helpful. So even if the above advice is not useful, it may still make sense to consider switching to a different tactic. One obvious tactic is to make specific predictions about the actual world that would be likely to come out differently if the federal government was involved in 9/11. I’ve added four predictions related to this to PredictionBook as examples: 1, 2, 3, 4. PredictionBook is of course the easy way to do this; there’s nothing at stake. The other tactic is to make actual bets over such predictions.
I am willing to take any of those predictions and make a bet with you over them in the range of a few hundred dollars. I’m also willing to negotiate a LongBet over them. If none of these predictions fit, then we can maybe discuss other possible bets along similar lines. This is precisely the sort of thing that will get us to actually listen: show that your beliefs pay rent, and even better, make a prediction that turns out to be correct and that the standard model of things predicts is very unlikely. That’s a way to get us to sit up and take notice.
So the hypothesis is that if a group of people hold a belief because of the absurdity heuristic rather than the evidence, then when you show them one piece of evidence that goes against their belief, they’ll rationalize something like “but what about all the other evidence?”
But if a group of people have a belief because of the evidence, and you show them one piece of evidence that goes against their belief, won’t they also say “but what about all the other evidence?”
This is problematic, since it means that “confronting” people is a bad way to get information about them, or even to give them information about themselves. Rather than zigging or zagging, try zogging. What sort of tests can you think of that differentiate well between people who are thoroughly biased and people who used evidence?
First of all, thanks for the constructive argument!
One that I thought of long ago is asking the basic question of rationality: “Why do you believe what you believe?” The result can be seen here: http://lesswrong.com/lw/5kz/the_5second_level/4c68 and here: http://lesswrong.com/lw/1ww/undiscriminating_skepticism/4c63
Needless to say, both questions were also ignored.
I don’t know what other tests could be performed, considering that the people in question are apparently not willing to participate in any.
Well, that’s still the “confronting” test. Given that people answered “because of the other evidence” in various places, either you’re wrong about people deciding irrationally, or people are rationalizing a lot (which would make it a non-discriminating test). What sort of test would discriminate between a rational-ish person and someone who originally chose because of some bias (“bias X”) and then rationalized, without requiring an annotated bibliography of all the evidence they have ever considered?
I don’t know, do you have any suggestion?
It is a waste of time.
The video may be tiny, tiny, tiny evidence against the claim made. The fact that this was the video someone chose as evidence against the claim is a good deal more significant, and is evidence for the claim that overwhelms the tiny, tiny evidence.
Yet it still is a waste of time to present small evidence in favor of conclusions people already believe, even more so than it would be to bring infinitesimally small evidence against such claims.
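The arithmetic behind “tiny evidence barely moves a strong belief” can be made concrete with a simple Bayesian odds update. The numbers below are purely illustrative assumptions (a hypothesis held at very low credence, and a piece of evidence only twice as likely under that hypothesis), not anyone’s actual probabilities:

```python
def update(prior, likelihood_ratio):
    """Bayes update in odds form: posterior odds = prior odds x likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Illustrative numbers only: a hypothesis at 0.0001 credence, updated on
# evidence twice as likely if the hypothesis is true than if it is false.
prior = 1e-4
posterior = update(prior, 2.0)
print(posterior)  # ~0.0002: the credence roughly doubles but stays tiny
```

Even a likelihood ratio of 2 only moves the credence from 0.01% to about 0.02%; against a sufficiently low prior, a single weak piece of evidence cannot do much work.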
http://lesswrong.com/lw/5kz/the_5second_level/44im
It’s −19 now, after my own downvote. I haven’t yet watched that YouTube video -- I downvoted it just for the obsession with counting your downvotes. Three-quarters of that post is you complaining about the downvotes. Sheesh.
And since you ask people in the same post NOT to upvote it, I think it will keep going down, as it has no way of going up.
I fixed that. I meant people who already downvoted it not to later upvote it.
I’m not complaining, just collecting data for the historical record.