I doubt that being politically incorrect was the main reason that comment got downvoted.
For example, people might have believed, for their own reasons, that your comment was invalid. Or they may have thought that, since the topic so often leads to dead ends, the comment deserved a downvote for knowingly heading into one anyway. People may also have read you as trying to actually change people's minds, and so felt disapproving or even insulted that you expected those minds to be changed by a small amount of evidence relative to what was already available.
And what proportion of the downvotes came in after you added, essentially, "I'm so disappointed in you all for downvoting this"? 50%?
Believe it or not, I'm trying to work on debiasing this site. I don't know how I can draw attention to the issue any better than by pointing to one specific example where a lot of people here, including top posters, have gone terribly wrong.
In all this discussion, it might help to ask yourself: "There are a lot of smart people who disagree with me. Maybe they haven't really gone terribly wrong. Maybe I'm wrong."
The notion that top posters could be extremely wrong is not by itself implausible. I can without much effort think of multiple points where I strongly disagree with a variety of top contributors. Moreover, I can point to examples where top contributors apparently disagree with one another.
But that's not the same situation as this one. Here, every single major contributor, and a lot of other people besides, disagrees with you.
There are occasions when arrogance in the face of popular opinion is healthy. But whenever one seems to be in such a situation, one should ask whether one really is. Take the outside view for a minute. Even if, after thinking it over, one decides "nope, they really are that wrong," it might make sense simply as a matter of rhetoric/Dark Arts/getting-people-to-maybe-listen not to act so sure of oneself.
It does seem that your repeated activity on this matter on LW is not helping. So even if the above advice isn't useful, it may still make sense to consider switching tactics. Two obvious tactics suggest themselves. The first is to make specific predictions about the actual world that would likely come out differently if the federal government was involved in 9/11. I've added four such predictions to PredictionBook as examples: 1, 2, 3, 4. PredictionBook is of course the easy way to do this, since nothing is at stake. The other tactic is to make actual bets over such predictions.
I am willing to take any of those predictions and make a bet with you over them in the range of a few hundred dollars. I'm also willing to negotiate a LongBet over them. If none of these predictions fit, we can discuss other possible bets along similar lines. This is precisely the sort of thing that will get us to actually listen: show that your beliefs pay rent, and better yet, make a prediction that the standard model of things says is very unlikely and that then turns out to be correct. That's a way to get us to sit up and take notice.
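Part of why bets work here is simple arithmetic: when two people assign sufficiently different probabilities to the same claim, there are odds at which both expect to profit, so refusing every such bet is itself informative. A minimal sketch (my own illustration, with hypothetical numbers, not figures from this thread):

```python
def expected_value(p_win, win_amount, lose_amount):
    """Expected payoff for a bettor who assigns probability p_win to
    winning, gaining win_amount if right and losing lose_amount if wrong."""
    return p_win * win_amount - (1 - p_win) * lose_amount

# Hypothetical: the proponent assigns p = 0.5 to the claim, the skeptic
# assigns p = 0.01 (i.e., 0.99 to its negation). At 9:1 odds on a $100 stake:
proponent_ev = expected_value(0.5, 900, 100)   # bets for the claim
skeptic_ev = expected_value(0.99, 100, 900)    # bets against the claim

print(proponent_ev)  # 0.5*900 - 0.5*100 = 400.0
print(skeptic_ev)    # 0.99*100 - 0.01*900 = approximately 90.0
```

Both expectations are positive, so by each party's own lights the bet is favorable; someone confident in their stated probability should therefore be willing to take some such bet.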
So the hypothesis is that if a group of people hold a belief because of the absurdity heuristic rather than the evidence, then when you show them one piece of evidence against that belief they'll rationalize something like "but what about all the other evidence?"
But if a group of people have a belief because of the evidence, and you show them one piece of evidence that goes against their belief, won’t they also say “but what about all the other evidence?”
This is problematic, since it means that "confronting" people is a bad way to get information about them, or even to give them information about themselves. Rather than zigging or zagging, try zogging. What sort of tests can you think of that differentiate well between people who are thoroughly biased and people who actually used the evidence?
First of all, thanks for the constructive argument!
One that I thought of long ago is asking the basic question of rationality, "Why do you believe what you believe?" The results can be seen here: http://lesswrong.com/lw/5kz/the_5second_level/4c68 and here: http://lesswrong.com/lw/1ww/undiscriminating_skepticism/4c63
Needless to say, both questions were also ignored.
I don’t know what other tests could be performed, considering that the people in question are apparently not willing to participate in any.
Well, that's still the "confronting" test. Given that people answered "because of the other evidence" in various places, either you're wrong about people deciding irrationally, or people are rationalizing a lot (which would make it a non-discriminating test). What sort of test would discriminate between a roughly rational person and someone who originally chose because of some bias ("bias X") and then rationalized, without requiring an annotated bibliography of all the evidence they've ever considered?
I don’t know, do you have any suggestion?