But you don’t reject the hypothesis that “Barack Obama is a woolly mammoth” because it’s absurd—nobody has seriously presented it. If someone had a reason to seriously present it, then I’d not dismiss it out of hand—if only because I was interested enough to hear it in the first place, so I would want to see whether the speaker was making a clever joke, or perhaps needed immediate medical care. As EY might say, merely noticing a hypothesis is unlikely enough in the first place that you should probably pay some attention to it, if the speaker is one of the people you listen to. cf. Einstein’s Arrogance
Funny thing is that someone actually did.
David Icke thinks Barack Obama and many other prominent politicians are reptiles and that there’s a reptilian conspiracy going on. He has written many books about this, and seems to take all of that pretty seriously. Should I be reading his books, instead of something that is more likely to be true?
Imagining that someone “had a reason to seriously present” the Obama-Mammoth hypothesis is to make the hypothesis non-absurd. If there is real evidence in favor of the hypothesis, then it is obviously worth considering. But that is just fighting the example; it doesn’t tell us much about the actual line between absurd claims and claims that are worth considering.
In the world we actually inhabit, an individual who believed that they had good reasons to think that the president was an extinct quadruped would obviously be suffering from a thought disorder. It might be interesting to listen to such a person talk (or to hear a joke that begins with the O-M Hypo), but that doesn’t mean that the claim is worth considering seriously.
“If someone had a reason to seriously present it, then I’d not dismiss it out of hand”
But they don’t.
Calling something “absurd” doesn’t mean “I have so much evidence to the contrary that I’m not going to even consider any evidence”. It seems more like a synonym for “there is no evidence for that, and the possibility space is large”.
“If someone had a reason to seriously present it, then I’d not dismiss it out of hand”
It’s important to note that no one has. You can’t update on fictitious evidence.
In the (unlikely) case that something improbable ends up with strong evidence backing it, it becomes probable whether or not it was called “absurd”. Until then, we dismiss it because it’s absurd.
I think that absurdity, in this sense, is just an example of Occam’s Razor / Bayesian rationality in practice. If something has a low prior, and we’ve no evidence that would make us raise our probability estimates, then we should believe that the idea probably isn’t true.
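The odds form of Bayes’ theorem makes this concrete. A toy sketch (the numbers are purely illustrative, not estimates of anything): an “absurd” hypothesis is one with a vanishingly small prior, so with no evidence the posterior stays negligible, but a strong enough likelihood ratio raises it to probable regardless of the label.

```python
def posterior(prior, likelihood_ratio):
    """Posterior probability of a hypothesis after seeing evidence E,
    using the odds form of Bayes' theorem.
    likelihood_ratio = P(E | H) / P(E | not-H)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# An "Obama is a mammoth"-grade prior (illustrative number):
absurd_prior = 1e-9

# No evidence (likelihood ratio 1): the posterior equals the prior,
# so dismissing the claim is just Bayesian updating doing nothing.
print(posterior(absurd_prior, 1))      # ~1e-9

# Overwhelming evidence (ratio 1e12): the posterior becomes ~0.999,
# whether or not anyone ever called the hypothesis "absurd".
print(posterior(absurd_prior, 1e12))
```

The point of the sketch is the one made above: “absurd” does double duty for “low prior and no evidence on offer”, and the verdict flips automatically once real evidence arrives.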
I’ve always assumed that the absurdity bias was a tendency to do something slightly different. In this context, absurdity is a measure of how closely an idea conforms to our usual experiences. It’s a measure of how plausible an idea feels to our gut. By this definition, absurdity is being used as a proxy for “low probability estimate, rationally assigned”.
It’s often a good proxy, but not always.
Or perhaps another way to put it: when evidence seems to point to an extremely unlikely conclusion, we tend to doubt the accuracy of the evidence. And the absurdity bias is a tendency to doubt the evidence more thoroughly than ideal rationality would demand.
(Admission: I’ve noticed that I’ve had some trouble defining the bias, and now I’m considering the possibility that “absurdity bias” is a less useful concept than I thought it was).