The overall framework is sensible, but I have trouble applying it to the most vexing cases: those where the respected elites mostly just giggle at a claim and refuse to even think about reasons for or against it, instead confidently rejecting it. It might seem to me that their usual intellectual standards would require that they engage in such reasoning, but the fact that they do not consider that appropriate in this case is evidence of something. But what?
I think it is evidence that thinking about it carefully wouldn’t advance their current concerns, so they don’t bother, or they use the thinking/talking for other purposes. Here are some possibilities that come to mind:
they might not care about the outcomes that you think are decision-relevant and associated with your claim
they may care about the outcomes, but even if you found out the truth about the claim, it might not actually be decision-relevant
it may be a claim for which careful thought wouldn’t contribute enough additional evidence to change your probability of the claim enough to change decisions
it may be that you haven’t framed your arguments in a way that suggests to people that there is a promising enough path to getting info that would become decision-relevant
it may be explained by a signalling hypothesis of the sort you would come up with: if you’re talking about the distant future, maybe people mostly talk about such stuff as part of a system of behavior that signals support for certain perspectives. If this happens more in this kind of case, it may be in part because of the other considerations above.
In general I think that the overall framework suffers from the (very big, IMHO) problem of ignoring the interests of various people and the consequent incentives. People in real life are not impartial beings of pure rationality: they will and routinely do filter the evidence, misrepresent it, and occasionally outright invent it, all for the purpose of advancing their interests. And that’s on top of people sincerely believing what is useful/profitable for them to believe.
Evidence of incentives working, I think.