how important the problem is relative to other problems, and what ethical theory to use when deciding whether a policy is good or bad
Apart from those two issues, the other points you bring up are the domain of experts. Unless we are experts ourselves, or have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs. We can widen the uncertainty somewhat (we can confidently expect overconfidence :-), maybe add a very small systematic bias in one direction (to reflect possible social or political biases—the correction has to be very small as our ability to reliably estimate these factors is very poor).
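To make that adjustment concrete, here is a minimal sketch (my own illustration, not part of the original comment), assuming we model an expert's estimate as a Gaussian: widen the standard deviation to reflect expected overconfidence, and shift the mean by a very small amount for possible systematic bias. All the numbers are placeholders.

```python
import numpy as np

# Hypothetical example: an expert's central estimate and stated uncertainty
# (e.g. for climate sensitivity, in degrees C). These values are illustrative.
expert_mean = 3.0
expert_sd = 1.5

# Adjustments described above: inflate the uncertainty somewhat to account for
# expected overconfidence, and apply a very small systematic shift, since our
# ability to estimate social or political bias reliably is poor.
inflation = 1.2
bias_shift = 0.05

adjusted_mean = expert_mean + bias_shift
adjusted_sd = expert_sd * inflation

# Sample from the adjusted belief distribution to see the effect.
samples = np.random.normal(adjusted_mean, adjusted_sd, size=10_000)
print(f"adjusted belief: mean={samples.mean():.2f}, sd={samples.std():.2f}")
```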
I might still complain about it falling afoul of anti-politics norms, but at least it would help create the impression that the debate was about ideas rather than tribes.
Excessive anti-politics norms are a problem here—because the issue has become tribalised, we’re no longer willing to defend the rational position, or we caveat it far too much.
Unless we [...] have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs.
Well, yes, but the very fact that a question has strong ideological implications makes it highly probable that experts are biased about it. (I argued this point at greater length here.)
Presumably most of those whose opinions fall outside whatever the acceptable range is hold those opinions either because they believe they have some relevant expertise, or because they believe they have some relevant information about the biases of specific experts, or because they don’t believe that their ability to estimate systematic bias is in fact “very poor”, or even because they disagree with you about what the experts think. This is the sort of information people might falsely convince themselves that they have. But once we’re no longer looking only at relatively narrow and technical questions like attribution and sensitivity, and instead at broader questions like policy, where expert consensus becomes harder to characterize and many different fields become relevant (including futurism and the rational aggregation of evidence and weighing of considerations, which many LessWrongers are probably better at than most domain experts), the possibility that they’re right is surely not so preposterous that we can hold it up as a stronger rationality test than theism.
You’re right of course: having policy niggles or disagreements is not a good sign of irrationality. But the harder the science gets, the more disagreement becomes irrational. And I’ve seen people cycle through “global warming isn’t happening” to “it’s happening but it’s natural” to “it’s man-made but it’ll be too expensive to do anything about it” in the course of a single conversation, without seeming to realise the contradictions (I’ve seen theists do the same, but this was worse).
So yes, mild AGW scepticism (or opposition to certain AGW policy ideas) is not a strong sign of irrationality, but I’d argue that neither is mild theism.