Here’s the main thing that bothers me about this debate. There’s a set of many different questions involved:

- the degree of past and current warming;
- the degree to which such warming should be attributed to humans;
- the degree to which future emissions would cause more warming;
- the degree to which future emissions will happen, given different assumptions;
- what good and bad effects future warming can be expected to have, at different times and given what assumptions (specifically, what probability we should assign to catastrophic and even existential-risk damage);
- what policies will mitigate the problem, how much, and at what cost;
- how important the problem is relative to other problems;
- what ethical theory to use when deciding whether a policy is good or bad;
- how much trust we should put in different aspects of the process that produced the standard answers to these questions, and in alternatives to the standard answers.

These are questions that empirical evidence, theory, and scientific authority bear on to different degrees, and a LessWronger ought to separate them out as a matter of habit. Yet even here some vague combination of all of them tends to get mashed together into the single vague question of whether to believe “the global warming consensus” or “the pro-global warming side”, to the point where, when Stuart says some class of people is more irrational than theists, I have no idea if he’s talking about me. If the original post had said something like, “everyone whose median estimate of climate sensitivity to doubled CO2 is lower than 2 degrees Celsius is more irrational than theists”, I might still complain about it falling afoul of anti-politics norms, but at least it would help create the impression that the debate was about ideas rather than tribes.
I really like this place. What a relief to have a cogent and rational comment about the global warming debate, and how encouraging to see it lavished with a pile of karma.

And if we’re going to talk on the level of tribes anyway then at least use reasoning like this.

Very nice reasoning in that post. But this is Less Wrong! We’re aiming for the truth, not for some complicated political position.
how important the problem is relative to other problems, what ethical theory to use when deciding whether a policy is good or bad
Apart from those two issues, the other points you bring up are the domain of experts. Unless we are experts ourselves, or have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs. We can widen the uncertainty somewhat (we can confidently expect overconfidence :-), maybe add a very small systematic bias in one direction (to reflect possible social or political biases—the correction has to be very small as our ability to reliably estimate these factors is very poor).
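To make that correction concrete, here is a minimal sketch in Python of what “defer, but widen and nudge” might look like for a single expert estimate. All the numbers (the expert’s distribution, the widening factor, the bias term) are illustrative assumptions of mine, not estimates from any source:

```python
from scipy.stats import norm

# Hypothetical expert estimate: climate sensitivity of 3.0 degrees with
# a stated standard deviation of 1.0 degree (illustrative numbers only).
expert_mean, expert_sd = 3.0, 1.0

# Widen the uncertainty, since we can confidently expect overconfidence.
WIDENING = 1.25   # assumed inflation factor

# Add a very small systematic shift for possible social or political
# bias; kept tiny because our ability to estimate it is very poor.
BIAS = -0.1       # assumed; both sign and size are guesses

adjusted = norm(loc=expert_mean + BIAS, scale=expert_sd * WIDENING)

# Our post-adjustment credence that sensitivity is below 2 degrees:
print(adjusted.cdf(2.0))  # ~0.24 with these illustrative numbers
```

The point of the sketch is the shape of the procedure, not the numbers: almost all of the work is still being done by the expert’s own estimate.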
I might still complain about it falling afoul of anti-politics norms, but at least it would help create the impression that the debate was about ideas rather than tribes.
Excessive anti-politics norms are a problem here—because the issue has become tribalised, we’re no longer willing to defend the rational position, or we caveat it far too much.
Unless we [...] have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs.
Well, yes, but the very fact that a question has strong ideological implications makes it highly probable that experts are biased about it. (I argued this point at greater length here.)
Presumably most of those whose opinions fall outside whatever the acceptable range is have those opinions either because they believe they have some relevant piece of expertise, or because they believe they have some relevant information about the biases of specific experts, or because they don’t believe that their ability to estimate systematic bias is in fact “very poor”, or even because they disagree with you about what the experts think. This is the sort of information people might falsely convince themselves they have. But once we move beyond relatively narrow and technical questions like attribution and sensitivity to broader questions like policy, where expert consensus becomes harder to characterize and many different fields become relevant (including futurism and the rational aggregation of evidence and weighing of considerations, at which many LessWrongers are probably better than most domain experts), the possibility that they’re right is surely not so preposterous that we can hold it up as a stronger rationality test than theism.
You’re right of course—having policy niggles or disagreements is not a good sign of irrationality. But the harder the science gets, the more disagreement becomes irrational. And I’ve seen people cycle through “global warming isn’t happening” to “it’s happening but it’s natural” to “it’s man-made but it’ll be too expensive to do anything about it” in the course of a single conversation, without seeming to realise the contradictions (I’ve seen theists do the same, but this was worse).
So yes, mild anti-AGW (or anti-certain AGW policy ideas) is not a strong sign of irrationality, but I’d argue that neither is mild theism.
The irrational thing, and I see it often, is people who believe “nothing should be done about global warming and therefore at least one of the questions above has an answer of ‘none’”. Obviously, they don’t use those words. But when someone switches freely from “the earth isn’t getting warmer” to “the fact that the earth is getting warmer is part of a natural climate cycle” there’s something wrong with that person’s thinking.
Wouldn’t denial of AGW equate to one of the following beliefs?
1. Climate sensitivity to doubled CO2 is zero, less than zero, or so poorly defined that it could straddle either side of zero, depending on the precise definition.
2. Increased levels of CO2 have nothing to do with human activity.
Anyone who believes ~1 and ~2 (that is, who rejects both claims) must believe in some degree of AGW, even if they further believe it is trivial, or masked by natural climate variations.
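Spelling out the logic (a small formalization of the two numbered claims; the symbols S, for sensitivity to doubled CO2, and H, for “the CO2 rise is substantially human-caused”, are mine, not the commenter’s):

```latex
\neg 1 \wedge \neg 2 \;\equiv\; (S > 0) \wedge H
\;\Longrightarrow\; \text{some nonzero human contribution to warming (AGW).}
```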
Belief that sensitivity is below 2 degrees doesn’t seem utterly unreasonable, given that the typical IPCC estimate is 3 ± 1 degrees, the confidence interval is not more than 2 sigma either way (“likely” rather than “very likely” in IPCC parlance), and building that confidence interval involves conditioning on lots of different sorts of evidence. Belief that sensitivity is below 1 degree does seem like having an axe to grind.
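As a back-of-the-envelope check on that intuition, here is what those two positions look like under a normal approximation, N(3, 1). This is purely illustrative: the actual IPCC distribution is skewed, with a fatter tail toward high sensitivities, so treat these tail probabilities as a sketch rather than assessed values:

```python
from scipy.stats import norm

# Normal approximation to the sensitivity estimate: mean 3.0, SD 1.0.
p_below_2 = norm.cdf(2.0, loc=3.0, scale=1.0)
p_below_1 = norm.cdf(1.0, loc=3.0, scale=1.0)

print(p_below_2)  # ~0.16: a one-sigma tail; a minority view, not an absurd one
print(p_below_1)  # ~0.02: a two-sigma tail; much harder to defend
```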
All this is Charney or “fast feedback” sensitivity. The biggest concern is the growing evidence that ultimate (slow feedback) sensitivity is much bigger than Charney (at least 30% bigger, and plausibly 100% bigger), and that there are carbon cycle and other GHG feedbacks (like methane), so the long-run impact of AGW includes much more than our own CO2 emissions. Multiplying all the new factors together turns a central estimate of 3 degrees into a central estimate of more than 6 degrees, and then things really do look very worrying indeed (temperatures during the last ice age were only 5–6 degrees less than today; at temperatures 6 degrees more than today there have been no polar ice caps at all).
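Spelling out the multiplication (a sketch only: the 1.3 to 2.0 range is the slow-feedback multiplier quoted above, and f is my stand-in for the combined carbon-cycle and other-GHG multiplier, which isn’t quantified here):

```latex
3\,^{\circ}\mathrm{C}
\times \underbrace{(1.3 \text{ to } 2.0)}_{\text{slow feedbacks}}
\times \underbrace{f}_{\text{carbon cycle, CH}_4}
> 6\,^{\circ}\mathrm{C}
\quad \text{for any } f > 1 \text{ at the high end, or } f \gtrsim 1.5 \text{ at the low end.}
```

So “more than 6 degrees” as a central estimate requires either the high end of the slow-feedback range or a substantial carbon-cycle contribution; the low end of the slow-feedback range alone does not get there.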