My phrasing “when it’s harder, not just rational when it’s easy” was poor. Let me make my points another way.
First of all, do you believe that “But as with the problem of global warming and its known solutions, what we lack is the will to change things” is incorrect? Because I’ve seen very few people objecting, just people arguing that “other people” may find the phrasing disturbing, or objectionable, or whatever. If you object, say so. The audience is the Less Wrong crowd; if they reject the rest of the post over that one sentence, then what exactly are they doing at this website?
Parallel universes require a long meta-explanation before people can even grasp your point, and, more damningly, they are rejected by experts in the field. Yes, the experts are most likely wrong, but it takes a lot of effort to see that. If someone says “I don’t believe in Everett branches and parallel universes”, I don’t conclude they are being irrational, just that they haven’t been exposed to all the arguments in excruciating detail, or are following a (generally correct) “defer to the scientific consensus” heuristic.
But if someone rejects the global warming consensus, then they are being irrational, and this should be proclaimed, again and again. No self-censorship because some people find it “controversial”.
I am not very good at estimating probabilities, but I would guess: 99% that global warming is happening; 95% that the human contribution is very significant; 95% that in a rational world we could reduce the human contribution, though not necessarily to zero.
Climate change also requires some investigation. As an example, I have never studied anything remotely similar to climatology, and I have no idea who the experts in the field are. (I could find out, but I have limited time and different priorities.) People are giving me all kinds of data, much of it falsified, and I don’t have the background knowledge to tell the difference. So basically, in my situation, all I have is hearsay, and it is simply my decision whom to trust. (Unless I want to ignore my other priorities and invest a lot of time in this topic, which has no practical relevance to my everyday life.)
Despite all this, over the years I have done some intuitive version of probabilistic reasoning; I have unconsciously noticed that some things correlate with others (for example: people who are wrong when discussing one topic have a somewhat higher probability of being wrong when discussing another topic; some styles of discussion are somewhat more likely to be used by people who are wrong; etc.), so gradually my model of the world started strongly suggesting that “global warming is happening” is a true statement. Yet it is all very indirect reasoning on my part, so I can understand how a person just as ignorant about this topic as me could, with some probability, come to a different conclusion.
No one is perfectly rational, right? People make all kinds of transgressions against rationality, and “rejecting the global warming consensus” seems to me like a minor one, compared with the alternatives. Such a person could still be in the top percentile of rationality, mostly because humans generally are not very rational.
Anyway, the choice (at least as I see it) is not between “speak about global warming” and “not speak about global warming”, but between “speak about global warming in a separate article, with arguments and references” and “drop the mention in unrelated places, as applause lights”. Some people consider the latter approach bad even when the topic is theism, which in my opinion is a transgression against rationality a hundred times larger.
Writing about global warming is a good thing to do, and it belongs on LW; avoiding it would be bad. It just should be done in a way that emphasises that we are speaking about rational conclusions, not merely promoting our group-think. Because it is a topic where most people promote some group-think, whenever this topic is introduced there is a high prior probability that it was introduced for bad reasons.
Thanks for your detailed response!
I feel the opposite—global warming denial is much worse than (mild) theism. I explain more in: http://lesswrong.com/r/discussion/lw/aw6/global_warming_is_a_better_test_of_irrationality/
And yet it leads you to a 99% probability assignment. :-/
Because it is a lot of indirect reasoning. Literally decades of occasional information. Even weak patterns can become visible after enough exposure. Even before finding LW, I had learned that underconfidence is also a sin.
As an analogy: if you throw a coin 10 times and one side comes up 6 times and the other 4 times, it does not mean much. But if you throw the same coin 1000 times and one side comes up 600 times and the other 400 times, the coin is almost surely not fair. After many observations you see something that was not visible after a few.
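The arithmetic behind this intuition is easy to check. A short sketch (using only the 6/10 and 600/1000 counts from the analogy, and assuming a fair coin as the null hypothesis):

```python
from math import comb

def tail_prob(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance a fair coin (p=0.5)
    shows at least k heads in n throws."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 6 or more heads out of 10 is unremarkable (~38% chance for a fair coin):
print(f"P(>=6 of 10):    {tail_prob(10, 6):.3f}")

# 600 or more heads out of 1000 is astronomically unlikely for a fair coin:
print(f"P(>=600 of 1000): {tail_prob(1000, 600):.1e}")
```

The same 60/40 ratio goes from "meaningless" to "near-certain evidence of bias" purely because the sample is 100 times larger.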
And just as I cannot throw the same coin 10 more times to convince you that it is not fair (you would have to either see all 1000 experiments, or strongly trust my rationality), there is nothing I could write in this comment to justify my probability assignment. I can only point to the indirect evidence: one relatively stronger data point would be the rough consensus of LW contributors.
Sure, lots of pieces of weak evidence can add up to strong evidence… provided they’re practically independent of each other. And since this issue gets entangled with Green vs. Blue politics, the correlation between the various pieces of weak evidence might not be that small. (If the coin was always flipped by the same person, who was always allowed to see which side faced up before flipping it, they could well have used a method of flipping that systematically favoured a certain side; E. T. Jaynes’s book describes some such methods.)
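One way to see why the independence caveat bites: in odds form, independent pieces of evidence combine by multiplying their likelihood ratios, so a single 2:1 signal echoed ten times looks like near-certainty if the echoes are wrongly counted as independent. A minimal sketch (the 2:1 ratio and the count of ten are illustrative numbers, not anything from this thread):

```python
def posterior(prior_odds, likelihood_ratios):
    """Combine evidence by multiplying likelihood ratios -- valid only
    when the pieces of evidence are independent given the hypothesis."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)  # convert odds back to a probability

# Ten genuinely independent weak signals, each 2:1 in favour:
print(posterior(1.0, [2.0] * 10))   # ~0.999

# If the ten signals are really one signal repeated ten times,
# only a single factor of 2 is warranted:
print(posterior(1.0, [2.0]))        # ~0.667
```

Correlated evidence sits somewhere between the two extremes, which is exactly why politically entangled sources should be discounted rather than naively summed.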
Or your honesty.
That is, if you say to me “I flipped this coin 1000 times and recorded the results in this Excel spreadsheet, which shows 600 heads and 400 tails,” all I have to believe is that you really did flip the coin 1000 times and record the results. That assumes you’re honest, but sets a pretty low lower bound for your rationality.