Why? By any reasonable definition, it is a fact. We shouldn’t step away from essentially proven facts just because some people make political controversies out of them. In fact, these are the examples we should be bringing up more, if we want to be rational when it’s harder, not just rational when it’s easy.
and we can read LW articles while standing on our heads to make it even harder!
In general, it does not seem like a good idea to make your ideas artificially hard to understand.
In fact, these are the examples we should be bringing up more, if we want to be rational when it’s harder, not just rational when it’s easy.
What exactly does “being rational” mean in this context? Rationality is a way to come to the right conclusions from the available data. If you show the data and how you reached the conclusion, you have demonstrated rationality (assuming there is no lower-level problem, for example that you previously filtered the data). If you only show the conclusion—well, even if it happens to be the right conclusion, you haven’t demonstrated that you reached it rationally.
The mere fact that someone states some conclusion is not a proof of rationality. It may be a wrong conclusion, but even if it is the right conclusion, it is only very weak evidence of the author’s rationality, because they might as well just be professing their group’s beliefs. And people being what they usually are, when someone states a conclusion without showing how they reached it, I would put a high prior probability on them professing group beliefs.
There is no utility in “trying harder” per se; only the results matter. If we want to raise the general sanity waterline, we should do the things that increase the chance of success, not the harder things. What exactly are we trying to do? If we are trying to signal to people with the same opinions, we could write on the LW homepage in big letters: “global warming is true, parallel universes exist, science is broken, and if you don’t believe this, you are not rational enough”—but what exactly would that achieve? I don’t think it would attract people who want to study rationality.
Choose your battles wisely. Talk about global warming when global warming is the topic of discussion. The same goes for parallel universes, etc. Imagine going to a global warming conference and talking about parallel universes—does this fit under the label “being rational when it’s harder”?
My phrasing “when it’s harder, not just rational when it’s easy” was poor. Let me make my points another way.
First of all, do you believe that “But as with the problem of global warming and its known solutions, what we lack is the will to change things” is incorrect? Because I’ve seen very few people objecting, just people arguing that “other people” may find the phrasing disturbing, or objectionable, or whatever. If you object, say so. The audience is the Less Wrong crowd; if they reject the rest of the post over that one sentence, then what exactly are they doing at this website?
Parallel universes require a long meta explanation before people can even grasp your point, and, more damningly, they are rejected by experts in the field. Yes, the experts are most likely wrong, but it takes a lot of effort to see that. If someone says “I don’t believe in Everett branches and parallel universes”, I don’t conclude they are being irrational, just that they haven’t been exposed to all the arguments in excruciating detail, or are following a—generally correct—“defer to the scientific consensus” heuristic.
But if someone rejects the global warming consensus, then they are being irrational, and this should be proclaimed, again and again. No self-censorship because some people find it “controversial”.
First of all, do you believe that “But as with the problem of global warming and its known solutions, what we lack is the will to change things” is incorrect?
I am not very good at estimating probabilities, but I would guess: 99% that global warming is real; 95% that the human contribution is very significant; 95% that in a rational world we could reduce the human contribution, though not necessarily to zero.
Parallel universes require a long meta explanation before people can even grasp your point, and, more damningly, they are rejected by experts in the field.
Climate change also requires some investigation. As an example, I have never studied anything remotely similar to climatology, and I have no idea who the experts in the field are. (I could find out, but I have limited time and different priorities.) People are giving me all kinds of data, much of it falsified, and I don’t have the background knowledge to tell the difference. So basically, in my situation, all I have is hearsay, and it’s just my decision whom to trust. (Unless I want to ignore my other priorities and invest a lot of time in this topic, which has no practical relevance to my everyday life.)
Despite all this, over the years I have done some intuitive version of probabilistic reasoning; I have unconsciously noticed that some things correlate with other things (for example: people who are wrong when discussing one topic have a somewhat higher probability of being wrong when discussing another topic, some styles of discussion are somewhat more likely to be used by people who are wrong, etc.), so gradually my model of the world started strongly suggesting that “there is global warming” is a true statement. Yet it is all very indirect reasoning on my part—so I can understand how a person just as ignorant about this topic as me could, with some probability, come to a different conclusion.
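For concreteness, here is a minimal sketch of the kind of cumulative updating described above, with entirely made-up numbers (the likelihood ratio and the observation count are illustrative assumptions, not measurements):

```python
# Many weak observations, each only slightly more likely if the
# hypothesis is true, compound into strong evidence over time.
prior_odds = 1.0    # start undecided: 1:1 odds
weak_lr = 1.05      # assumed: each observation favors the hypothesis by 5%
observations = 200  # assumed: decades of occasional information

posterior_odds = prior_odds * weak_lr ** observations
posterior_prob = posterior_odds / (1 + posterior_odds)
print(posterior_odds)  # ~17000:1
print(posterior_prob)  # ~0.99994
```

Of course, this multiplication is only valid if the observations are roughly independent, which is exactly the caveat raised further down the thread.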
But if someone rejects the global warming consensus, then they are being irrational, and this should be proclaimed, again and again.
No one is perfectly rational, right? People commit all kinds of transgressions against rationality, and “rejecting the global warming consensus” seems to me like a minor one, compared with the alternatives. Such a person could still be in the top percentile of rationality, mostly because humans generally are not very rational.
Anyway, the choice (at least as I see it) is not between “speak about global warming” and “not speak about global warming”, but between “speak about global warming in a separate article, with arguments and references” and “drop mentions of it in unrelated places, as applause lights”. Some people consider the latter approach bad even when it is about theism, which in my opinion is a hundred times greater transgression against rationality.
Writing about global warming is a good thing to do, and it belongs on LW, and avoiding it would be bad. It just should be done in a way that emphasises that we are presenting rational conclusions, not merely promoting our groupthink. Because it is a topic where most people promote some groupthink, there is a high prior probability, whenever the topic is introduced, that it was introduced for bad reasons.
Thanks for your detailed response!
I feel the opposite—global warming denial is much worse than (mild) theism. I explain more in: http://lesswrong.com/r/discussion/lw/aw6/global_warming_is_a_better_test_of_irrationality/
And yet it leads you to a 99% probability assignment. :-/
Because it is a lot of indirect reasoning. Literally decades of occasional information. Even weak patterns can become visible after enough exposure. And even before finding LW, I had learned that underconfidence is also a sin.
As an analogy: if you flip a coin 10 times, and one side comes up 6 times and the other side 4 times, it does not mean much. But if you flip the same coin 1000 times, and one side comes up 600 times and the other side 400 times, the coin is almost surely not fair. After many observations you can see something that was not visible after a few observations.
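A quick computation makes the asymmetry explicit. The sketch below (plain Python, assuming nothing beyond the counts stated above) computes the probability that a fair coin would deviate at least this far from a 50/50 split:

```python
from math import comb

def two_sided_p(heads: int, n: int) -> float:
    """Probability that a fair coin lands at least this far from n/2."""
    dev = abs(heads - n / 2)
    return sum(comb(n, k) for k in range(n + 1)
               if abs(k - n / 2) >= dev) / 2 ** n

print(two_sided_p(6, 10))      # ~0.75  -- a 6/4 split is entirely unremarkable
print(two_sided_p(600, 1000))  # ~3e-10 -- a 600/400 split is essentially impossible
```

The same 60/40 ratio goes from meaningless to damning purely as a function of sample size.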
And just as I cannot flip the same coin 10 more times to convince you that it is not fair (you would have to either see all 1000 experiments, or strongly trust my rationality), there is nothing I can write in this comment to justify my probability assignment. I can only point to the indirect evidence: one relatively stronger data point would be the rough consensus of LW contributors.
Sure, lots of pieces of weak evidence can add up to strong evidence… provided they’re practically independent of each other. And since this issue gets entangled with Green vs. Blue politics, the correlation between the various pieces of weak evidence might not be that small. (If the coin was always flipped by the same person, who was always allowed to see which side faced up before flipping it, they could well have used a method of flipping that systematically favoured a certain side—E.T. Jaynes’s book describes some such methods.)
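To illustrate the point with deliberately toy numbers (the likelihood ratio and the signal count are assumptions, not data), treating correlated signals as independent can inflate the evidence enormously:

```python
# Twenty weak signals, each with an assumed likelihood ratio of 1.5.
n, lr = 20, 1.5

# Naively multiplied as if independent, they look overwhelming:
print(lr ** n)  # ~3325:1

# But if all twenty are echoes of one shared source -- everyone
# repeating the same Green or Blue talking point -- they carry
# roughly the evidential weight of a single signal:
print(lr)       # 1.5:1
```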
Or your honesty.
That is, if you say to me “I flipped this coin 1000 times and recorded the results in this Excel spreadsheet, which shows 600 heads and 400 tails,” all I have to believe is that you really did flip the coin 1000 times and record the results. That assumes you’re honest, but sets a pretty low lower bound for your rationality.