What David_G said. Global warming is a scientific issue. Maybe “what we lack is the will to change things” is the right analysis of the policy problems, but among climate change experts there’s a whole lot more consensus about global warming than there is among AI researchers about the Singularity. “You can’t say controversial things about global warming, but you can say even more controversial things about AI” is a rule that makes about as much sense as “teach the controversy” about evolution.

...and what to do about it is a political issue.

It’s also a political issue, to a much greater extent than the possibility and nature of a technological singularity.
Evolution is also a political issue. Shall we now refrain from talking about evolution, or mentioning what widespread refusal to accept evolution, up to the point of there being a strong movement to undermine the teaching of evolution in US schools, says about human rationality?
I get that it can be especially hard to think rationally about politics. And I agree with what Eliezer has written about government policy being complex and almost always involving some trade-offs, so that we should be careful about thinking there’s an obvious “rationalist view” on policy questions.
However, a ban on discussing issues that happen to be politicized is idiotic, because it puts us at the mercy of contingent facts about what forms of irrationality happen to be prevalent in political discussion at this time. Evolution is a prime example of this. Also, if the singularity became a political issue, would we ban discussion of that from LessWrong?
We should not insert political issues which are not relevant to the topic, because the more political issues one brings to the discussion, the less rational it becomes. It would be safest to discuss all issues separately, but sometimes that is not possible, e.g. when the topic being discussed relies heavily on evolution.
One part of trying to be rational is to accept that people are not rational, and to act accordingly. For every political topic there are a number of people whose minds will turn off if they read something they disagree with. That does not mean we should be quiet on the topic, but we should not insert it where it is not relevant.
Explaining why X is true, in a separate article, is the correct approach. Saying or suggesting something like “by the way, people who don’t think X is true are wrong” in an unrelated topic is the wrong approach. Why is that? In the first case you expect your proof of X to be discussed in the comments, because it is the issue at hand. In the second case, discussions of X in the comments are off-topic. Asserting X in a place where discussion of X is unwelcome is a kind of Dark Arts; we should avoid it even if we think X is true.
The topic of evolution, unlike the topic of climate change, is entangled with human psychology, AI, and many other important topics; not discussing it would be highly costly. Moreover, if anyone on LessWrong disagrees with evolution, it’s probably along lines of Newsomian eccentricity, not along tribal political lines. Also, lukeprog’s comments on the subject made implicit claims about the policy implications of the science, not just about the science itself, which in turn is less clear-cut than the scientific case against a hypothesis requiring a supernatural agent, though for God’s sake please nobody start arguing about exactly how clear-cut.
As a matter of basic netiquette, please use words like “mistaken” or “harmful” instead of “idiotic” to describe views you disagree with.
This post is mostly directed at newbies, who aren’t yet trained in keeping their brains from shutting down whenever the “politics” pattern-matcher goes off.
In other words, it could cause some readers to stop reading before they get to the gist of the post. Even at Hacker News, I sometimes see “I stopped reading at this point” posts.
Also, I see zero benefit from mentioning global warming specifically in this post. Even a slight drawback outweighs zero benefit.
Oh dear… I admit I hadn’t thought of the folks who will literally stop reading when they hit a political opinion they don’t like. Yeah, I’ve encountered them. Though I think they have bigger problems than not knowing how to fix science, and I don’t think mentioning AGW did zero for this post.
(I don’t necessarily disagree with your points; I was simply making a relevant factual claim. Yet you seem to have unhesitatingly interpreted my factual claim as automatically implying all sorts of things about what policies I would or would not endorse. Hm...)
I didn’t interpret it as anything about what gov. policies you’d endorse. I did infer you agreed with Steven’s comment. But anyway, my first comment may not have been clear enough, and I think the second comment should be a useful explication of the first one.
(Actually, I meant to type “Maybe… isn’t the right analysis...” or “Maybe… is the wrong analysis...” That was intended as acknowledgement of the reasons to be cautious about talking policy. But I botched that part. Oops.)
I didn’t interpret it as anything about what gov. policies you’d endorse
By “policies” I meant “norms of discourse on Less Wrong”. I don’t have any strong opinions about them; I don’t unhesitatingly agree with Steven’s opinion. Anyway I’m glad this thread didn’t end up in needless animosity; I’m worried that discussing discussing global warming, or more generally discussing what should be discussed, might be more heated than discussing global warming itself.

Yeah. I thought of making another thread for this issue.
As for the difference with the singularity, views on that are not divided much along tribal political lines (ETA: as you acknowledge), and LessWrong seems much better placed to have positive influence there because the topic has received much less attention, because of LessWrong’s strong connection (sociological if nothing else) with the Singularity Institute, and because it’s a lot more likely to amount to an existential risk in the eyes of most people here of any political persuasion, though again let’s not discuss whether they’re right.
The point of “Politics is the Mind-Killer” is that one shouldn’t use politically charged examples when they’re not on-topic. This is exactly that case. The article is not about global warming, so it should not mention global warming, because that topic makes some people go insane.
This does not mean that there cannot be a post about global warming (to the extent that it’s on-topic for the site).

Also, “will” may be the wrong concept.

How about “not enough people with the power to change things see sufficient reasons to do so”?
Please don’t insert gratuitous politics into LessWrong posts.
Basic science isn’t political here. Things like “Humans cause global warming; There is no God; Humans evolved from apes” are politicized in some places, but here they are just premises. There is no need to drag in political baggage by making “This Is Politics!” declarations in cases like this.
Do you see how the claim that “humans cause global warming” differs from the claim quoted in the grandparent comment?
It takes the form of a premise in a sentence that goes on to use it as an illustration of another known problem that isn’t realistically going to be solved any time soon. I don’t think I accept your implication here; in my judgement you introduced politics, not the opening post, and so it is your comment that I would wish to see discouraged.
That’s hardly gratuitous. Don’t fall prey to the “‘politics is the mind killer’ is the mind killer” effect. Not all mentions of Hitler are Godwinations.
I would have been less bothered if the comparison gave insight into the structure of the problem beyond just the claim that solutions exist and people don’t have the will to implement them, and/or if it had been presented as opinion rather than fact.
Why? By any reasonable definition, it is a fact. We shouldn’t step away from essentially proven facts just because some people make political controversies out of them. In fact these are the examples we should be bringing up more, if we want to be rational when it’s harder, not just rational when it’s easy.

…and we can read LW articles while standing on our heads to make it even harder!

In general, it does not seem like a good idea to make your ideas artificially hard to understand.
In fact these are the examples we should be bringing up more, if we want to be rational when it’s harder, not just rational when it’s easy.
What exactly does “being rational” mean in this context? Rationality is a way to come to the right conclusions from the available data. If you show the data and how you reached the conclusion, you have demonstrated rationality (assuming there is no lower-level problem, for example that you previously filtered the data). If you only show the conclusion—well, even if it happens to be the right conclusion, you haven’t demonstrated that you reached it rationally.
The mere fact that someone states some conclusion is not proof of rationality. It may be a wrong conclusion; but even if it is the right conclusion, it is very weak evidence of the author’s rationality, because they might just as well be professing their group’s beliefs. And given how people usually are, when someone states a conclusion without showing how they reached it, I would put a high prior probability on them professing group beliefs.
There is no utility in “trying harder” per se; only the results matter. If we want to raise the general sanity waterline, we should do the things that increase the chance of success, not the harder things. What exactly are we trying to do? If we are trying to signal to people with the same opinions, we could write on the LW homepage in big letters: “global warming is true, parallel universes exist, science is broken, and if you don’t believe this, you are not rational enough”—but what exactly would that achieve? I don’t think it would attract people who want to study rationality.
Choose your battles wisely. Talk about global warming when global warming is the topic of discussion. The same goes for parallel universes, etc. Imagine going to a global warming conference and talking about parallel universes—does that fit under the label “being rational when it’s harder”?
My phrasing “when it’s harder, not just rational when it’s easy” was poor. Let me make my points another way.
First of all, do you believe that “But as with the problem of global warming and its known solutions, what we lack is the will to change things” is incorrect? Because I’ve seen very few people objecting, just people arguing that “other people” may find the phrasing disturbing, or objectionable, or whatever. If you object, say so. The audience is the Less Wrong crowd; if they reject the rest of the post over that one sentence, then what exactly are they doing at this website?
Parallel universes require a long meta-explanation before people can even grasp your point, and, more damningly, they are rejected by experts in the field. Yes, the experts are most likely wrong, but it takes a lot of effort to see that. If someone says “I don’t believe in Everett branches and parallel universes”, I don’t conclude they are being irrational, just that they haven’t been exposed to all the arguments in excruciating detail, or are following a—generally correct—“defer to the scientific consensus” heuristic.
But if someone rejects the global warming consensus, then they are being irrational, and this should be proclaimed, again and again. No self-censorship because some people find it “controversial”.
First of all, do you believe that “But as with the problem of global warming and its known solutions, what we lack is the will to change things” is incorrect?
I am not very good at estimating probabilities, but I would guess: 99% that global warming is happening; 95% that the human contribution is very significant; 95% that in a rational world we could reduce the human contribution, though not necessarily to zero.
Parallel universes require a long meta-explanation before people can even grasp your point, and, more damningly, they are rejected by experts in the field.
Climate change also requires some investigation. As an example, I have never studied anything remotely similar to climatology, and I have no idea who the experts in the field are. (I could find out, but I have limited time and different priorities.) People give me all kinds of data, much of it falsified, and I don’t have the background knowledge to tell the difference. So basically, in my situation, all I have is hearsay, and it’s just my decision whom to trust. (Unless I want to ignore my other priorities and invest a lot of time in this topic, which has no practical relevance to my everyday life.)
Despite all this, over the years I have done some intuitive version of probabilistic reasoning; I have unconsciously noticed that some things correlate with other things (for example: people who are wrong when discussing one topic are somewhat more likely to be wrong when discussing another topic; some styles of discussion are somewhat more likely to be used by people who are wrong; etc.), so gradually my model of the world started strongly suggesting that “global warming is happening” is a true statement. Yet it is all very indirect reasoning on my part—so I can understand how a person just as ignorant about this topic as me could, with some probability, come to a different conclusion.
But if someone rejects the global warming consensus, then they are being irrational, and this should be proclaimed, again and again.
No one is perfectly rational, right? People make all kinds of transgressions against rationality, and “rejecting the global warming consensus” seems to me like a minor one, compared with the alternatives. Such a person could still be in the top 1 percentile of rationality, mostly because humans generally are not very rational.
Anyway, the choice (at least as I see it) is not between “speak about global warming” and “not speak about global warming”, but between “speak about global warming in a separate article, with arguments and references” and “drop mentions of it in unrelated places, as applause lights”. Some people consider the latter approach bad even when it is about theism, which in my opinion is a transgression against rationality a hundred times larger.
Writing about global warming is a good thing to do, and it belongs on LW, and avoiding it would be bad. It just should be done in a way that emphasises that we are presenting rational conclusions, not merely promoting our group-think. Because it is a topic where most people promote some group-think, when this topic is introduced there is a high prior probability that it was introduced for bad reasons.

Thanks for your detailed response!

I feel the opposite—global warming denial is much worse than (mild) theism. I explain more in: http://lesswrong.com/r/discussion/lw/aw6/global_warming_is_a_better_test_of_irrationality/

And yet it leads you to a 99% probability assignment. :-/
Because it is a lot of indirect reasoning. Literally decades of occasional information. Even weak patterns can become visible after enough exposure. I learned, even before finding LW, that underconfidence is also a sin.
As an analogy: if you throw a coin 10 times, and one side comes up 6 times and the other side 4 times, it does not mean much. But if you throw the same coin 1000 times, and one side comes up 600 times and the other side 400 times, the coin is almost surely not fair. After many observations you see something that was not visible after a few observations.
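To put rough numbers on the analogy, here is a minimal sketch, assuming a fair coin as the null hypothesis; the scipy call and the exact figures are an illustrative addition, not part of the original comment:

    # Tail probabilities for the coin analogy, assuming (as the null
    # hypothesis) that the coin is fair. Illustrative sketch only.
    from scipy.stats import binom

    # P(at least 6 heads in 10 flips): binom.sf(k, n, p) is P(X > k),
    # so sf(5, ...) gives P(X >= 6).
    p_10 = binom.sf(5, 10, 0.5)        # ~0.38 -- happens all the time

    # P(at least 600 heads in 1000 flips).
    p_1000 = binom.sf(599, 1000, 0.5)  # ~1e-10 -- essentially never, if fair

    print(f"P(>=6 of 10 | fair coin):     {p_10:.3f}")
    print(f"P(>=600 of 1000 | fair coin): {p_1000:.2e}")

The same 60/40 split that is unremarkable over 10 flips is overwhelming over 1000 flips, which is the point of the analogy.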
And just like I cannot throw the same coin 10 more times to convince you that it is not fair (you would have to either see all 1000 experiments, or strongly trust my rationality), there is nothing I could write in this comment to justify my probability assignment. I can only point to the indirect evidence: one relatively stronger data point would be the relative consensus of LW contributors.
Sure, lots of pieces of weak evidence can add up to strong evidence… provided they’re practically independent of each other. And since this issue gets entangled with Green vs. Blue politics, the correlation between the various pieces of weak evidence might not be that small. (If the coin was always flipped by the same person, who always looked at which side was facing up before flipping it, they could well have used a method of flipping which systematically favoured a certain side—E. T. Jaynes’s book describes some such methods.)
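To make the independence point concrete, here is a toy calculation; the 2:1 likelihood ratios and the ten sources are invented numbers for illustration, not anything from the thread:

    # Toy model of what ten pieces of weak evidence are worth, depending
    # on whether they are independent. All numbers here are invented.

    lr_per_source = 2.0   # each source favours the hypothesis 2:1
    n_sources = 10

    # Conditionally independent sources: likelihood ratios multiply.
    independent = lr_per_source ** n_sources   # 2**10 = 1024:1, strong

    # Ten echoes of a single original report: count the evidence once.
    fully_correlated = lr_per_source ** 1      # still only 2:1, weak

    print(f"independent sources:      {independent:.0f}:1")
    print(f"fully correlated sources: {fully_correlated:.0f}:1")

Real clusters of politically sorted sources fall somewhere between these two extremes, which is exactly the worry being raised here.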
Or your honesty.

That is, if you say to me “I flipped this coin 1000 times and recorded the results in this Excel spreadsheet, which shows 600 heads and 400 tails,” all I have to believe is that you really did flip the coin 1000 times and record the results. That assumes you’re honest, but sets a pretty low lower bound for your rationality.
Please don’t insert gratuitous politics into LessWrong posts.
I removed the global warming phrase.
Thanks!