Okay, let me clarify: the problem of unproductive argument stems from the reality that people (a) are bad truth-finders, (b) usually don’t care to find the truth, and (c) are prone to reasoning backwards from proposition to justifications, which is understandable [given limited computing power and the difficulty of doing it the other way around].
The tip is awesome when you are right (and I totally agree that it is great to have references and so on). When you are wrong, which is more than half of the problem (as much of the time BOTH sides are wrong), it is extremely obtuse. I’d much rather people dump out something closer to why they actually believe the argument than how they justify it. Yes, that makes for a poor show, but it is more truthful. Why you believe something is [often] not the accurate citation. It is [often] the poor paraphrasing.
Just look at the ‘tips’ for productive arguments. Is there a tip number 1, ‘drop your position ASAP if you are wrong’? Hell frigging no (not that it would work either, though; that’s not how arguing ever works).
edit: to clarify more. Consider climate debates. Those are terrible. Now, you can have a naive honest folk who says he ain’t trusting no climate scientist. You can have a naive honest folk who says he ain’t trusting no oil company. And you can have two pseudo-climate-scientific dudes arguing by obtusely citing studies at each other, not understanding a single thing about climate modelling, generating a lot of noise that looks good, even though neither of them would ever change his view even if he saw all the studies they cite in the exact same light. But they are merely the sophisticated versions of the former folks, ones who hide their actual beliefs. The cranks who make up some form of crank climate theory are not as bad as those two types of climate-arguing folks. The former folks, talking about politics, generate some argument; they won’t agree, because one’s authoritarian and the other liberal, but they at least make that clear. The cranks generate cranky theories. The citing-people generate pure deception as to who they are.
Just look at the ‘tips’ for productive arguments. Is there a tip number 1, ‘drop your position ASAP if you are wrong’? Hell frigging no (not that it would work either, though; that’s not how arguing ever works).
I’ve done my best to make this a habit, and it really isn’t that hard to do, especially over the internet. Once you ‘bite the bullet’ the first time, it seems to get easier to do in the future. I’ve even been able to concede points of contention in real life (when appropriate). Is it automatic? No, you have to keep it in the back of your mind, just like you have to keep in mind the possibility that you’re rationalizing. You also have to act on it, which, for me, does seem to get easier the more I do it.
The tip is awesome when you are right (and I totally agree that it is great to have references and so on). When you are wrong, which is more than half of the problem (as much of the time BOTH sides are wrong), it is extremely obtuse. I’d much rather people dump out something closer to why they actually believe the argument than how they justify it. Yes, that makes for a poor show, but it is more truthful. Why you believe something is [often] not the accurate citation. It is [often] the poor paraphrasing.
This sort of goes with the need to constantly try to recognize when you are rationalizing. If you are looking up a storm of quotes, articles, posts, etc. to back up your point and overwhelm your ‘opponent’, this should set off alarm bells. The problem is that those who spend a lot of time correcting people who are obviously wrong by providing them with large amounts of correct information also seem prone to taking the same approach to a position that merely seems obviously wrong, for reasons they might not be totally conscious of themselves. They then engage in some rapid-fire confirmation bias, throw a bunch of links up, and try to ‘overpower the opponent’. This is something to be aware of. If the position you’re engaging seems wrong but you don’t have a clear-cut, well-evidenced reason why, you should take some time to consider why you want it to be wrong.
When facing someone who is engaging in this behavior (perhaps they are dismissing something you think is sensible, be it strong AI or cryonics, or existential risk, what have you) there are some heuristics you can use. In online debates in particular, I can usually figure out pretty quickly if the other person understands the citations they make by choosing one they seem to place some emphasis on and looking at it carefully, then posing questions about the details.
I’ve found that you can usually press the ‘citingpeople’ into revealing their underlying motivations in a variety of ways. One way is sort of poor—simply guess at their motivations and suggest that as the truth. They will feel the need to defend their motivations and clarify. The major drawback is that this can also shut down the discussion. An alternative is to suggest a good-sounding motivation as the truth—this doesn’t feel like an attack, and they may engage with it. The drawback is that this may encourage them to take up the suggested motivation as their own. At this point, some of their citations will likely not be in line with their adopted position, but pointing this out can cause backtracking and can also shut down the discussion if pressed. Neither approach guarantees us clear insight into the motivations of the other person, but the latter can be a good heuristic (akin to the ‘steel man’ suggestion). Really, I can’t think of a cut-and-dried solution to situations in which people try to build up a wall of citations—each situation I can think of required a different approach depending on the nature of the position and the attitude of the other person.
Anyway, I think that in the context of all of the other suggestions and the basic etiquette at LW, the suggestions are fine, and the situation you’re worried about would typically only obtain if someone were cherry-picking a few of these ideas without making an effort to adjust their way of thinking. Recognizing your motivation for wanting something to be true is an important step in recognizing when you’re defending a position for poor reasons, and this motivation should be presented upfront whenever possible (this also allows the other person to more easily pinpoint your true rejection).
One should either cite the prevailing scientific opinion (e.g. on global warming), or present a novel scientific argument (citing the data you use). Everything else really is nonsense. You can’t usefully second-guess science. Citing studies that support your opinion is cherry-picking, and is bad.
Consider a drug trial: suppose there were 2000 cases where the drug did better than placebo, and 500 cases where it did worse. If each case were a study, the Wikipedia page would likely carry 20 links showing that it did better than placebo (including the meta-study), and 20 showing that it did worse. If it were edited to have 40 links saying it did better, it would end up with 40 links saying it did worse. How silly is a debate where people just cite the cases they pick? Pointlessly silly.
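Here is a minimal Python sketch of that point (the 2000/500 split is the hypothetical example above; everything else is made up for illustration): the pooled data have a clear answer, while the cherry-picked citation counts say nothing at all.

```python
# Hypothetical numbers from the drug-trial example above.
results = [True] * 2000 + [False] * 500   # True: drug beat placebo

# Pooling everything gives the honest picture.
print(f"pooled success rate: {sum(results) / len(results):.0%}")  # 80%

# But each side of the edit war cherry-picks up to 20 supporting links,
# so the visible 'debate' is 20 vs. 20 regardless of the underlying data.
pro = [r for r in results if r][:20]
con = [r for r in results if not r][:20]
print(f"links cited: {len(pro)} pro vs. {len(con)} con")          # 20 vs. 20
```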
On top of that, people (mostly outside LessWrong) really don’t understand how to process scientific studies. If there is a calculation showing that CO2 causes warming, then unless the calculation is incorrect or some very basic physics is incorrect, CO2 does cause warming. There’s no ‘countering’ such a study. The effect won’t go anywhere, whatever you do. The only thing one could do is argue that CO2 somehow also causes cooling, via an entirely new mechanism. E.g. if snow were black rather than white, and the ground white rather than dark, one could argue that warming removes the snow, leading to a decrease in absorption and damping the warming. Alas, snow is white and the ground is dark, so warming causes further warming via this mechanism, and the only thing you can do is come up with some other mechanism that does the opposite. And so on. (You could disprove these by, e.g., finding that snow is really dark and the ground really white, or that CO2 doesn’t really absorb IR, but that’s it.)
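To make the sign argument concrete, here is a toy zero-dimensional energy-balance model in Python. All the numbers (the albedo ramp, the 4 W/m² forcing) are illustrative assumptions, not real climate parameters; the point is only that an albedo which falls as temperature rises amplifies a given forcing rather than cancelling it.

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # solar constant, W m^-2

def albedo(T):
    # Assumed toy albedo: a brighter (snowier) planet when colder;
    # linear ramp from 0.40 at 240 K down to 0.25 at 280 K.
    return max(0.25, min(0.40, 0.40 - 0.15 * (T - 240.0) / 40.0))

def equilibrium_T(forcing=0.0, T=280.0):
    # Fixed-point iteration on  SIGMA * T^4 = S/4 * (1 - albedo(T)) + forcing.
    for _ in range(1000):
        T = ((S / 4.0 * (1.0 - albedo(T)) + forcing) / SIGMA) ** 0.25
    return T

base = equilibrium_T()
# Apply the same added forcing with and without letting the snow respond:
frozen = ((S / 4.0 * (1.0 - albedo(base)) + 4.0) / SIGMA) ** 0.25
melting = equilibrium_T(forcing=4.0)
print(f"warming with albedo held fixed:    {frozen - base:+.2f} K")   # ~ +1.1 K
print(f"warming with snow allowed to melt: {melting - base:+.2f} K")  # ~ +1.8 K
```

Flip the sign of the albedo ramp (black snow, white ground) and the same code shows the feedback damping the warming instead, which is exactly the disproof condition described above.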
People don’t understand the difference between calculating predictions and just free-form hypothesising, which may well be wrong and needs to be tested by experiment, etc., etc.
(I chose global warming because I trust it is not a controversial issue on LW, but I did want something that is generally controversial and not so crazy as to be believed by no one.)
If there is a calculation showing that CO2 causes warming, then unless the calculation is incorrect or some very basic physics is incorrect, CO2 does cause warming.
It might very well be possible that the calculation is correct, and the basic physics is correct, and yet an increase in CO2 emissions does not lead to warming—because there’s some mechanism that simultaneously increases CO2 absorption, or causes cooling (as you said, though in a less counterfactual way), or whatever. It could also be possible that your measurements of CO2 levels were incorrect.
Thus, you could (hypothetically) “counter” the study (in this scenario) by revealing the errors in the measurements, or by demonstrating additional mechanisms that invalidate the end effects.
If there were a mechanism that simultaneously increased CO2 absorption, the levels wouldn’t have been rising. For the measurements, you mean, like a vast conspiracy that over-reports the coal being burnt? Yes, that is possible, of course.
One shouldn’t do a motivated search, though. There are a zillion other mechanisms at work, of course, some increasing and some decreasing the effect. All the immediately obvious ones amplify it (e.g. warming releases CO2 and methane from all kinds of sources where they are dissolved; snow is white and melts earlier in spring, etc.). Of course, if one starts a motivated search, one could remain ignorant of those and collect only the ones that work in the opposite direction, and thereby successfully ‘counter’ the warming. But that’s cherry-picking. If one just looks around and reports what one sees, there is a giant number of amplifying mechanisms and few if any opposing ones; and the opposing ones depend on the temperature, and are thus incapable of entirely negating the warming, because they need the warming to work.
If there were a mechanism that simultaneously increased CO2 absorption, the levels wouldn’t have been rising.
I was thinking of a scenario where you measured CO2 emissions, but forgot to measure absorption (I acknowledge that such a scenario is contrived, but I think you get the idea).
For the measurements, you mean, like a vast conspiracy that over-reports the coal being burnt?
That’s a possibility as well, but I was thinking about more innocuous things like sample contamination, malfunctioning GPS cables, etc.
In all of these cases, your math is correct, and your basic physics is correct, and yet the conclusion is still wrong.
Well, I mentioned one of the measurements—how well CO2 absorbs infrared—as an example. The measurements of the basic-physics inputs are pretty damn well verified, though, and rarely contested. If one is so concerned about such basic stuff, one shouldn’t be using the technology anyway.
It’s the selective trust that is the problem—you trust that the plastic smell in a car won’t kill you in 15 years, but you don’t trust the scientists on warming. Amish global-warming denialists aren’t really a problem; the technophilic civilization that distrusts scientists only when they say something uncomfortable is.
edit:
Anyhow, what you get in the AGW debate is the citing of studies that aren’t refuting each other in the slightest; the anti-warming side just cites low-grade stuff like climate measurements, which can at most prove, e.g., that the sun is dimming, or that we are also doing something that cools the planet (e.g. airplane contrails seed clouds, and the effect is not tiny) but we don’t know what it is. The pro-warming side, though, typically doesn’t even understand that it has uncontested evidence. In the majority of debates both sides are wrong; one side is correct about the fact simply due to luck, not because the fact has, causally, led it to hold the view.
The measurements of the basic-physics inputs are pretty damn well verified, though, and rarely contested.
That’s true, but the model is more complex than “CO2 absorbs infrared, therefore global warming”. It’s closer to something like, “CO2 absorbs infrared, CO2 is produced faster than it is consumed, mitigating factors are insufficient, therefore global warming”; and in reality it’s probably more complex than that. So, it’s not enough to just measure some basic physical properties of CO2; you must also measure its actual concentration in the atmosphere, the rate of change of this concentration, etc.
the technophilic civilization that distrusts scientists only when they say something uncomfortable is [the problem].
Here you and I agree.
both sides are wrong; one side is correct about the fact simply due to luck, not because the fact has, causally, led it to hold the view.
I think you’re being a bit harsh here. Surely not all the scientists are just rolling dice in the dark, so to speak? If the scientific consensus were correct primarily “due to luck”, we probably wouldn’t have gotten as far as we have in our understanding of the world...
Re: the model, it gets extremely complex when you want to answer the question: does it absolutely, positively compel me to change my view about warming?
It doesn’t need to get that complex when you are trying to maximize expected future utility. We are, actually, pretty good at measuring stuff, and fairly outrageous things would need to happen to break that.
I think you’re being a bit harsh here. Surely not all the scientists are just rolling dice in the dark, so to speak?
I’m not speaking of scientists; I’m speaking of people arguing. Not that there’s all that much wrong with it—after all, the folks who deny global warming have to be convinced somehow, and they are immune to the simple reasonable argument from scientific consensus. No, they want to second-guess science, even though they have never studied anything relevant outside the climate-related discussion.
I’m speaking of people arguing. Not that there’s all that much wrong with it—after all, the folks who deny global warming have to be convinced somehow, and they are immune to the simple reasonable argument from scientific consensus. No, they want to second-guess science, even though they have never studied anything relevant outside the climate-related discussion.
I’m a tad confused. Earlier you were against people using information they don’t fully understand that happens to be true, but here you seem to be suggesting that this isn’t so bad and has a useful purpose—convincing people who deny global warming because they don’t trust the science.
Would you be amenable to the position that sometimes it is OK to purposely direct people to adopt your point of view if it has a certain level of clear support, even if those people leave not fully understanding why the position is correct? I.e. is it sometimes good to promote “guessing the teacher’s password” in the interest of minimizing risk/damages?
Well, I said it was irritating to see, especially when it doesn’t work to convince anyone. If it works, well, the utility of e.g. changing attitudes can exceed the disutility of it being annoying. It’s interesting how, if one tries to apply utilitarian reasoning, it is immediately interpreted as ‘inconsistent’. Maybe that’s why we are so bad at it—others’ opinions matter.
There has, however, to be a mechanism for it to work better for correct positions than for incorrect ones. That is absolutely the key.
There has, however, to be a mechanism for it to work better for correct positions than for incorrect ones. That is absolutely the key.
The whole point of studying formal epistemology and debiasing (major topics on this site) is to build the skill of picking out which ideas are more likely to be correct given the evidence. This should always be worked on in the background, and you should only be applying these tips in the context of a sound and consistent epistemology. So really, this problem should fall on the user of these tips—it’s their responsibility to adhere to sound epistemic standards when conveying information.
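For concreteness, the formal core of picking out which ideas are more likely to be correct given the evidence is iterated Bayesian updating; here is a minimal sketch with made-up numbers:

```python
def update(prior, p_e_if_true, p_e_if_false):
    """P(hypothesis | evidence) via Bayes' theorem."""
    joint_true = prior * p_e_if_true
    return joint_true / (joint_true + (1.0 - prior) * p_e_if_false)

p = 0.5                       # start agnostic about some hypothesis
for _ in range(3):            # three independent pieces of evidence, each
    p = update(p, 0.8, 0.3)   # ~2.7x more likely if the hypothesis is true
print(f"posterior: {p:.2f}")  # ~0.95
```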
As far as the issue of changing minds goes—there is sort of a continuum here. For instance, I might have a great deal of strong evidence for something like, say, evolution. Yet there will be people for whom the inferential distance is too great to span in the course of a single discussion—“well, it’s just a theory”, “you can’t prove it”, etc.
Relevant to the climate example: a friend of mine who is doing his doctorate in environmental engineering at Yale was speaking to the relative of a friend who is sort of a ‘naive’ climate-change denier—he has no grasp of how scientific data works, nor does he have any preferred alternative theory he’s invested in. He’s more of the “well, it’s cold out now, so how do you explain that?” sort. My friend tried to explain attractors and long-term prediction methods, but this was ineffective. Eventually he pointed out how unusually warm the winter has been this year, and that made the man think a bit about it. So he exploited the other person’s views to defend his position. However, it didn’t correct the other person’s epistemology at all, and left him with an equally wrong impression of the issue.
The problem with his approach (and really, in his defense, he was just looking to end the conversation) is that should that person learn a bit more about it, he will realize that he was deceived and will remember that the deceiver was a “global warming believer”. In this particular case, that isn’t likely (he almost certainly will not go and study up on climate science), but it illustrates a general danger in presenting a false picture in order to vault inferential distance.
It seems like the key is to first assess the level of inferential distance between you and the other person, and craft your explanation appropriately. The difficult part is doing so without setting the person up to feel cheated once they shorten the inferential distance a bit.
So, the difficulty isn’t just in making it work better for correct positions (which has its own set of suggestions, like studying statistics and (good) philosophy of science), but also being extremely careful when presenting intermediate stories that aren’t quite right. This latter issue disappears if the other person has close to the same background knowledge as you, and you’re right that in such cases it can become fairly easy to argue for something that is wrong, and even easier to argue for something that isn’t as well settled as you think it is (probably the bigger danger of the two), leading you to misrepresent the strength of your claim. I think this latter issue is much ‘stickier’ and particularly relevant to LW, where you see people who appear to be extremely confident in certain core claims yet appear to have a questionable ability to defend them (often opting to link to posts in the sequences, which is fine if you’ve really taken the time to work out the details, but this isn’t always the case).
You mean like the fact that clouds are white and form more when it’s warmer.
Do they, really? Last time I checked, they formed pretty well at −20 C and at +35 C. Ohh, I see a knee-jerk reaction happening—they may form a bit more at +35 C in your place (here they are white, and also form more in winter). Okay, 55 degrees of difference may make a difference; now what?
There comes another common failure mode: animism. Even if you find temperature-dependent effects that go the opposite way, they have to be quite strong (producing a notable difference from a mere 2-degree difference in temperature, at many points of the temperature range) to get you any compensation beyond a small percentage. It’s only biological systems, which tend to implement PID controllers, that counter any deviation from equilibrium, even a little one, in a way not dependent on its magnitude.
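A quick sketch of that distinction, with arbitrary illustrative gains and disturbance: a purely proportional feedback (strength scales with the deviation) only shaves a fraction off a sustained disturbance, while adding an integral term, the I in PID, keeps pushing until the deviation is gone.

```python
def simulate(ki, kp=0.5, disturbance=2.0, dt=0.01, steps=2000):
    deviation, integral = 0.0, 0.0
    for _ in range(steps):
        integral += deviation * dt
        # constant disturbance, opposed by proportional and integral terms,
        # plus passive relaxation back toward equilibrium
        d_dev = disturbance - deviation - kp * deviation - ki * integral
        deviation += d_dev * dt
    return deviation

print(f"proportional only (ki=0): residual = {simulate(ki=0.0):.2f}")  # ~1.33
print(f"with integral term:       residual = {simulate(ki=1.0):.2f}")  # ~0.00
```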
The way I’ve always heard it, mainstream estimates of climate sensitivity are somewhere around 3 degrees (with a fair amount of spread), and the direct effect of CO2 on radiation is responsible for 1 degree of that, with the rest being caused by positive feedbacks. It may be possible to argue that some important positive feedbacks are also basic physics (and that no important negative feedbacks are basic physics), but it sounds to me like that’s not what you’re doing; it sounds to me like, instead, you’re mistakenly claiming that the direct effect by itself, without any feedback effects, is enough to cause warming similar to that claimed by mainstream estimates.
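For what it’s worth, the bookkeeping behind those numbers is the standard feedback-gain formula: if the no-feedback response is dT0 and a fraction f of any warming comes back as further warming, the total is dT0 * (1 + f + f^2 + ...) = dT0 / (1 - f). A quick check with the figures quoted above:

```python
dT0 = 1.0                 # deg C: direct, no-feedback effect (quoted above)
dT_total = 3.0            # deg C: mainstream central sensitivity estimate
f = 1.0 - dT0 / dT_total  # implied net feedback gain
print(f"implied feedback gain f = {f:.2f}")         # 0.67
print(f"check: dT0/(1-f) = {dT0 / (1.0 - f):.1f}")  # 3.0
```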
Nah, I’m speaking of the anthropogenic-global-warming vs. no-anthropogenic-global-warming ‘debate’, not of the 1-degree vs. 3-degrees type of debate. For the most part, the AGW debate is focussed on the effect of CO2, sans the positive feedbacks, as the deniers won’t even accept 1 degree of difference.
Speaking of which, one huge positive feedback is that water vapour is a greenhouse ‘gas’.
Why the quotes? Water vapor’s a gas. There’s also liquid- and solid-phase water in the atmosphere in the form of clouds and haze, but my understanding is that that generally has a cooling effect by way of increasing albedo.
Might be missing some feedbacks there, though; I’m not a climatologist.
Well, that’s why the quotes: because it is changing phase there. The clouds’ effect on climate, btw, is not so simple; the clouds also reflect the infrared somewhat.

I think the debate, and certainly the policy debate, is (in effect) about the catastrophic consequences of CO2.
Hm, I think going higher up the hierarchy of abstraction is generally bad when it comes to disagreements. People so easily get trapped into arguing because someone else is arguing back, and it’s even easier when you’re not being concrete.

I didn’t say abstraction, I said disagreement.
Ah, okay. I still think you have to be careful of degenerating into bad stuff anyhow—if the argument becomes about cherry-picking rather than the evidence, that could be worse than arguing without those sources.
Which one on the list is the appeal to authority, or the quotation of a piece of text one is not oneself qualified to understand? (I only briefly skimmed and didn’t really see it.) (Looks like DH1 is the only one mentioning references to authorities, by way of an accusation of lack of authority.)
DH4, argument. Pointing out what authorities say on the question is contradiction (the authorities contradict your claim) plus evidence (which authorities where).
Cherry-picking, combined with typically putting words into authorities’ mouths. But I agree that if it is an accepted consensus rather than cherry-picked authorities, then it’s pretty effective. (edit: Unfortunately, of course, one probably knows of the consensus long before the argument.)
So in other words, this strategy degenerates into several steps higher up the hierarchy of disagreement than just about every other online argument...