Well, I mentioned one of the measurements, how well CO2 absorbs infrared, as an example. The measurements of the basic physical inputs are pretty damn well verified, though, and rarely contested. If one is so concerned about such basic stuff, one shouldn’t be using the technology anyway.
It’s the selective trust that is the problem: you trust that the plastic smell in a car won’t kill you in 15 years, but you don’t trust the scientists on warming. Amish global warming denialists aren’t really a problem; the technophilic civilization that distrusts scientists only when they say something uncomfortable is.
edit:
Anyhow, what you get in the AGW debate is citing of studies that aren’t refuting each other in the slightest; the anti-warming side just cites some low-grade stuff like climate measurements, which can at most prove, e.g., that the sun is dimming, or that we are also doing something that cools the planet (e.g. airplane contrails seed clouds, and the effect is not tiny) but we don’t know what it is. The pro-warming side, though, typically doesn’t even understand that it has uncontested evidence. In the majority of debates both sides are wrong; one is correct about the fact simply due to luck, not because the fact has, causally, led it to hold the view.
The measurements of the basic physical inputs are pretty damn well verified, though, and rarely contested.
That’s true, but the model is more complex than “CO2 absorbs infrared, therefore global warming”. It’s closer to something like, “CO2 absorbs infrared, CO2 is produced faster than it is consumed, mitigating factors are insufficient, therefore global warming”; and in reality it’s probably more complex than that. So, it’s not enough to just measure some basic physical properties of CO2; you must also measure its actual concentration in the atmosphere, the rate of change of this concentration, etc.
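To make that concrete, here is a minimal sketch of just one link in that longer chain, the step from a measured concentration to extra trapped energy. It uses the commonly cited logarithmic approximation for CO2 forcing; the 5.35 W/m² coefficient and the ppm figures below are the usual ballpark values, quoted for illustration rather than measured, so treat the numbers as approximate:

```python
import math

def co2_forcing_w_per_m2(c_ppm, c_ref_ppm):
    """Extra radiative forcing from raising CO2 from c_ref_ppm to c_ppm.

    Uses the standard simplified fit dF = 5.35 * ln(C / C0) W/m^2.
    The coefficient is an empirical fit to radiative-transfer calculations,
    not something you get from "CO2 absorbs infrared" alone.
    """
    return 5.35 * math.log(c_ppm / c_ref_ppm)

# Ballpark figures: pre-industrial ~280 ppm, recent ~395 ppm.
print(co2_forcing_w_per_m2(395, 280))      # roughly 1.8 W/m^2
print(co2_forcing_w_per_m2(2 * 280, 280))  # doubling gives roughly 3.7 W/m^2
```

Even this toy version needs a measured concentration as input, which is the point: the absorption physics alone doesn’t tell you how much warming to expect, let alone whether sinks and other mitigating factors keep up.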
the technophilic civilization that distrusts scientists only when they say something uncomfortable is [the problem].
Here you and I agree.
both sides are wrong; one is correct about the fact simply due to luck, not because the fact has, causally, led it to hold the view.
I think you’re being a bit harsh here. Surely not all the scientists are just rolling dice in the dark, so to speak? If the scientific consensus were correct primarily “due to luck”, we probably wouldn’t have gotten as far as we have in our understanding of the world...
Re: the model, it gets extremely complex when you want to answer the question: does it absolutely, positively compel me to change my view about warming?
It doesn’t need to get that complex when you are trying to maximize expected future utility. We are, actually, pretty good at measuring stuff, and fairly outrageous things would need to happen to break that.
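To illustrate what I mean by not needing absolute compulsion, here is a toy expected-utility comparison; every number in it is a made-up placeholder meant only to show the shape of the reasoning, not an estimate of anything:

```python
# Toy decision sketch: acting can win well short of certainty.
# All payoffs and the probability are illustrative placeholders only.
p_warming_real = 0.8      # credence that the mainstream projection is roughly right
cost_of_acting = 1.0      # paid whether or not the warming materializes
damage_if_ignored = 10.0  # incurred only if warming is real and we do nothing

eu_act = -cost_of_acting
eu_ignore = -p_warming_real * damage_if_ignored

print(eu_act, eu_ignore)  # -1.0 vs -8.0; acting wins for any credence above 0.1
```

The point is only that the decision flips at a fairly modest credence, so the model doesn’t have to absolutely, positively compel anyone before it matters for choices.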
I think you’re being a bit harsh here. Surely not all the scientists are just rolling dice in the dark, so to speak?
I’m not speaking of scientists. I’m speaking of people arguing. Not that there’s all that much wrong with it: after all, the folks who deny global warming have to be convinced somehow, and they are immune to the simple, reasonable argument of deferring to the scientific consensus. No, they want to second-guess science, even though they have never studied anything relevant outside of climate-related discussions.
I’m speaking of people arguing. Not that there’s all that much wrong with it: after all, the folks who deny global warming have to be convinced somehow, and they are immune to the simple, reasonable argument of deferring to the scientific consensus. No, they want to second-guess science, even though they have never studied anything relevant outside of climate-related discussions.
I’m a tad confused. Earlier you were against people arguing from information that they don’t fully understand but which happens to be true, yet here you seem to be suggesting that this isn’t so bad and has a useful purpose: convincing people who deny global warming because they don’t trust the science.
Would you be amenable to the position that sometimes it is OK to purposely direct people to adopt your point of view if it has a certain level of clear support, even if those people leave not fully understanding why the position is correct? I.e. is it sometimes good to promote “guessing the teacher’s password” in the interest of minimizing risk/damages?
Well, I said it was irritating to see, especially if it doesn’t work to convince anyone. If it works, well, the utility of e.g. changing attitudes can exceed the disutility of it being annoying. It’s interesting how, if one tries to apply utilitarian reasoning, it is immediately interpreted as ‘inconsistent’. Maybe that’s why we are so bad at it; others’ opinions matter.
There does, however, have to be a mechanism that makes this work better for correct positions than for incorrect ones. That is absolutely the key.
There does, however, have to be a mechanism that makes this work better for correct positions than for incorrect ones. That is absolutely the key.
The whole point of studying formal epistemology and debiasing (major topics on this site) is to build the skill of picking out which ideas are more likely to be correct given the evidence. This should always be worked on in the background, and you should only be applying these tips in the context of a sound and consistent epistemology. So really, this problem should fall on the user of these tips—it’s their responsibility to adhere to sound epistemic standards when conveying information.
As far as the issue of changing minds goes, there is sort of a continuum here. For instance, I might have a great deal of strong evidence for something like, say, evolution, yet there will be people for whom the inferential distance is too great to span in the course of a single discussion (“well, it’s just a theory”, “you can’t prove it”, etc.).
Relevant to the climate example: a friend of mine who is doing his doctorate in environmental engineering at Yale was speaking to the relative of a friend who is sort of a ‘naive’ climate change denier; he has no grasp of how scientific data works, nor does he have any preferred alternative theory he’s invested in. He’s more the “well, it’s cold out now, so how do you explain that?” sort. My friend tried to explain attractors and long-term prediction methods, but this was ineffective. Eventually he pointed out how unusually warm the winter has been this year, and that made the man think a bit about it. So he exploited the other person’s views to defend his position. However, it didn’t correct the other person’s epistemology at all, and it left him with an equally wrong impression of the issue.
The problem with his approach (and really, in his defense, he was just looking to end the conversation) is that, should that person learn a bit more about the subject, he will realize that he was deceived and will remember that the deceiver was a “global warming believer”. In this particular case that isn’t likely (he almost certainly will not go and study up on climate science), but it illustrates a general danger of presenting a false picture in order to vault inferential distance.
It seems like the key is to first assess the level of inferential distance between you and the other person, and craft your explanation appropriately. The difficult part is doing so without setting the person up to feel cheated once they shorten the inferential distance a bit.
So, the difficulty isn’t just in making the approach work better for correct positions (which has its own set of suggestions, like studying statistics and (good) philosophy of science), but also in being extremely careful when presenting intermediate stories that aren’t quite right. This latter issue disappears if the other person has close to the same background knowledge as you, and you’re right that in such cases it can become fairly easy to argue for something that is wrong, and even easier to argue for something that isn’t as well settled as you think it is (probably the bigger danger of the two), leading you to misrepresent the strength of your claim. I think this latter danger (overstating how settled a claim is) is much ‘stickier’ and particularly relevant to LW, where you see people who appear to be extremely confident in certain core claims yet seem to have a questionable ability to defend them (often opting to link to posts in the sequences, which is fine if you’ve really taken the time to work out the details, but this isn’t always the case).