Well, I said it was irritating to see, especially if it doesn’t work to convince anyone. If it does work, then the utility of, e.g., changing attitudes can exceed the disutility of it being annoying. It’s interesting how, when one tries to apply utilitarian reasoning, it is immediately interpreted as ‘inconsistent’. Maybe that’s why we are so bad at it: others’ opinions matter.
There does, however, have to be a mechanism for it to work better for correct positions than for incorrect ones. That is absolutely the key.
The whole point of studying formal epistemology and debiasing (major topics on this site) is to build the skill of picking out which ideas are more likely to be correct given the evidence. This should always be worked on in the background, and you should apply these tips only in the context of a sound and consistent epistemology. So really, this problem should fall on the user of these tips: it’s their responsibility to adhere to sound epistemic standards when conveying information.
As for the issue of changing minds, there is sort of a continuum here. For instance, I might have a great deal of strong evidence for something like, say, evolution, yet there will be people for whom the inferential distance is too great to span in the course of a single discussion (“well, it’s just a theory,” “you can’t prove it,” etc.).
Relevant to the climate example: a friend of mine who is doing his doctorate in environmental engineering at Yale was speaking to the relative of a friend who is a sort of ‘naive’ climate change denier. He has no grasp of how scientific data works, nor does he have any preferred alternative theory he’s invested in; he’s more the “well, it’s cold out now, so how do you explain that?” sort. My friend tried to explain attractors and long-term prediction methods, but this was ineffective. Eventually he pointed out how unusually warm the winter has been this year, and that made the relative think a bit about it. So he exploited the other person’s views to defend his position. However, it didn’t correct the other person’s epistemology at all, and it left him with an equally wrong impression of the issue.
The problem with his approach (and really, in his defense, he was just looking to end the conversation) is that, should that person learn a bit more about the subject, he will realize he was deceived and will remember that the deceiver was a “global warming believer”. In this particular case, that isn’t likely (he almost certainly will not go and study up on climate science), but it illustrates a general danger of presenting a false picture in order to vault inferential distance.
It seems like the key is to first assess the level of inferential distance between you and the other person, and craft your explanation appropriately. The difficult part is doing so without setting the person up to feel cheated once they shorten the inferential distance a bit.
So, the difficulty isn’t just in making it work better for correct positions (which has its own set of suggestions, like studying statistics and (good) philosophy of science), but also in being extremely careful when presenting intermediate stories that aren’t quite right. That issue disappears if the other person has close to the same background knowledge as you, and you’re right that in such cases it can become fairly easy to argue for something that is wrong, and even easier to argue for something that isn’t as well settled as you think it is (probably the bigger danger of the two), leading you to misrepresent the strength of your claim. I think this second danger is much ‘stickier’ and particularly relevant to LW, where you see people who appear to be extremely confident in certain core claims yet seem to have a questionable ability to defend them (often opting to link to posts in the Sequences, which is fine if you’ve really taken the time to work out the details, but this isn’t always the case).