One thing I’ve noticed is that in nearly any controversy where the adherents of the heterodox position show signs of basic mental stability, the arguments for heterodoxy are stronger than the arguments for orthodoxy. In the rare cases where this is not true (creationism, for instance), I can take that as a strong indicator in favor of the orthodoxy, at least against the particular heresy in question. But how am I to take the general pattern?

Should I be more skeptical of orthodoxy in general (of the likelihood that truth comes to be orthodoxy, given the standards of public truth evaluation which now prevail), or more trusting of it (given that heterodox positions appear to have stronger arguments regardless of context, and are thus likely stronger for reasons other than their truth)?

My rough conclusion is that I should either look for me-specific biases in this matter, or else regard orthodoxy with greater skepticism in matters I have not yet investigated, and with greater trust in matters I have investigated, than the strength of the arguments alone would lead me to. But I haven’t thought this through fully.
One thing I’ve noticed is that in nearly any controversy where the adherents of the heterodox position show signs of basic mental stability, the arguments for heterodoxy are stronger than the arguments for orthodoxy.
Is this true? A priori I could see this go either way, and my personal experiences don’t add much evidence here (I can’t recall many controversies where I’ve probed deeply enough to conclusively weigh orthodoxy against heterodoxy).
A weaker statement I’m more sure of: the arguments for orthodoxy one hears from most people are weaker than the arguments for heterodoxy, because most people have little reason to actually look up whatever factual basis the orthodoxy might have. (I’ve seen someone make this point somewhere on Yvain’s blog but can’t remember who.) For example, I haven’t bothered to look up the precise scientific arguments that’d justify my belief in plate tectonics, but a shrinking earth theorist probably has, if only to launch a counterattack on them. (Corollary: I’d have a good chance of losing an argument with a shrinking earth theorist, even though plate tectonics is, well, true.)
Of course, this means the supporters of orthodoxy are in the worst position to judge when they should be updating their position based on new evidence.
You’ll want to read an earlier Yvain blog post, then, explaining “many reasons to expect that arguments for socially dominant beliefs (which correlate highly with truth) to be worse than the arguments for fringe beliefs (which probably correlate highly with falsehood)”.
Why would you expect the social dominance of a belief to correlate with truth? Except in the most trivial cases, society has no particular mechanism that selects for true beliefs in preference to false ones.
The Darwinian competition of memes selects strongly for those that provide psychological benefits, or are politically useful, or serve the self-interest of large segments of the population. But truth is only relevant if the opponents of a belief can easily and unambiguously disprove it, which is only possible in rare cases.
Or if the damage caused by acting on a bad model of reality outweighs the signaling benefit of the false belief.
If the arguments for orthodoxy are stronger, then you dismiss contrarians entirely: they are obviously wrong! So do other people, so you don’t get to hear about them to begin with. And so do most potential contrarians themselves.
So by selection effect, we mostly see contrarian arguments which at least appear to be better than the orthodoxy.
I think it’s a version of Berkson’s paradox: if a position is both heterodox and not supported by any strong arguments, it’s very unlikely that people with “basic mental stability” will embrace it in the first place. See also: “The Majority Is Always Wrong” by EY.
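The Berkson’s-paradox reading above can be sketched with a toy simulation. This is purely illustrative (all probabilities and names are made up, not drawn from any data): positions vary independently in whether they are heterodox and whether their arguments are strong, but "stable" people only end up holding a heterodox position when its arguments are strong. Conditioning on that selection step makes heterodoxy and argument strength look correlated among the positions we actually encounter.

```python
import random

random.seed(0)

# Each hypothetical position independently is heterodox or not, and has
# strong or weak arguments. The 0.5 base rates are arbitrary.
positions = [
    {"heterodox": random.random() < 0.5,
     "strong_args": random.random() < 0.5}
    for _ in range(100_000)
]

# Selection step: a position reaches us (via a mentally stable adherent)
# unless it is both heterodox and weakly argued.
observed = [p for p in positions
            if not (p["heterodox"] and not p["strong_args"])]

def frac_strong(ps):
    """Fraction of positions in ps with strong arguments."""
    return sum(p["strong_args"] for p in ps) / len(ps)

hetero = [p for p in observed if p["heterodox"]]
ortho = [p for p in observed if not p["heterodox"]]

print(f"strong arguments | heterodox: {frac_strong(hetero):.2f}")  # 1.00
print(f"strong arguments | orthodox:  {frac_strong(ortho):.2f}")   # ~0.50
```

Even though argument strength was assigned independently of heterodoxy, every heterodox position we observe is strongly argued, while only about half of the orthodox ones are, which matches the pattern the original comment describes.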