Trouble is, blow-ups are in fact the less bad failure mode in discussions of this sort. A much less bad one.

If it is indeed the case that, as you suggest, spelling out the truth on these topics requires breaking strong taboos, then there’s a third failure mode, where LessWrongers actually succeed at spelling out the taboo truth, and this causes the site to be pegged as a hate site and lose influence on the cold-button topics that actually matter.
If it’s a choice between 1) don’t talk about these issues and risk forgoing some minor novel insights on a topic that affects most people’s life decisions only very indirectly, 2) talk about these issues in an inoffensive way and risk creating a false consensus of the kind you describe, 3) talk about these issues in an offensive way and risk becoming a hate site (as well as presumably having more blowups), I really would much rather choose 1.
If you’re mistaken and we can be both non-taboo and accurate, then wanting to have the discussion becomes more reasonable. But many people don’t seem to think you’re mistaken, and I don’t understand why these people aren’t helping me root for option 1.
If it’s a choice between 1) don’t talk about these issues and risk forgoing some minor novel insights on a topic that affects most people’s life decisions only very indirectly, 2) talk about these issues in an inoffensive way and risk creating a false consensus of the kind you describe, 3) talk about these issues in an offensive way and risk becoming a hate site (as well as presumably having more blowups), I really would much rather choose 1.
I remember we once had a disagreement about this, but in the meantime I have moved closer to your view.
Basically, the problem is that the idea of a general forum that attempts to apply no-holds-barred rational thinking to all sorts of sundry topics is unworkable. It will either lead to people questioning all kinds of high-status ideological beliefs and purveyors of official truth, thus giving the forum a wacky extremist reputation (and inevitably generating a lot of ugly quarrels in the process) -- or it will converge towards ersatz “rationality” that incorporates all the biases inherent to the contemporary respectable high-status beliefs and institutions as an integral part. What is needed to salvage the situation is a clear statement of what constitutes on-topic discussion, and ruthlessly principled policing of off-topic content no matter what positions it advocates.
But many people don’t seem to think you’re mistaken, and I don’t understand why these people aren’t helping me root for option 1.
Basically, it’s the ersatz rationality failure mode. People simply assume that the principal contemporary high-status beliefs and institutions are, if somewhat imperfect, still based on rational thinking to a sufficient degree that a rational discussion free of delusion and malice simply cannot result in any really terrible conclusions. So I do think most people think I’m mistaken. (Even if they see some validity in my concerns, they presumably believe that I’m exaggerating either the ugliness of reality or the ideological closed-mindedness and intolerance of the respectable opinion.)
I disagree, however, with your characterization of option (1) as “forgoing some minor novel insights on a topic that affects most people’s life decisions only very indirectly.” There is plenty of low-hanging fruit in terms of insight from applying unbiased thinking to issues where the respectable opinion is severely delusional. Also, any topic that is truly important for people’s life decisions, and where accurate knowledge is of high practical value, is highly likely to involve at least some issues where respectable platitudes and effective advice will be very remote from each other, and no-nonsense talk will be against the social norms.
I object to your use of “questioning” here, because it has become ambiguous. I suppose you mean “espousing low-status opinions as the result of questioning”.
ruthlessly principled policing of off-topic content
Notice how and why nothing like this has been necessary for traditional politics. People post political manifestos and are often told both that the content is inappropriate because of its subject and that they have made specific severe errors of thought. I don’t remember a case in which the political poster kept pushing and ultimately received only the first response, because that response isn’t really true on its own; it’s just that if content is political, the outside view is that it is flawed.
the idea of a general forum that attempts to apply no-holds-barred rational thinking to all sorts of sundry topics is unworkable.
The point of the forum is to develop thinking techniques that are useful because they can be widely applicable. Apolitical examples are part of the training, but eventually one only cares about applying the system of thought when it reaches correct conclusions that otherwise would not have been reached, and it will inevitably deviate from what other systems would conclude.
Allow me to float an idea: post a disclaimer on the site that, as a test and to prevent cultishness, one (or perhaps a few) deceptively wrong idea (wrong as unanimously agreed upon by a number of demonstrably masterful people) is advocated as if it were the mainstream opinion here, and aspiring rationalists are expected to reach the unpopular (here) opinion. The masters, most but not all of them, argue for the popular (here) opinion that is low-status in society. Anyone who objects that an aspect of the site has a plurality of evilly inclined people and a majority of wrongly thinking people on some topic (say, PUA) can be told that that subject is suspected to be the one (or one of the ones) on which the best thinkers not only disagree with the local majority opinion, but do so unanimously.
It goes without saying that...well, it really does go without saying, so I won’t say it.
College physics professor gives a weekly lecture. Toward the end of the first day, a student in the first row points out an elementary mistake in one of the equations. Prof congratulates the student, announces that every day there will be an error in the lecture. The midterm and final exams will consist of a list of lecture dates, and the only way to pass a given question is to point out the error in the corresponding day’s lecture.
Prof gets into progressively more complex subjects. Everybody takes good notes. After the final, that student from the front row visits the prof’s office, apologetically explains that nobody could figure out the mistake in the last lecture. Prof says “That’s alright, I can’t either.”
Basically, the problem is that the idea of a general forum that attempts to apply no-holds-barred rational thinking to all sorts of sundry topics is unworkable.
What’s scarier, the idea of a conceptual apparatus that attempts to apply no-holds-barred rational thinking to all sorts of sundry topics may to an extent be unworkable. If the deniers of high-status-falsehood-1 all started using some catchy phrase (of the sort that LW has lots of), and then the deniers of high-status-falsehood-2 started using that phrase too, both would start smelling like the other and seem crazier for it. (This is one of the considerations that make me not want to try getting around these restrictions with pseudonyms.) On the other hand, of course, there are a number of concepts to fall back on that basically can’t be corrupted because they’re used all the time by e.g. probability theorists obviously lacking any agenda.
I disagree, however, with your characterization of option (1) as “forgoing some minor novel insights on a topic that affects most people’s life decisions only very indirectly.”
When I said that, I was thinking of the “do women like nice guys or jerks” question specifically. I wouldn’t say politically-charged topics hardly affect people’s lives as a blanket statement, though I think it’s true in a great many cases. But your reading was the more natural one and I apologize for being unclear.
There is plenty of low-hanging fruit in terms of insight from applying unbiased thinking to issues where the respectable opinion is severely delusional.
It’s really hard to actually know when the “respectable” opinion is severely delusional… and even if the consensus view is indeed totally wrong, most minority opinions are usually even wronger than that. Saying the Sun orbits the Earth is much less crazy than saying that the Sun orbits the Moon half the time and Mars the other half of the time.
It’s really hard to actually know when the “respectable” opinion is severely delusional…
I disagree. Of course, it’s hard to know this with consistent reliability across the board, but there are plenty of particular cases where this is perfectly clear. Many of these cases don’t even involve topics that are ideologically charged to such extremes that contrarian conclusions would be outright scandalous. (Though of course the purveyors of the respectable opinion and the officially accredited truth wouldn’t be pleased, and certainly wouldn’t be willing to accept the contrarian discourse as legitimate.)
To give a concrete example, it is clear that, say, mainstream economics falls into this latter category.
Just watch out that when you say “The experts on X are wrong; don’t believe them” you aren’t telling people to sell nonapples. “Don’t believe in YHVH” doesn’t mean that you should go believe in Zenu.
I don’t mean rejecting the mainstream view in favor of some existing contrarian position—of which the majority are indeed unavoidably wrong, no matter what the merits of the mainstream view—but merely applying the very basic tools of common sense and rational thinking to see if the justification for the mainstream view can stand up to scrutiny. My point is that often the mainstream view fails as soon as it’s checked against the elementary laws of logic and the most basic and uncontroversial principles of sound epistemology. It really isn’t hard.