I think a lot of the people that fall into this camp (at least those that I know of) are people that have just recently deconverted—they’ve just been through a major life-change involving religion and therefore are understandably entranced with the whole process as it is particularly meaningful to them.
Alternatively, they are reacting against some heavy prejudice that they have had to suffer through—or have some loved ones that are particularly “afflicted” and want to see something done to prevent it happening to others.
Sure, there are other big, important things out there… but one man’s meat is another’s poison, and all that.
I think it’s easy enough to say that there are bigger problems out there… when we are looking at it from the perspective of having been atheists for a long time. But some people have just had their world cave in—everything has been upturned. They no longer have that huge safety net underneath them telling them that everything is going to be alright in the afterlife. Maybe they’ve just discovered that they’ve been wasting one seventh of their life in church when they could have been out there exploring this beautiful world that we live in, or spending quality time with their kids… it may seem like nothing important to you, but it’s a Big Thing to some people.
PS—I am also inclined to agree with you that there are better things the time could be spent on… but that’s “better from my perspective” and it’s not mine that counts.
Well, whenever I open this topic, giving concrete examples is problematic, since these are by definition respectable and high-status delusions, so it’s difficult or impossible to contradict them without sounding like a crackpot or extremist.
There are, however, a few topics where prominent LW participants have run into such instances of respectable opinion being dogmatic and immune to rational argument. One example is the already mentioned neglect of technology-related existential risks—as well as other non-existential but still scary threats that might be opened up by upcoming advances in technology—and the tendency to dismiss people who ask such questions as crackpots. Another is the academic and medical establishment’s official party line against cryonics, which is completely impervious to any argument. (I have no interest in cryonics myself, but the dogmatic character of the official line is clear, as well as its lack of solid foundation.)
This, however, is just the tip of the iceberg. Unfortunately, listing other examples typically means opening ideologically charged topics that are probably best left alone. One example that shouldn’t be too controversial is economics. We have people in power to regulate and manage things, with enough power and influence to wreak havoc if they don’t know what they’re doing, whose supposed expertise however appears, on independent examination, to consist mostly of cargo-cult science and ideological delusions, even though they bear the most prestigious official titles and accreditations. Just this particular observation should be enough to justify my Titanic allegory.
whenever I open this topic, giving concrete examples is problematic, since these are by definition respectable and high-status delusions, so it’s difficult or impossible to contradict them without sounding like a crackpot or extremist.
On the other hand, elsewhere you write:
There are many other questions where the prevailing views of academic experts, intellectuals, and other high-status shapers of public opinion, are, in my opinion, completely delusional. Some of these are just theoretical questions without much practical bearing on anything, but others have real ugly consequences on a large scale, up to and including mass death and destruction, or seriously threaten such consequences in the future. Many of them also make the world more ugly and dysfunctional, and life more burdensome and joyless, in countless little ways; others are presented as enlightened wisdom on how to live your life but are in fact a recipe for disaster for most people who might believe and try to apply them.
which suggests that you think that the things that you’re avoiding writing about are very important. If they’re so important then why not pay the price of being considered a crackpot/extremist by some in order to fight against the delusional views? Is the key issue self-preservation of the type that you mentioned in response to Komponisto?
Also, arguing on the internet under one’s real identity is a bad idea for anyone who isn’t in one of these four categories...
Or is the point that you think that there’s not much hope for changing people’s views on the questions that you have in mind so that it’s futile to try?
Well, there are several reasons why I’m not incessantly shouting all my contrarian views from the rooftops.
For a start, yes, obviously I am concerned with the possible reputational consequences. But even ignoring that, the problem is that arguing for contrarian views may well have the effect of making them even more disreputable and strengthening the mainstream consensus, if it’s done in a way that signals low status, eccentricity, immorality, etc., or otherwise enables the mainstream advocates to score a rhetorical victory in the ensuing debate (regardless of the substance of the arguments). Thus, even judging purely by how much you’re likely to move people’s opinions closer to or further from the truth, you should avoid arguing for contrarian views unless the situation seems especially favorable, in the sense that you’ll be able to present your case competently and in front of a suitable audience.
Moreover, there is always the problem of whether you can trust your own contrarian opinions. After all, even if you take the least favorable view of the respectable opinion and the academic mainstream, it is still the case that most contrarians are deluded in even crazier ways. So how do you know that you haven’t in fact become a crackpot yourself? This is why rather than making a piecemeal catalog of delusional mainstream views, I would prefer to have a more general framework for estimating how reliable the mainstream opinion is likely to be on a particular subject given various factors and circumstances, and what general social, economic, political, and other mechanisms have practical influence in this regard. Effort spent on obtaining such insight is, in my opinion, far more useful than attacking seemingly wrong mainstream opinions one by one.
These latter questions should, in my opinion, be very high on (if not at the top of) the list of priorities of people who are concerned with overcoming bias and increasing their rationality and the accuracy of their beliefs, and one of my major disappointments with LW is that attempts to open discussion about these matters invariably fall flat. (This despite the fact that such discussions could be productive even without opening any especially dangerous and charged topics, and despite the fact that on LW one regularly hears frustrated accounts of the mainstream being impervious to argument on topics such as existential risk or cryonics. I find it especially puzzling that smart people who are concerned about the latter have no interest in investigating the underlying more general and systematic problems.)
Thus, even judging purely by how much you’re likely to move people’s opinions closer to or further from the truth, you should avoid arguing for contrarian views unless the situation seems especially favorable, in the sense that you’ll be able to present your case competently and in front of a suitable audience.
Doesn’t this push in the direction of holding contrarian views being useless except as a personal hobby? If so, why argue against mainstream delusional views at all (even as a collection without specifying what they are)? Is the point of your comment that you think it’s possible to make progress by highlighting broad phenomena about the reliability of mainstream views so that people can work out the implications on their own without there being a need for explicit public discussion?
Moreover, there is always the problem of whether you can trust your own contrarian opinions. After all, even if you take the least favorable view of the respectable opinion and the academic mainstream, it is still the case that most contrarians are deluded in even crazier ways. So how do you know that you haven’t in fact become a crackpot yourself?
A natural method to avoid becoming a crackpot is to reveal one’s views for possible critique in a gradual and carefully argued fashion, adjusting them as people point out weaknesses. Of course it might not be a good idea to reveal one’s views regardless (self-preservation; opportunity cost of time) but I don’t think that danger of being a crackpot is a good reason.
These latter questions should, in my opinion, be very high on (if not at the top of) the list of priorities of people who are concerned with overcoming bias and increasing their rationality and the accuracy of their beliefs, and one of my major disappointments with LW is that attempts to open discussion about these matters invariably fall flat.
Is the point of your comment that you think it’s possible to make progress by highlighting broad phenomena about the reliability of mainstream views so that people can work out the implications on their own without there being a need for explicit public discussion?
Basically, I believe that exploring the general questions about how mainstream views are generated in practice and what are the implications for their reliability is by far the most fruitful direction for people interested in increasing the accuracy of their beliefs across the board. Of course, if you have a particular interest in some question, you have to grapple with the concrete issues involved, and also a general exploration must be based on concrete case studies. But attacking particular mainstream views head-on may well be counterproductive in every sense, as I noted above.
A natural method to avoid becoming a crackpot is to reveal one’s views for possible critique in a gradual and carefully argued fashion, adjusting them as people point out weaknesses. Of course it might not be a good idea to reveal one’s views regardless (self-preservation; opportunity cost of time) but I don’t think that danger of being a crackpot is a good reason.
That’s assuming you have discussion partners who are knowledgeable, open-minded, and patient enough. However, such people are the most difficult to find exactly in those areas where you’re faced with the Scylla of a deeply flawed mainstream and the Charybdis of even worse crackpot contrarians.
(Please also see my reply to Nick Tarleton, who asked a similar question as the rest of your comment.)
Basically, I believe that exploring the general questions about how mainstream views are generated in practice and what are the implications for their reliability is by far the most fruitful direction for people interested in increasing the accuracy of their beliefs across the board. Of course, if you have a particular interest in some question, you have to grapple with the concrete issues involved, and also a general exploration must be based on concrete case studies. But attacking particular mainstream views head-on may well be counterproductive in every sense, as I noted above.
This is fair; you’ve made your position clear, thanks.
That’s assuming you have discussion partners who are knowledgeable, open-minded, and patient enough. However, such people are the most difficult to find exactly in those areas where you’re faced with the Scylla of a deeply flawed mainstream and the Charybdis of even worse crackpot contrarians.
Agree in general. How about Less Wrong in particular?
Agree in general. How about Less Wrong in particular?
Well, LW is great for discussing a concrete problem if you manage to elicit some interest in it, both because of people’s high general intellectual skills and because of low propensity to emotionally driven reactions that are apt to derail the discussion, even in fairly charged topics (well, except for gender-related ones, I guess). So, yes, LW is very good for this sort of reality-checking if you manage to find people interested in your topic.
You can take any topic where it’s impossible to make sense of the existing academic literature (and other influential high-status sources), or where the respectable mainstream consensus seems to clash with reality. When discussions about such topics are opened on LW, often the logical next step would be to ask about the more general underlying problems that give rise to these situations, instead of just focusing on the arguments about particular problems in isolation. (And even without a concrete motivation, such questions should directly follow from LW’s mission statement.) Yet I see few, if any, attempts to ask such general questions on LW, and my occasional attempts to open discussion along these lines, even when highly upvoted, don’t elicit much in terms of interesting arguments and insight.
As an illustration, we can take an innocuous and mainstream yet problematic topic, e.g. the health questions of lifestyle such as nutrition, exercise, etc. These topics have been discussed on LW many times, and it seems evident that the mainstream academic literature is a complete mess, with potential gems of useful insight buried under mountains of nonsense work, and authoritative statements of expert opinion given without proper justification. Yet I see no attempt to ask a straightforward follow-up question: since these areas operate under the official bureaucratic system that’s supposed to be guaranteed to produce valid science, what exactly went wrong? And what implications does it have for other areas where we take the official output of this same bureaucratic system as ironclad evidence?
Of course, when it comes to topics that are more dangerous and ideologically charged, the underlying problems are likely to be different and more severe. One can reasonably argue that such topics are best avoided on a forum like LW, both because they’re likely to stir up bad blood and because of the potential bad signaling and reputational consequences for the forum as an institution. But even if we take the most restrictive attitude towards such topics, there are still many others that can be used as case studies for gaining insight about the systematic underlying problems.
When discussions about such topics are opened on LW, often the logical next step would be to ask about the more general underlying problems that give rise to these situations, instead of just focusing on the arguments about particular problems in isolation. (And even without a concrete motivation, such questions should directly follow from LW’s mission statement.) Yet I see few, if any, attempts to ask such general questions on LW, and my occasional attempts to open discussion along these lines, even when highly upvoted, don’t elicit much in terms of interesting arguments and insight.
Your own points have struck me as on the mark; but I haven’t had much to add.
There are some interesting general comments that I could make based on my experience in the mathematical community in particular. I guess here I have some tendency toward self-preservation myself; I don’t want to offend acquaintances who might be cast in a negative light by my analysis. (Would be happy to share my views privately if you’re interested though.) I guess my attitude here is that there’s little upside to making my remarks public. The behaviors that I perceive to be dysfunctional are entrenched deeply enough that whatever I said would have little expected value.
The main upside would be helping others attain intellectual enlightenment, but although I myself greatly enjoy the satisfaction of intellectual enlightenment, I’m not sure that it is very valuable from a global perspective. Being right is of little use without being influential. In general, on a given topic where a contrarian position is right, the percentage of people who are right (or interested in being right) is small enough that the critical mass it would take to change things isn’t there, nor would an incremental change in this percentage make a difference.
The reason why the above point has so much weight in my mind is that despite my very high interest in learning about a variety of things and in forming accurate views on a variety of subjects, I haven’t achieved very much. It’s not clear whether having accurate views of the world has been more helpful or harmful to me in achieving my goals. The jury is still very much out and things may change, but the very fact that it’s possible for me to have this attitude is a strong indication that knowledge and accurate views on a variety of things can be useless on their own.
The best cure against such prideful attitudes is to ask yourself what you have to show in terms of practical accomplishments and status if you’re so much more rational and intellectually advanced than ordinary people. If they are so stupid and delusional as to deserve such intolerance and contempt, then an enlightened and intellectually superior person should be able to run circles around them and easily come out on top, no?
Regarding:
As an illustration, we can take an innocuous and mainstream yet problematic topic, e.g. the health questions of lifestyle such as nutrition, exercise, etc. These topics have been discussed on LW many times, and it seems evident that the mainstream academic literature is a complete mess, with potential gems of useful insight buried under mountains of nonsense work, and authoritative statements of expert opinion given without proper justification. Yet I see no attempt to ask a straightforward follow-up question: since these areas operate under the official bureaucratic system that’s supposed to be guaranteed to produce valid science, what exactly went wrong? And what implications does it have for other areas where we take the official output of this same bureaucratic system as ironclad evidence?
I made a comment that you may find relevant here; I would characterize nutrition/exercise/etc. as fields that are obviously important and which therefore attract many researchers/corporations/hobbyists/etc., which has the effect of driving high-quality researchers out of the field on account of bad associations.
Another factor may be the absence of low-hanging fruit (which you reference in your top-level post); it could be that the diversity of humans is great enough that it’s difficult to make general statements about what’s healthy or unhealthy.
I agree with what you said about mainstream fields being diluted, but I’d offer an interesting corollary to that. Economic motives compel various gurus and nutritionists to make claims to the average joe, and the average joe, or even the educated joe, cannot sort through them. However, if one looks in more narrow fields, one can obtain more specific answers without so much trash. For example, powerlifting. This is not a huge market, nor one you can benefit financially from that much. If one is trying to sell something or get something published, he can’t just say “I pretty much agree with X”; he needs to somehow distinguish himself. But when that motive is eliminated you can get more consistency in recommendations and have a greater chance to actually hit upon what works.
While you might not be interested in powerlifting, reading in more niche areas can help filter out profit- and status-seeking charlatans, and can allow one to see the similarities across disciplines. So while I’ve read about bodybuilding, powerlifting, and endurance sports, and their associated nutritional advice, I would never read a book about “being fit.”
As an aside, I recently had this horrible moment of realization. Much of the fitness advice given out is just so incredibly wrong, and I am able to realize that because I have a strong background in that subject. But I realized that 90% of the stuff I read about concerns areas where I don’t have a great background. I could be accepting claims in other areas that are just as wrong as the nutritional “facts” I scoff at, and I would never learn of my error.
One example is the already mentioned neglect of technology-related existential risks—as well as other non-existential but still scary threats that might be opened up by upcoming advances in technology—and the tendency to dismiss people who ask such questions as crackpots.
That does seem to be a useful heuristic. DOOM mongers are usually selling something. They typically make exaggerated and biased claims. The SIAI and FHI do not seem to be significant exceptions to this—though their attempts to be scientific and rational certainly help.
These types of organisation form naturally from those with the largest p(DOOM) estimates. That is not necessarily the best way to obtain an unbiased estimate. If you run into organisations who are trying to convince you that the end of the world is nigh—and that you should donate to help them save it—you should at least be aware that this pattern is an ancient one with a dubious pedigree.
Another is the academic and medical establishment’s official party line against cryonics, which is completely impervious to any argument. (I have no interest in cryonics myself, but the dogmatic character of the official line is clear, as well as its lack of solid foundation.)
I am inclined to ask for references. As far as I understand it, there is a real science, cryogenics—which goes out of its way to distance itself from its more questionable cousin (cryonics), which has a confusingly similar name. Much as psychology tries to distinguish itself from psychiatry. Is there much more than that going on here?
From what I understand, the professional learned society of cryobiologists has an official policy that bans its members from any engagement with cryonics under pain of expulsion (a penalty that would presumably have disastrous career implications). Therefore, cryobiologists are officially mandated to uphold this party line and condemn cryonics, if they are to speak on the subject at all. From what I’ve seen, cryonics people have repeatedly challenged this position with reasonable arguments, but they haven’t received anything like satisfactory rebuttals that would justify the official position. (See more details in this post, whose author has spent considerable effort searching for such a rebuttal.)
Now, for all I know, it may well be that the claims of cryonicists are complete bunk after all. The important point is that here we see a clear and unambiguous instance of the official academic mainstream upholding an official line that is impervious to rational argument, and attempts to challenge this official line elicit sneering and stonewalling rather than any valid response. One of my claims in this discussion is that this is far from being the only such example (although the official positions and the condemnations of dissenters are rarely spelled out so explicitly), and LW people familiar with this example should take it as a significant piece of evidence against trusting the academic mainstream consensus in general.
From what I understand, the professional learned society of cryobiologists has an official policy that bans its members from any engagement with cryonics under pain of expulsion (a penalty that would presumably have disastrous career implications). Therefore, cryobiologists are officially mandated to uphold this party line and condemn cryonics, if they are to speak on the subject at all.
Upon a two-thirds vote of the Governors in office, the Board of Governors may refuse membership to applicants, or suspend or expel members (including both individual and institutional members), whose conduct is deemed detrimental to the Society, including applicants or members engaged in or who promote any practice or application which the Board of Governors deems incompatible with the ethical and scientific standards of the Society or as misrepresenting the science of cryobiology, including any practice or application of freezing deceased persons in anticipation of their reanimation. [Sec. 2.04 of the bylaws of the Society for Cryobiology].
It says members are not allowed to perform or promote the freezing of “deceased persons”—citing concerns over ethical and scientific standards, and the Society’s own reputation. The Society probably wants to avoid itself and its members being associated with cryonics scandals and lawsuits.
As I said, I don’t have a dog in this particular fight, and for all I know, the cryobiologists’ rejection of cryonics might in fact be justified, for both reasons of science and pragmatist political considerations. However, the important point is that if you ask, in a polite, reasonable, and upfront manner, for a scientific assessment of cryonics and what exactly are the problems with it, it is not possible to get a full, honest, and scientifically sound answer, as demonstrated by that article to which I linked above. Contrast this with what happens if you ask, say, physicists what is wrong with some crackpot theory of physics—they will spell out a detailed argument showing what exactly is wrong, and they will be able to answer any further questions you might have and fully clarify any confusion, as long as you’re not being impervious to argument.
Regardless of any particular concern about cryonics, the conclusion to draw from this is that a strong mainstream academic consensus sometimes rests on a rock-solid foundation that can be readily examined if you just invest some effort, but sometimes this is not the case, at the very least because for some questions there is no way to even get a clear and detailed statement on what exactly this foundation is supposed to be. From this, it is reasonable to conclude that mainstream academic consensus should not be taken as conclusive evidence for anything—and in turn, contrarian opinions should not be automatically discarded just because mainstream academics reject them—unless you have some reliable criteria for evaluating how solid its foundation is in a particular area. The case of cryonics is relevant for my argument only insofar as this is a question where lots of LW people have run into a strong mainstream consensus for which it’s impossible to get a solid justification, thus providing one concrete example that shouldn’t be too controversial here.
However, the important point is that if you ask, in a polite, reasonable, and upfront manner, for a scientific assessment of cryonics and what exactly are the problems with it, it is not possible to get a full, honest, and scientifically sound answer, as demonstrated by that article to which I linked above. Contrast this with what happens if you ask, say, physicists what is wrong with some crackpot theory of physics—they will spell out a detailed argument showing what exactly is wrong, and they will be able to answer any further questions you might have and fully clarify any confusion, as long as you’re not being impervious to argument.
I think most parties involved agree that cryonic revival is a bit of a long shot. It is hard to say exactly how much of a long shot, since that depends on speculative far-future things like whether an advanced civilization will be sufficiently interested in us to revive us. Scientists can’t say much about that—except that there are quite a few unknown unknowns—and so confidence intervals should be wide.
This comment seems to be influenced by an association fallacy.
The fact that someone has suffered doesn’t imply that they are rational or irrational.
If you acknowledge evidence that someone is being irrational it doesn’t mean you have to deny they have any positive qualities or be unsympathetic about their problems.
I made no claim that they were rational—or that they had no nice qualities.
I was trying to point out that “I think that other people should not spend time on X because it is an unimportant topic to me” is failing to understand that to these people, X is not an unimportant topic.
Some examples would agitate a thundering herd of heretic hunters and witch finders who would vote down every post that Vladimir has ever made or will ever make.
This doesn’t seem overwhelmingly likely to be false. It’s not a very nice thing to say, but why should that matter?
Why so many downvotes? In any case, since I’m Charlie Sheen, I don’t care if I’m downvoted, since I’m WINNING no matter what they say (wow, memetic brainfreeze).
It does to me. Less Wrong isn’t so popular that I’d expect a herd of people to bother downvoting each of Vladimir_M’s dozens of posts, then wait around for him to make more posts to downvote, just because of a few examples. The fact that Vladimir_M gave two or three examples but still has non-negative scores for his recent comments is more evidence that sam0345 was wrong.
It’s not a very nice thing to say, but why should that matter?
Quite a few LW users see niceness as a useful norm and may have voted accordingly. At any rate, I don’t think it was a lack of niceness that provoked people to downvote that comment; I’d guess it was because they read it as an unhelpful overstatement.
OOC - some examples would be nice :)
Unfortunately, listing other examples typically means opening ideologically charged topics that are probably best left alone.
Why ever not?
On the other hand elsewhere you write
which suggests that you think that the things that you’re avoiding writing about are very important. If they’re so important then why not pay the price of being considered a crackpot/extremist by some in order to fight against the delusional views? Is the key issue self-preservation of the type that you mentioned in response to Komponisto?
Or is the point that you think that there’s not much hope for changing people’s views on the questions that you have in mind so that it’s futile to try?
Well, there are several reasons why I’m not incessantly shouting all my contrarian views from the rooftops.
For a start, yes, obviously I am concerned with the possible reputational consequences. But even ignoring that, the problem is that arguing for contrarian views may well have the effect of making them even more disreputable and strengthening the mainstream consensus, if it’s done in a way that signals low status, eccentricity, immorality, etc., or otherwise enables the mainstream advocates to score a rhetorical victory in the ensuing debate (regardless of the substance of the arguments). Thus, even judging purely by how much you’re likely to move people’s opinions closer to or further from the truth, you should avoid arguing for contrarian views unless the situation seems especially favorable, in the sense that you’ll be able to present your case competently and in front of a suitable audience.
Moreover, there is always the problem of whether you can trust your own contrarian opinions. After all, even if you take the least favorable view of the respectable opinion and the academic mainstream, it is still the case that most contrarians are deluded in even crazier ways. So how do you know that you haven’t in fact become a crackpot yourself? This is why rather than making a piecemeal catalog of delusional mainstream views, I would prefer to have a more general framework for estimating how reliable the mainstream opinion is likely to be on a particular subject given various factors and circumstances, and what general social, economic, political, and other mechanisms have practical influence in this regard. Effort spent on obtaining such insight is, in my opinion, far more useful than attacking seemingly wrong mainstream opinions one by one.
These latter questions should, in my opinion, be very high on (if not at the top of) the list of priorities of people who are concerned with overcoming bias and increasing their rationality and the accuracy of their beliefs, and one of my major disappointments with LW is that attempts to open discussion about these matters invariably fall flat. (This despite the fact that such discussions could be productive even without opening any especially dangerous and charged topics, and despite the fact that on LW one regularly hears frustrated accounts of the mainstream being impervious to argument on topics such as existential risk or cryonics. I find it especially puzzling that smart people who are concerned about the latter have no interest in investigating the underlying more general and systematic problems.)
Doesn’t this push in the direction of holding contrarian views being useless except as a personal hobby? If so, why argue against mainstream delusional views at all (even as a collection without specifying what they are)? Is the point of your comment that you think it’s possible to make progress by highlighting broad phenomena about the reliability of mainstream views so that people can work out the implications on their own without there being a need for explicit public discussion?
A natural method to avoid becoming a crackpot is to reveal one’s views for possible critique in a gradual and carefully argued fashion, adjusting them as people point out weaknesses. Of course, it might not be a good idea to reveal one’s views regardless (self-preservation; opportunity cost of time), but I don’t think that the danger of becoming a crackpot is a good reason.
I’m not sure what you have in mind here. Your post titled Some Heuristics for Evaluating the Soundness of the Academic Mainstream in Unfamiliar Fields was highly upvoted and I myself would be happy to read more along similar lines. Are there examples that you’d point to of attempts to open discussion about these matters falling flat?
Basically, I believe that exploring the general questions about how mainstream views are generated in practice and what are the implications for their reliability is by far the most fruitful direction for people interested in increasing the accuracy of their beliefs across the board. Of course, if you have a particular interest in some question, you have to grapple with the concrete issues involved, and also a general exploration must be based on concrete case studies. But attacking particular mainstream views head-on may well be counterproductive in every sense, as I noted above.
That’s assuming you have discussion partners who are knowledgeable, open-minded, and patient enough. However, such people are the most difficult to find exactly in those areas where you’re faced with the Scylla of a deeply flawed mainstream and the Charybdis of even worse crackpot contrarians.
(Please also see my reply to Nick Tarleton, who asked a similar question as the rest of your comment.)
This is fair; you’ve made your position clear, thanks.
Agree in general. How about Less Wrong in particular?
Well, LW is great for discussing a concrete problem if you manage to elicit some interest in it, both because of people’s high general intellectual skills and because of low propensity to emotionally driven reactions that are apt to derail the discussion, even in fairly charged topics (well, except for gender-related ones, I guess). So, yes, LW is very good for this sort of reality-checking if you manage to find people interested in your topic.
What’s an example? (I mostly ask so as to have some more specific idea of what topics you’re referring to.)
You can take any topic where it’s impossible to make sense of the existing academic literature (and other influential high-status sources), or where the respectable mainstream consensus seems to clash with reality. When discussions about such topics are opened on LW, often the logical next step would be to ask about the more general underlying problems that give rise to these situations, instead of just focusing on the arguments about particular problems in isolation. (And even without a concrete motivation, such questions should follow directly from LW’s mission statement.) Yet I see few, if any, attempts to ask such general questions on LW, and my occasional attempts to open discussion along these lines, even when highly upvoted, don’t elicit much in terms of interesting arguments and insight.
As an illustration, we can take an innocuous but problematic mainstream topic like the health questions of lifestyle: nutrition, exercise, etc. These topics have been discussed on LW many times, and it seems evident that the mainstream academic literature is a complete mess, with potential gems of useful insight buried under mountains of nonsense work, and authoritative statements of expert opinion given without proper justification. Yet I see no attempt to ask a straightforward follow-up question: since these areas operate under the official bureaucratic system that’s supposed to be guaranteed to produce valid science, what exactly went wrong? And what implications does it have for other areas where we take the official output of this same bureaucratic system as ironclad evidence?
Of course, when it comes to topics that are more dangerous and ideologically charged, the underlying problems are likely to be different and more severe. One can reasonably argue that such topics are best avoided on a forum like LW, both because they’re likely to stir up bad blood and because of the potential bad signaling and reputational consequences for the forum as an institution. But even if we take the most restrictive attitude towards such topics, there are still many others that can be used as case studies for gaining insight about the systematic underlying problems.
Your own points have struck me as on the mark; but I haven’t had much to add.
There are some interesting general comments that I could make based on my experience in the mathematical community in particular. I guess here I have some tendency toward self-preservation myself; I don’t want to offend acquaintances who might be cast in a negative light by my analysis. (Would be happy to share my views privately if you’re interested, though.) I guess my attitude here is that there’s little upside to making my remarks public. The behaviors that I perceive to be dysfunctional are sufficiently deeply entrenched that whatever I would say would have little expected value.
The main upside would be helping others attain intellectual enlightenment, but although I myself greatly enjoy the satisfaction of intellectual enlightenment, I’m not sure that it is very valuable from a global perspective. Being right is of little use without being influential. In general, on a given topic where a contrarian position is right, the percentage of people who are right (or interested in being right) is sufficiently small that the critical mass it would take to change things isn’t there, nor would an incremental change in this percentage make a difference.
The reason why the above point has so much weight in my mind is that despite my very high interest in learning about a variety of things and in forming accurate views on a variety of subjects, I haven’t achieved very much. It’s not clear whether having accurate views of the world has been more helpful or harmful to me in achieving my goals. The jury is still very much out and things may change, but the very fact that it’s possible for me to have this attitude is a strong indication that knowledge and accurate views on a variety of things can be useless on their own.
Regarding:
I made a comment that you may find relevant here; I would characterize nutrition/exercise/etc. as fields that are obviously important and which therefore attract many researchers/corporations/hobbyists/etc., which has the effect of driving high-quality researchers out of the field on account of bad associations.
Another factor may be the absence of low-hanging fruit (which you reference in your top-level post); it could be that the diversity of humans is sufficiently great that it’s difficult to make general statements about what’s healthy or unhealthy.
I agree with what you said about mainstream fields being diluted, but I’d offer an interesting corollary. Economic motives compel various gurus and nutritionists to make claims aimed at the average joe, and the average joe, or even the educated joe, cannot sort through them. However, if one looks in more narrow fields, one can obtain more specific answers without so much trash. For example, powerlifting. This is not a huge market, nor one you can benefit financially from that much. If one is trying to sell something or get something published, he can’t just say “I pretty much agree with X”; he needs to somehow distinguish himself. But when that motive is eliminated, you get more consistency in recommendations and have a greater chance of actually hitting upon what works.
While you might not be interested in powerlifting, reading in more niche areas can help filter out profit/status seeking charlatans, and can allow one to see the similarities across disciplines. So while I’ve read about bodybuilding, powerlifting, and endurance sports, and their associated nutritional advice, I would never read a book about “being fit.”
As an aside, I recently had this horrible moment of realization. Much of the fitness advice given out is just so incredibly wrong, and I am able to realize that because I have a strong background in that subject. But then it occurred to me: 90% of the stuff I read about is in areas where I don’t have a great background. I could be accepting facts in other areas that are just as wrong as the nutritional claims I scoff at, and I would never learn of my error.
That does seem to be a useful heuristic. DOOM mongers are usually selling something. They typically make exaggerated and biased claims. The SIAI and FHI do not seem to be significant exceptions to this—though their attempts to be scientific and rational certainly help.
These types of organisation form naturally from those with the largest p(DOOM) estimates. That is not necessarily the best way to obtain an unbiased estimate. If you run into organisations who are trying to convince you that the end of the world is nigh—and that you should donate to help them save it—you should at least be aware that this pattern is an ancient one with a dubious pedigree.
I am inclined to ask for references. As far as I understand it, there is a real science, cryogenics—which goes out of its way to distance itself from its more questionable cousin (cryonics), which has a confusingly similar name. Much as psychology tries to distinguish itself from psychiatry. Is there much more than that going on here?
From what I understand, the professional learned society of cryobiologists has an official policy that bans any engagement with cryonics by its members under pain of expulsion (a penalty which would presumably have disastrous career implications). Therefore, cryobiologists are officially mandated to uphold this party line and condemn cryonics, if they are to speak on the subject at all. From what I’ve seen, cryonics people have repeatedly challenged this position with reasonable arguments, but they haven’t received anything like satisfactory rebuttals that would justify the official position. (See more details in this post, whose author has spent considerable effort searching for such a rebuttal.)
Now, for all I know, it may well be that the claims of cryonicists are complete bunk after all. The important point is that here we see a clear and unambiguous instance of the official academic mainstream upholding an official line that is impervious to rational argument, and attempts to challenge this official line elicit sneering and stonewalling rather than any valid response. One of my claims in this discussion is that this is far from being the only such example (although the official positions and the condemnations of dissenters are rarely spelled out so explicitly), and LW people familiar with this example should take it as a significant piece of evidence against trusting the academic mainstream consensus in general.
This seems to be the relevant bit:
It says they are not allowed to perform or promote freezing of “deceased persons”—citing concerns over ethical and scientific standards—and its own reputation. They probably want to avoid them and their members being associated with cryonics scandals and lawsuits.
As I said, I don’t have a dog in this particular fight, and for all I know, the cryobiologists’ rejection of cryonics might in fact be justified, both for reasons of science and for pragmatic political considerations. However, the important point is that if you ask, in a polite, reasonable, and upfront manner, for a scientific assessment of cryonics and what exactly the problems with it are, it is not possible to get a full, honest, and scientifically sound answer, as demonstrated by the article I linked above. Contrast this with what happens if you ask, say, physicists what is wrong with some crackpot theory of physics—they will spell out a detailed argument showing exactly what is wrong, and they will be able to answer any further questions you might have and fully clarify any confusion, as long as you’re not being impervious to argument.
Regardless of any particular concern about cryonics, the conclusion to draw from this is that a strong mainstream academic consensus sometimes rests on a rock-solid foundation that can be readily examined if you just invest some effort, but sometimes this is not the case, at the very least because for some questions there is no way to even get a clear and detailed statement of what exactly this foundation is supposed to be. From this, it is reasonable to conclude that mainstream academic consensus should not be taken as conclusive evidence for anything—and in turn, contrarian opinions should not be automatically discarded just because mainstream academics reject them—unless you have some reliable criteria for evaluating how solid its foundation is in a particular area. The case of cryonics is relevant for my argument only insofar as this is a question where lots of LW people have run into a strong mainstream consensus for which it’s impossible to get a solid justification, thus providing one concrete example that shouldn’t be too controversial here.
I think most parties involved agree that cryonic revival is a bit of a long shot. It is hard to say exactly how much of a long shot, since that depends on speculative far-future things like whether an advanced civilization will be sufficiently interested in us to revive us. Scientists can’t say too much about that—except that there are quite a few unknown unknowns—and so the confidence intervals should be wide.
definitely and thank you for sharing your list :)
This comment seems to be influenced by an association fallacy.
The fact that someone has suffered doesn’t imply that they are rational or irrational.
If you acknowledge evidence that someone is being irrational it doesn’t mean you have to deny they have any positive qualities or be unsympathetic about their problems.
I made no claim that they were rational—or that they had no nice qualities.
I was trying to point out that “I think that other people should not spend time on X because it is an unimportant topic to me” is failing to understand that to these people, X is not an unimportant topic.
“some examples would be nice”
Some examples would agitate a thundering herd of heretic hunters and witch finders who would vote down every post that Vladimir has ever made or will ever make.
This doesn’t seem overwhelmingly likely to be false. It’s not a very nice thing to say, but why should that matter?
Why so many downvotes? In any case, since I’m Charlie Sheen I don’t care if I’m downvoted, since I’m WINNING no matter what they say (wow, memetic brainfreeze).
It does to me. Less Wrong isn’t so popular that I’d expect a herd of people to bother downvoting each of Vladimir_M’s dozens of posts, then wait around for him to make more posts to downvote, just because of a few examples. The fact that Vladimir_M gave two or three examples but still has non-negative scores for his recent comments is more evidence that sam0345 was wrong.
Quite a few LW users see niceness as a useful norm and may have voted accordingly. At any rate, I don’t think it was a lack of niceness that provoked people to downvote that comment; I’d guess it was because they read it as an unhelpful overstatement.