Also, you’re right about the celestial: In the Western world, religion has become quite low-hanging fruit by now—so low it practically touches the ground.
In most cases, when I see that someone is a particularly passionate and dedicated atheist (in the sense of the “New Atheists” etc.), lacking other information, I take it as strong evidence against their rationality. For someone living in the contemporary Western world who wants to fight against widespread and dangerous irrational beliefs, focusing on traditional religion indicates extreme bias and total blindness towards various delusions that are nowadays infinitely more pernicious and malignant than anything coming out of any traditional religion. (The same goes for those “skeptics” who relentlessly campaign against low-status folk superstitions like UFOs or crystal healing, but would never dare mutter anything against all sorts of horrendous delusions that enjoy high status and academic approval.)
I like to compare such people with someone on board the Titanic who loses sleep over petty folk superstitions of the passengers, while at the same time being blissfully happy with the captain’s opinions about navigation. (And, to extend the analogy, often even attacking those who question the captain’s competence as dangerous crackpots.)
I take [passionate atheism/skepticism] as strong evidence against their rationality
Why? Just because they spend their time in a perhaps less than optimal manner (compared to existential risks) doesn’t automatically mean that passionate atheists and skeptics are somehow highly irrational people, does it? I suspect a lot of them would be potential lesswrong readers; it’s just that they haven’t encountered these ideas yet.
Most people first had to become “regular” rationalists before they became lesswrongers. If I had stumbled upon this website a looong time ago when I was still something along the lines of a New Ager, I strongly suspect the Bayesian rationality meme simply would not have fallen on fertile ground. Cleaning out the superstitious garbage from your mind seems to be quite an important step for many people. It certainly was for me.
I do not agree with your viewpoint that these people are entirely wasting their time. Not every man, woman and child can participate directly or indirectly in the development of friendly AGI—and I’ve seen much worse uses of time and effort than the conversion attempts of the “New Atheist” movement. After all, something we may want to keep in mind is that the success or failure of many futuristic things we discuss here on lesswrong may somewhat depend on public opinion and perception (think stem cells)—and I’d much rather face at least somewhat rational atheists than a bunch of deluded theists and esoterics. The difference between 10 and 20% atheists may be all it takes to achieve more positive outcomes in certain scenarios.
Furthermore, if lesswrongian thought has any kind of easily identifiable target group that would be worth “advertising” to, you’d probably find it among skeptics and atheists.
Just because they spend their time in a perhaps less than optimal manner (compared to existential risks) doesn’t automatically mean that passionate atheists and skeptics are somehow highly irrational people, does it?
I didn’t say it was conclusive evidence, only that it is strong evidence.
Moreover, the present neglect of technology-related existential (and other) risk is only one example where the respectable opinion is nowadays remote from reality. There are many other questions where the prevailing views of academic experts, intellectuals, and other high-status shapers of public opinion, are, in my opinion, completely delusional. Some of these are just theoretical questions without much practical bearing on anything, but others have real ugly consequences on a large scale, up to and including mass death and destruction, or seriously threaten such consequences in the future. Many of them also make the world more ugly and dysfunctional, and life more burdensome and joyless, in countless little ways; others are presented as enlightened wisdom on how to live your life but are in fact a recipe for disaster for most people who might believe and try to apply them.
In this situation, if someone focuses on traditional religion as a supposedly especially prominent source of false beliefs and irrationality, it is likely that this is due to ideological reasons, which in turn means that they also swallow much of the above-mentioned respectable delusions. Again, there are exceptions, which is why I wrote “lacking other information.” But this is really true in most cases.
Also, another devilish intellectual hack that boosts many modern respectable delusions is the very notion of separating “religious” beliefs and opinions from others. Many modern ideological beliefs that are no less metaphysical and irrational than anything found in traditional religions can nevertheless be advertised as rational and objective—and in turn backed and enforced by governments and other powerful institutions without violating the “separation of church and state”—just because they don’t fall under the standard definition of “religion.” In my experience, and again with a few honorable exceptions, those who advocate against traditional religion are often at the same time entirely OK with such enforcement of state-backed ideology, even though there is no rational reason to see it as essentially different from the old-fashioned establishment of religion.
There are many other questions where the prevailing views of academic experts, intellectuals, and other high-status shapers of public opinion, are, in my opinion, completely delusional.
Name three?
edit: I find that he has already named three, and two heuristics for determining whether an academic field is full of bunk or not, here. I commend him on this article. While I remain unconvinced on the general strategy outlined, I now understand the sort of field he is discussing and find that, on the specifics, I tentatively agree.
I strongly recommend reading Robin Hanson’s answer here.
Many modern ideological beliefs that are no less metaphysical and irrational than anything found in traditional religions can nevertheless be advertised as rational and objective—and in turn backed and enforced by governments and other powerful institutions without violating the “separation of church and state”—just because they don’t fall under the standard definition of “religion.”
Same challenge.
edit: I would still like to hear these.
Wow answering that challenge might seriously kill some minds.
I suggest you two PM it out.
Well, as I pointed out in my other comments, unless I answered your challenges with essays of enormous length, my answer would consist of multiple assertions without supporting evidence that sound outlandish on the face of it. Remember that we are talking about delusions that are presently shared by the experts and/or respectable high-status people.
Note that you should accept my point even if we completely disagree on what these high-status delusions are, as long as we agree that there are some, whatever they might be. Try to focus on the main point in the abstract: if delusion X is low-status and rejected by experts and high-status people (even if it might be fairly widespread among the common folk), while delusion Y is instead accepted by them, so much that by asserting non-Y you risk coming off as a crackpot, should we be more worried about X or Y, in terms of both the idealistic pursuit of truth and the practical problems that follow?
Try to focus on the main point in the abstract: if delusion X is low-status and rejected by experts and high-status people (even if it might be fairly widespread among the common folk), while delusion Y is instead accepted by them, so much that by asserting non-Y you risk coming off as a crackpot, should we be more worried about X or Y, in terms of both the idealistic pursuit of truth and the practical problems that follow?
Y, of course. Perhaps I should have started out by saying that while I agree that what you say is possible, I don’t know if it describes the real world. Your assertion was that there are many high-status delusions, but without evidence of that, all I can say is that I agree that supposed experts are not guaranteed to be correct on every point, and that it is extremely possible that they will reinforce delusions within their community.
OOC - some examples would be nice :)
I think a lot of the people who fall into this camp (at least those I know of) have just recently deconverted—they’ve just been through a major life-change involving religion and are therefore understandably entranced with the whole process, as it is particularly meaningful to them.
Alternatively, they are reacting against some heavy prejudice that they have had to suffer through—or have some loved ones that are particularly “afflicted” and want to see something done to prevent it happening to others.
Sure, there are other big, important things out there… but one man’s meat is another’s poison, and all that.
I think it’s easy enough to say that there are bigger problems out there… when we are looking at it from the perspective of having been atheist for a long time. But some people have just had their world cave in—everything has been upturned. They no longer have that huge big safety net underneath them that tells them that everything is going to be alright in the afterlife. Maybe they’ve just discovered that they’ve been wasting one seventh of their life in church when they could have been out there exploring this beautiful world that we live in, or spending quality time with their kids… it may seem like nothing important to you, but it’s a Big Thing to some people.
PS—I am also inclined to agree with you that there are better things the time could be spent on… but that’s “better from my perspective” and it’s not mine that counts.
Well, whenever I open this topic, giving concrete examples is problematic, since these are by definition respectable and high-status delusions, so it’s difficult or impossible to contradict them without sounding like a crackpot or extremist.
There are, however, a few topics where prominent LW participants have run into such instances of respectable opinion being dogmatic and immune to rational argument. One example is the already mentioned neglect of technology-related existential risks—as well as other non-existential but still scary threats that might be opened up by upcoming advances in technology—and the tendency to dismiss people who ask such questions as crackpots. Another is the academic and medical establishment’s official party line against cryonics, which is completely impervious to any argument. (I have no interest in cryonics myself, but the dogmatic character of the official line is clear, as well as its lack of solid foundation.)
This, however, is just the tip of the iceberg. Unfortunately, listing other examples typically means opening ideologically charged topics that are probably best left alone. One example that shouldn’t be too controversial is economics. We have people in power to regulate and manage things, with enough power and influence to wreak havoc if they don’t know what they’re doing, whose supposed expertise, however, appears on independent examination to consist mostly of cargo-cult science and ideological delusions, even though they bear the most prestigious official titles and accreditations. Just this particular observation should be enough to justify my Titanic allegory.
Why ever not?
whenever I open this topic, giving concrete examples is problematic, since these are by definition respectable and high-status delusions, so it’s difficult or impossible to contradict them without sounding like a crackpot or extremist.
On the other hand, elsewhere you write
There are many other questions where the prevailing views of academic experts, intellectuals, and other high-status shapers of public opinion, are, in my opinion, completely delusional. Some of these are just theoretical questions without much practical bearing on anything, but others have real ugly consequences on a large scale, up to and including mass death and destruction, or seriously threaten such consequences in the future. Many of them also make the world more ugly and dysfunctional, and life more burdensome and joyless, in countless little ways; others are presented as enlightened wisdom on how to live your life but are in fact a recipe for disaster for most people who might believe and try to apply them.
which suggests that you think that the things that you’re avoiding writing about are very important. If they’re so important then why not pay the price of being considered a crackpot/extremist by some in order to fight against the delusional views? Is the key issue self-preservation of the type that you mentioned in response to Komponisto?
Also, arguing on the internet under one’s real identity is a bad idea for anyone who isn’t in one of these four categories...
Or is the point that you think that there’s not much hope for changing people’s views on the questions that you have in mind so that it’s futile to try?
Well, there are several reasons why I’m not incessantly shouting all my contrarian views from the rooftops.
For a start, yes, obviously I am concerned with the possible reputational consequences. But even ignoring that, the problem is that arguing for contrarian views may well have the effect of making them even more disreputable and strengthening the mainstream consensus, if it’s done in a way that signals low status, eccentricity, immorality, etc., or otherwise enables the mainstream advocates to score a rhetorical victory in the ensuing debate (regardless of the substance of the arguments). Thus, even judging purely by how much you’re likely to move people’s opinions closer to or further from the truth, you should avoid arguing for contrarian views unless the situation seems especially favorable, in the sense that you’ll be able to present your case competently and in front of a suitable audience.
Moreover, there is always the problem of whether you can trust your own contrarian opinions. After all, even if you take the least favorable view of the respectable opinion and the academic mainstream, it is still the case that most contrarians are deluded in even crazier ways. So how do you know that you haven’t in fact become a crackpot yourself? This is why, rather than making a piecemeal catalog of delusional mainstream views, I would prefer to have a more general framework for estimating how reliable the mainstream opinion is likely to be on a particular subject given various factors and circumstances, and what general social, economic, political, and other mechanisms have practical influence in this regard. Effort spent on obtaining such insight is, in my opinion, far more useful than attacking seemingly wrong mainstream opinions one by one.
These latter questions should, in my opinion, be very high on (if not at the top of) the list of priorities of people who are concerned with overcoming bias and increasing their rationality and the accuracy of their beliefs, and one of my major disappointments with LW is that attempts to open discussion about these matters invariably fall flat. (This despite the fact that such discussions could be productive even without opening any especially dangerous and charged topics, and despite the fact that on LW one regularly hears frustrated accounts of the mainstream being impervious to argument on topics such as existential risk or cryonics. I find it especially puzzling that smart people who are concerned about the latter have no interest in investigating the underlying, more general and systematic problems.)
Thus, even judging purely by how much you’re likely to move people’s opinions closer to or further from the truth, you should avoid arguing for contrarian views unless the situation seems especially favorable, in the sense that you’ll be able to present your case competently and in front of a suitable audience.
Doesn’t this suggest that holding contrarian views is useless except as a personal hobby? If so, why argue against mainstream delusional views at all (even as a collection without specifying what they are)? Is the point of your comment that you think it’s possible to make progress by highlighting broad phenomena about the reliability of mainstream views so that people can work out the implications on their own without there being a need for explicit public discussion?
Moreover, there is always the problem of whether you can trust your own contrarian opinions. After all, even if you take the least favorable view of the respectable opinion and the academic mainstream, it is still the case that most contrarians are deluded in even crazier ways. So how do you know that you haven’t in fact become a crackpot yourself?
A natural method to avoid becoming a crackpot is to reveal one’s views for possible critique in a gradual and carefully argued fashion, adjusting them as people point out weaknesses. Of course, it might not be a good idea to reveal one’s views regardless (self-preservation; opportunity cost of time), but I don’t think the danger of being a crackpot is a good reason.
These latter questions should, in my opinion, be very high on (if not at the top of) the list of priorities of people who are concerned with overcoming bias and increasing their rationality and the accuracy of their beliefs, and one of my major disappointments with LW is that attempts to open discussion about these matters invariably fall flat.
I’m not sure what you have in mind here. Your post titled Some Heuristics for Evaluating the Soundness of the Academic Mainstream in Unfamiliar Fields was highly upvoted and I myself would be happy to read more along similar lines. Are there examples that you’d point to of attempts to open discussion about these matters falling flat?
Is the point of your comment that you think it’s possible to make progress by highlighting broad phenomena about the reliability of mainstream views so that people can work out the implications on their own without there being a need for explicit public discussion?
Basically, I believe that exploring the general questions about how mainstream views are generated in practice and what are the implications for their reliability is by far the most fruitful direction for people interested in increasing the accuracy of their beliefs across the board. Of course, if you have a particular interest in some question, you have to grapple with the concrete issues involved, and also a general exploration must be based on concrete case studies. But attacking particular mainstream views head-on may well be counterproductive in every sense, as I noted above.
A natural method to avoid becoming a crackpot is to reveal one’s views for possible critique in a gradual and carefully argued fashion, adjusting them as people point out weaknesses. Of course, it might not be a good idea to reveal one’s views regardless (self-preservation; opportunity cost of time), but I don’t think the danger of being a crackpot is a good reason.
That’s assuming you have discussion partners who are knowledgeable, open-minded, and patient enough. However, such people are the most difficult to find exactly in those areas where you’re faced with the Scylla of a deeply flawed mainstream and the Charybdis of even worse crackpot contrarians.
(Please also see my reply to Nick Tarleton, who asked a similar question as the rest of your comment.)
Basically, I believe that exploring the general questions about how mainstream views are generated in practice and what are the implications for their reliability is by far the most fruitful direction for people interested in increasing the accuracy of their beliefs across the board. Of course, if you have a particular interest in some question, you have to grapple with the concrete issues involved, and also a general exploration must be based on concrete case studies. But attacking particular mainstream views head-on may well be counterproductive in every sense, as I noted above.
This is fair; you’ve made your position clear, thanks.
That’s assuming you have discussion partners who are knowledgeable, open-minded, and patient enough. However, such people are the most difficult to find exactly in those areas where you’re faced with the Scylla of a deeply flawed mainstream and the Charybdis of even worse crackpot contrarians.
Agree in general. How about Less Wrong in particular?
Agree in general. How about Less Wrong in particular?
Well, LW is great for discussing a concrete problem if you manage to elicit some interest in it, both because of people’s high general intellectual skills and because of low propensity to emotionally driven reactions that are apt to derail the discussion, even in fairly charged topics (well, except for gender-related ones, I guess). So, yes, LW is very good for this sort of reality-checking if you manage to find people interested in your topic.
What’s an example? (I mostly ask so as to have some more specific idea of what topics you’re referring to.)
You can take any topic where it’s impossible to make sense of the existing academic literature (and other influential high-status sources), or where the respectable mainstream consensus seems to clash with reality. When discussions about such topics are opened on LW, often the logical next step would be to ask about the more general underlying problems that give rise to these situations, instead of just focusing on the arguments about particular problems in isolation. (And even without a concrete motivation, such questions should directly follow from LW’s mission statement.) Yet I see few, if any, attempts to ask such general questions on LW, and my occasional attempts to open discussion along these lines, even when highly upvoted, don’t elicit much in terms of interesting arguments and insight.
As an illustration, we can take a mainstream topic that is problematic yet ideologically innocent, e.g. the health questions of lifestyle such as nutrition, exercise, etc. These topics have been discussed on LW many times, and it seems evident that the mainstream academic literature is a complete mess, with potential gems of useful insight buried under mountains of nonsense work, and authoritative statements of expert opinion given without proper justification. Yet I see no attempt to ask a straightforward follow-up question: since these areas operate under the official bureaucratic system that’s supposed to be guaranteed to produce valid science, what exactly went wrong? And what implications does it have for other areas where we take the official output of this same bureaucratic system as ironclad evidence?
Of course, when it comes to topics that are more dangerous and ideologically charged, the underlying problems are likely to be different and more severe. One can reasonably argue that such topics are best avoided on a forum like LW, both because they’re likely to stir up bad blood and because of the potential bad signaling and reputational consequences for the forum as an institution. But even if we take the most restrictive attitude towards such topics, there are still many others that can be used as case studies for gaining insight about the systematic underlying problems.
When discussions about such topics are opened on LW, often the logical next step would be to ask about the more general underlying problems that give rise to these situations, instead of just focusing on the arguments about particular problems in isolation. (And even without a concrete motivation, such questions should directly follow from LW’s mission statement.) Yet I see few, if any, attempts to ask such general questions on LW, and my occasional attempts to open discussion along these lines, even when highly upvoted, don’t elicit much in terms of interesting arguments and insight.
Your own points have struck me as on the mark, but I haven’t had much to add.
There are some interesting general comments that I could make based on my experience in the mathematical community in particular. I guess here I have some tendency toward self-preservation myself; I don’t want to offend acquaintances who might be cast in a negative light by my analysis. (I would be happy to share my views privately if you’re interested, though.) I guess my attitude here is that there’s little upside to making my remarks public. The behaviors that I perceive to be dysfunctional are sufficiently deeply entrenched that whatever I would say would have little expected value.
The main upside would be helping others attain intellectual enlightenment, but although I myself greatly enjoy the satisfaction of being intellectually enlightened, I’m not sure that intellectual enlightenment is very valuable from a global perspective. Being right is of little use without being influential. In general, the percentage of people who are right (or interested in being right) on a given topic where a contrarian position is right is sufficiently small that the critical mass it would take to change things isn’t there, nor would an incremental change in this percentage make a difference.
The reason why the above point has so much weight in my mind is that despite my very high interest in learning about a variety of things and in forming accurate views on a variety of subjects, I haven’t achieved very much. It’s not clear whether having accurate views of the world has been more helpful or harmful to me in achieving my goals. The jury is still very much out and things may change, but the very fact that it’s possible for me to have this attitude is a strong indication that knowledge and accurate views on a variety of things can be useless on their own.
The best cure against such prideful attitudes is to ask yourself what you have to show in terms of practical accomplishments and status if you’re so much more rational and intellectually advanced than ordinary people. If they are so stupid and delusional as to deserve such intolerance and contempt, then an enlightened and intellectually superior person should be able to run circles around them and easily come out on top, no?
Regarding:
As an illustration, we can take a mainstream topic that is problematic yet ideologically innocent, e.g. the health questions of lifestyle such as nutrition, exercise, etc. These topics have been discussed on LW many times, and it seems evident that the mainstream academic literature is a complete mess, with potential gems of useful insight buried under mountains of nonsense work, and authoritative statements of expert opinion given without proper justification. Yet I see no attempt to ask a straightforward follow-up question: since these areas operate under the official bureaucratic system that’s supposed to be guaranteed to produce valid science, what exactly went wrong? And what implications does it have for other areas where we take the official output of this same bureaucratic system as ironclad evidence?
I made a comment that you may find relevant here; I would characterize nutrition/exercise/etc. as fields that are obviously important and which therefore attract many researchers/corporations/hobbyists/etc., having the effect of driving high-quality researchers out of the field on account of bad associations.
Another factor may be the absence of low-hanging fruit (which you reference in your top-level post); it could be that the diversity of humans is sufficiently great that it’s difficult to make general statements about what’s healthy/unhealthy.
I agree with what you said about mainstream fields being diluted, but offer an interesting corollary to that. Economic motives compel various gurus and nutritionists to make claims to the average joe, and the average joe, or even the educated joe, cannot sort through them. However, if one looks in more narrow fields, one can obtain more specific answers without so much trash. For example, powerlifting. This is not a huge market, nor one you can benefit financially from that much. If one is trying to sell something or get something published, he can’t just say “I pretty much agree with X”; he needs to somehow distinguish himself. But when that motive is eliminated you can get more consistency in recommendations and have a greater chance to actually hit upon what works.
While you might not be interested in powerlifting, reading in more niche areas can help filter out profit/status-seeking charlatans, and can allow one to see the similarities across disciplines. So while I’ve read about bodybuilding, powerlifting, and endurance sports, and their associated nutritional advice, I would never read a book about “being fit.”
As an aside, I recently had this horrible moment of realization. Much of the fitness advice given out is just so incredibly wrong, and I am able to realize that because I have a strong background in that subject. But I realized that 90% of the stuff I read about is in areas where I don’t have a great background. I could be accepting facts in other areas that are just as wrong as the nutritional claims I scoff at, and I would never learn of my error.
One example is the already mentioned neglect of technology-related existential risks—as well as other non-existential but still scary threats that might be opened up by upcoming advances in technology—and the tendency to dismiss people who ask such questions as crackpots.
That does seem to be a useful heuristic. DOOM mongers are usually selling something. They typically make exaggerated and biased claims. The SIAI and FHI do not seem to be significant exceptions to this—though their attempts to be scientific and rational certainly help.
These types of organisation form naturally from those with the largest p(DOOM) estimates. That is not necessarily the best way to obtain an unbiased estimate. If you run into organisations that are trying to convince you that the end of the world is nigh—and that you should donate to help them save it—you should at least be aware that this pattern is an ancient one with a dubious pedigree.
Another is the academic and medical establishment’s official party line against cryonics, which is completely impervious to any argument. (I have no interest in cryonics myself, but the dogmatic character of the official line is clear, as well as its lack of solid foundation.)
I am inclined to ask for references. As far as I understand it, there is a real science, cryogenics—which goes out of its way to distance itself from its more questionable cousin (cryonics), which has a confusingly similar name. Much as psychology tries to distinguish itself from psychiatry. Is there much more than that going on here?
From what I understand, the professional learned society of cryobiologists has an official policy that bans its members from any engagement with cryonics under pain of expulsion (a penalty that would presumably have disastrous career implications). Therefore, cryobiologists are officially mandated to uphold this party line and condemn cryonics, if they are to speak on the subject at all. From what I’ve seen, cryonics people have repeatedly challenged this position with reasonable arguments, but they haven’t received anything like satisfactory rebuttals that would justify the official position. (See more details in this post, whose author has spent considerable effort searching for such a rebuttal.)
Now, for all I know, it may well be that the claims of cryonicists are complete bunk after all. The important point is that here we see a clear and unambiguous instance of the official academic mainstream upholding an official line that is impervious to rational argument, and attempts to challenge this official line elicit sneering and stonewalling rather than any valid response. One of my claims in this discussion is that this is far from being the only such example (although the official positions and the condemnations of dissenters are rarely spelled out so explicitly), and LW people familiar with this example should take it as a significant piece of evidence against trusting the academic mainstream consensus in general.
From what I understand, the professional learned society of cryobiologists has an official policy that bans its members from any engagement with cryonics under pain of expulsion (a penalty that would presumably have disastrous career implications). Therefore, cryobiologists are officially mandated to uphold this party line and condemn cryonics, if they are to speak on the subject at all.
Upon a two-thirds vote of the Governors in office, the Board of Governors may refuse membership to applicants, or suspend or expel members (including both individual and institutional members), whose conduct is deemed detrimental to the Society, including applicants or members engaged in or who promote any practice or application which the Board of Governors deems incompatible with the ethical and scientific standards of the Society or as misrepresenting the science of cryobiology, including any practice or application of freezing deceased persons in anticipation of their reanimation. [Sec. 2.04 of the bylaws of the Society for Cryobiology].
It says they are not allowed to perform or promote the freezing of “deceased persons,” citing concerns over ethical and scientific standards and the Society’s own reputation. They probably want to avoid the Society and its members being associated with cryonics scandals and lawsuits.
As I said, I don’t have a dog in this particular fight, and for all I know, the cryobiologists’ rejection of cryonics might in fact be justified, for both scientific reasons and pragmatic political considerations. However, the important point is that if you ask, in a polite, reasonable, and upfront manner, for a scientific assessment of cryonics and what exactly are the problems with it, it is not possible to get a full, honest, and scientifically sound answer, as demonstrated by that article to which I linked above. Contrast this with what happens if you ask, say, physicists what is wrong with some crackpot theory of physics—they will spell out a detailed argument showing what exactly is wrong, and they will be able to answer any further questions you might have and fully clarify any confusion, as long as you’re not being impervious to argument.
Regardless of any particular concern about cryonics, the conclusion to draw from this is that a strong mainstream academic consensus sometimes rests on a rock-solid foundation that can be readily examined if you just invest some effort, but sometimes this is not the case, at the very least because for some questions there is no way to even get a clear and detailed statement of what exactly this foundation is supposed to be. From this, it is reasonable to conclude that mainstream academic consensus should not be taken as conclusive evidence for anything—and in turn, contrarian opinions should not be automatically discarded just because mainstream academics reject them—unless you have some reliable criteria for evaluating how solid its foundation is in a particular area. The case of cryonics is relevant for my argument only insofar as this is a question where lots of LW people have run into a strong mainstream consensus for which it’s impossible to get a solid justification, thus providing one concrete example that shouldn’t be too controversial here.
However, the important point is that if you ask, in a polite, reasonable, and upfront manner, for a scientific assessment of cryonics and what exactly are the problems with it, it is not possible to get a full, honest, and scientifically sound answer, as demonstrated by that article to which I linked above. Contrast this with what happens if you ask, say, physicists what is wrong with some crackpot theory of physics—they will spell out a detailed argument showing what exactly is wrong, and they will be able to answer any further questions you might have and fully clarify any confusion, as long as you’re not being impervious to argument.
I think most parties involved agree that cryonic revival is a bit of a long shot. It is hard to say exactly how much of a long shot it is—since that depends on speculative far-future things like whether an advanced civilization will be sufficiently interested in us to revive us. Scientists can’t say too much about that—except that there are quite a few unknown unknowns—and so one should have wide confidence intervals.
This comment seems to be influenced by an association fallacy.
The fact that someone has suffered doesn’t imply that they are rational or irrational.
If you acknowledge evidence that someone is being irrational, it doesn’t mean you have to deny that they have any positive qualities or be unsympathetic about their problems.
I made no claim that they were rational—or that they had no nice qualities.
I was trying to point out that “I think that other people should not spend time on X because it is an unimportant topic to me” fails to understand that to these people, X is not an unimportant topic.
Some examples would agitate a thundering herd of heretic hunters and witch finders who would vote down every post that Vladimir has ever made or will ever make.
This doesn’t seem overwhelmingly likely to be false. It’s not a very nice thing to say, but why should that matter?
Why so many downvotes? In any case, since I’m Charlie Sheen I don’t care if I’m downvoted, since I’m WINNING no matter what they say (wow, memetic brainfreeze).
It does to me. Less Wrong isn’t so popular that I’d expect a herd of people to bother downvoting each of Vladimir_M’s dozens of posts, then wait around for him to make more posts to downvote, just because of a few examples. The fact that Vladimir_M gave two or three examples but still has non-negative scores for his recent comments is more evidence that sam0345 was wrong.
It’s not a very nice thing to say, but why should that matter?
Quite a few LW users see niceness as a useful norm and may have voted accordingly. At any rate, I don’t think it was a lack of niceness that provoked people to downvote that comment; I’d guess it was because they read it as an unhelpful overstatement.
I believe that a lot of what’s wrong with the world comes from taking governments too seriously. The historical argument for atheism—the damage done by religion—applies at least as strongly to governments.
This doesn’t mean I think it necessarily makes sense for individuals to conspicuously ignore a government which is dangerous to them. To put it mildly, there are group effects.
Taking governments too seriously in what sense? Adopting values implicit in government rhetoric? Following laws? Give some examples if you’d like.
Also, are you considering the counterfactual here? Without religion there’s atheism. What happens when people don’t take governments too seriously? It’s actually unclear to me that religion does more harm than good; I would guess that the harm apparently done by religion is largely due to general human nature, and that there are upsides of organic community, so that on balance it’s a wash.
I noticed that, while there used to be religious wars, for the most part these days, what gets people to die for no good reason is nationalism.
I’m not sure what the best attitude is—I don’t think we can dispense with government these days, but on the other hand, I don’t think law-abidingness and patriotism should be put very high on the list of virtues.
I noticed that, while there used to be religious wars, for the most part these days, what gets people to die for no good reason is nationalism.
Don’t both religion and nationalism fall under the broader umbrella of tribalism? It’s plausible to me that without either one there would be some other sort of tribalism with adverse effects on global welfare. People might not die as a result, but I don’t think there’s reason to believe that the aggregate negative effect would be smaller.
Now, if what replaced them was some sort of ideology of the type “equal consideration for all” that filled the vacuum left by politics/religion, then that would be different. I have little sense of how likely this would be.
I’m not sure what the best attitude is—I don’t think we can dispense with government these days, but on the other hand, I don’t think law-abidingness and patriotism should be put very high on the list of virtues.
What sort of law-abidingness do you have in mind here? Obeying military drafts?
It’s actually unclear to me that religion does more harm than good
For quick and dirty empirical evidence, look at the latest European poll.
Do countries at the top of the table with the least belief in God, spirit or life force behave more rationally?
As someone who is into both the skeptics movement and the atheist movement, I’m not sure what skeptics “wouldn’t dare mutter” about. It seems to me that skeptics and atheists just have an interest in those things and want to stop the harm caused by them.
Also, I must be ignorant about all these other horrible delusions you are talking about.
Further, you must be talking about instrumental rationality, because I’m not sure how this is evidence against epistemic rationality.
I may have been too harsh on the skeptics, some of whom occasionally do attack nonsense in a way that riles up not just crackpots, but also some highly respectable and even academically accredited ideologues and charlatans. However, the problem I see is the main thrust of the movement, which assumes that dangerous nonsense that should be attacked and debunked is practically always purveyed and followed by people outside the official, respectable, accredited mainstream, with the implicit assumption that the latter is maybe imperfect, but still without any deep and horrendous flaws, and in matters where it produces strong consensus, we have nothing much to worry about.
This is where my Titanic analogy comes in. When I read about skeptics debunking people like, say, Uri Geller or Erich von Daeniken, clearly I have no objection to the substance of their work—on the contrary. However, if such people are left unchecked, it’s not like they will tomorrow be awarded high places in government and academia, and be given the power to propagandize their views with high official authority, both in classrooms and in mass media that would cite them as authorities, to write laws and regulations based on their delusions, to promote (and aggressively impose) their views through international institutions and foreign policy, etc., etc., with all the disastrous consequences that may follow from that. Therefore, shouldn’t a rational person be more concerned with the possible delusions of people who do have such power and authority? They are the ones presently in charge of steering the ship, after all, and it’s not like there aren’t any icebergs around.
Of course, if you believe that the official institutions that produce academic consensus and respectable mainstream public opinion are generally OK and not causing any ongoing (or potential future) disasters, clearly these concerns are baseless. But are you really so sure that this optimism is based on a realistic appraisal of the situation?
As someone who is into both the skeptics movement and the atheist movement, I’m not sure what skeptics “wouldn’t dare mutter” about.
(...)
Also, I must be ignorant about all these other horrible delusions you are talking about.
As I mentioned elsewhere in this thread, I recommend this post by Quirinus_Quirrell. The list there is by no means comprehensive, but it should give you an idea of what people are talking about.
Edit: Two more good articles to read are this one by Paul Graham, and this post by Vladimir_M.
In most cases, when I see that someone is a particularly passionate and dedicated atheist (in the sense of the “New Atheists” etc.), lacking other information, I take it as strong evidence against their rationality. For someone living in the contemporary Western world who wants to fight against widespread and dangerous irrational beliefs, focusing on traditional religion indicates extreme bias and total blindness towards various delusions that are nowadays infinitely more pernicious and malignant than anything coming out of any traditional religion. (The same goes for those “skeptics” who relentlessly campaign against low-status folk superstition like UFOs or crystal healing, but would never dare mutter anything against all sorts of horrendous delusions that enjoy high status and academic approval.)
I like to compare such people with someone on board the Titanic who loses sleep over petty folk superstitions of the passengers, while at the same time being blissfully happy with the captain’s opinions about navigation. (And, to extend the analogy, often even attacking those who question the captain’s competence as dangerous crackpots.)
Why? Just because they spend their time in a perhaps less than optimal manner (compared to existential risks) doesn’t automatically mean that passionate atheists and skeptics are somehow highly irrational people, does it? I suspect a lot of them would be potential lesswrong readers, it’s just that they haven’t yet encountered these ideas yet.
Most people first had to become “regular” rationalists before they became lesswrongers. If I had stumbled upon this website a looong time ago when I was still something along the lines of a New Ager, I strongly suspect the Bayesian rationality meme simply would not have fallen on fertile ground. Cleaning out the superstitious garbage from your mind seems to be quite an important step for many people. It certainly was for me.
I do not agree with your viewpoint that these people are entirely wasting their time. Not every man, woman and child can participate directly or indirectly in the development of friendly AGI—and I’ve seen much worse use of time and effort than conversion attempts by the “New Atheist” movement. After all, something we may want to keep in mind is that the success and failure of many futuristic things we discuss here on lesswrong may somewhat depend on public opinion and perception (think stem cells) - and I’d much rather face at least somewhat rational atheists than a bunch of deluded theists and esoterics. The difference between 10 and 20% atheists may be all the difference it takes, to achieve more positive outcomes in certain scenarios.
Furthermore, if lesswrongian though has any kind of easily identifiable target group that would be worth “advertising” to, you’d probably find it among skeptics and atheists.
I didn’t say it was conclusive evidence, only that it is strong evidence.
Moreover, the present neglect of technology-related existential (and other) risk is only one example where the respectable opinion is nowadays remote from reality. There are many other questions where the prevailing views of academic experts, intellectuals, and other high-status shapers of public opinion, are, in my opinion, completely delusional. Some of these are just theoretical questions without much practical bearing on anything, but others have real ugly consequences on a large scale, up to and including mass death and destruction, or seriously threaten such consequences in the future. Many of them also make the world more ugly and dysfunctional, and life more burdensome and joyless, in countless little ways; others are presented as enlightened wisdom on how to live your life but are in fact a recipe for disaster for most people who might believe and try to apply them.
In this situation, if someone focuses on traditional religion as a supposedly especially prominent source of false beliefs and irrationality, it is likely that this is due to ideological reasons, which in turn means that they also swallow much of the above mentioned respectable delusions. Again, there are exceptions, which is why I wrote “lacking other information.” But this is really true in most cases.
Also, another devilish intellectual hack that boosts many modern respectable delusions is the very notion of separating “religious” beliefs and opinions from others. Many modern ideological beliefs that are no less metaphysical and irrational than anything found in traditional religions can nevertheless be advertised as rational and objective—and in turn backed and enforced by governments and other powerful institutions without violating the “separation of church and state” -- just because they don’t fall under the standard definition of “religion.” In my experience, and again with a few honorable exceptions, those who advocate against traditional religion are often at the same time entirely OK with such enforcement of state-backed ideology, even though there is no rational reason to see it as essentially different from the old-fashion establishment of religion.
Name three?
edit: I find that he has already named three, and two heuristics for determining whether an academic field is full of bunk or not, here. I commend him on this article. While I remain unconvinced on the general strategy outlined, I now understand the sort of field he is discussing and find that, on the specifics, I tentatively agree.
I strongly recommend reading Robin Hanson’s answer here.
Same challenge.
edit: I would still like to hear these.
Wow answering that challenge might seriously kill some minds.
I suggest you two PM it out.
Well, as I pointed out in my other comments, unless I answered your challenges with essays of enormous length, my answer would consist of multiple assertions without supporting evidence that sound outlandish on the face of it. Remember that we are talking about delusions that are presently shared by the experts and/or respectable high-status people.
Note that you should accept my point even if we completely disagree on what these high-status delusions are, as long as we agree that there are some, whatever they might be. Try to focus on the main point in the abstract: if delusion X is low-status and rejected by experts and high-status people (even if it might be fairly widespread among the common folk), while delusion Y is instead accepted by them, so much that by asserting non-Y you risk coming off as a crackpot, should we be more worried about X or Y, in terms of both the idealistic pursuit of truth and the practical problems that follow?
Y, of course. Perhaps I should have started out by saying that while I agree that what you say is possible, I don’t know if it describes the real world. Your assertion was that there are many high status delusions, but without evidence of that, all I can say is that I agree that supposed experts are not guaranteed to be correct on every point, and that it is extremely possible that they will reinforce delusions within their community.
OOC - some examples would be nice :)
I think a lot of the people that fall into this camp (at least those that I know of) are people that have just recently deconverted—they’ve just been through a major life-change involving religion and therefore are understandably entranced with the whole process as it is particularly meaningful to them.
Alternatively, they are reacting against some heavy prejudice that they have had to suffer through—or have some loved ones that are particularly “afflicted” and want to see something done to prevent it happening to others.
Sure, there are other big, important things out there… but one man’s meat is another’s poison, and all that.
I think it’s easy enough to say that there are bigger problems out there… when we are looking at it from the perspective of having been atheist for a long time. but some people have just had their world cave in—everything has been upturned. They no longer have that huge big safety net underneath them that tells them that everything is going to be alright in the afterlife. Maybe they’ve just discovered that they’ve been wasting one seventh of their life in church when they could have been out there exploring his beautiful world that we live in or spending quality time with their kids… it may seem like nothing important to you, but it’s a Big Thing to some people.
PS—I am also inclined to agree with you that there are better things the time could be spent on… but that’s “better from my perspective” and it’s not mine that counts.
Well, whenever I open this topic, giving concrete examples is problematic, since these are by definition respectable and high-status delusions, so it’s difficult or impossible to contradict them without sounding like a crackpot or extremist.
There are however a few topics where prominent LW participants have run into such instances of respectable opinion being dogmatic and immune to rational argument. On example is the already mentioned neglect of technology-related existential risks—as well as other non-existential but still scary threats that might be opened due to the upcoming advances in technology—and the tendency to dismiss people who ask such questions as crackpots. Another is the academic and medical establishment’s official party line against cryonics, which is completely impervious to any argument. (I have no interest in cryonics myself, but the dogmatic character of the official line is clear, as well as its lack of solid foundation.)
This, however, is just the tip of the iceberg. Unfortunately, listing other examples typically means opening ideologically charged topics that are probably best left alone. One example that shouldn’t be too controversial is economics. We have people in power to regulate and manage things, with enough power and influence to wreak havoc if they don’t know what they’re doing, whose supposed expertise however appears, on independent examination, to consist mostly of cargo-cult science and ideological delusions, even though they bear the most prestigious official titles and accreditations. Just this particular observation should be enough to justify my Titanic allegory.
Why ever not?
On the other hand elsewhere you write
which suggests that you think that the things that you’re avoiding writing about are very important. If they’re so important then why not pay the price of being considered a crackpot/extremist by some in order to fight against the delusional views? Is the key issue self-preservation of the type that you mentioned in response to Komponisto?
Or is the point that you think that there’s not much hope for changing people’s views on the questions that you have in mind so that it’s futile to try?
Well, there are several reasons why I’m not incessantly shouting all my contrarian views from the rooftops.
For start, yes, obviously I am concerned with the possible reputational consequences. But even ignoring that, the problem is that arguing for contrarian views may well have the effect of making them even more disreputable and strengthening the mainstream consensus, if it’s done in a way that signals low status, eccentricity, immorality, etc., or otherwise enables the mainstream advocates to score a rhetorical victory in the ensuing debate (regardless of the substance of the arguments). Thus, even judging purely by how much you’re likely to move people’s opinions closer or further from the truth, you should avoid arguing for contrarian views unless the situation seems especially favorable, in the sense that you’ll be able to present your case competently and in front of a suitable audience.
Moreover, there is always the problem of whether you can trust your own contrarian opinions. After all, even if you take the least favorable view of the respectable opinion and the academic mainstream, it is still the case that most contrarians are deluded in even crazier ways. So how do you know that you haven’t in fact become a crackpot yourself? This is why rather than making a piecemeal catalog of delusional mainstream views, I would prefer to have a more general framework for estimating how reliable the mainstream opinion is likely to be on a particular subject given various factors and circumstances, and what general social, economic, political, and other mechanisms have practical influence in this regard. Effort spent on obtaining such insight is, in my opinion, far more useful than attacking seemingly wrong mainstream opinions one by one.
These latter questions should, in my opinion, be very high (if not on the top) of the list of priorities of people who are concerned with overcoming bias and increasing their rationality and the accuracy of their beliefs, and one of my major disappointments with LW is that attempts to open discussion about these matters invariably fall flat. (This despite the fact that such discussions could be productive even without opening any especially dangerous and charged topics, and despite the fact that on LW one regularly hears frustrated accounts of the mainstream being impervious to argument on topics such as existential risk or cryonics. I find it especially puzzling that smart people who are concerned about the latter have no interest in investigating the underlying more general and systematic problems.)
Doesn’t this push in the direction of holding contrarian views being useless except as a personal hobby? If so, why argue against mainstream delusional views at all (even as a collection without specifying what they are)? Is the point of your comment that you think it’s possible to make progress by highlighting broad phenomena about the reliability of mainstream views so that people can work out the implications on their own without there being a need for explicit public discussion?
A natural method of avoiding becoming a crackpot is to reveal one's views for possible critique in a gradual and carefully argued fashion, adjusting them as people point out weaknesses. Of course, it might not be a good idea to reveal one's views for other reasons (self-preservation; the opportunity cost of time), but I don't think the danger of becoming a crackpot is a good one.
I’m not sure what you have in mind here. Your post titled Some Heuristics for Evaluating the Soundness of the Academic Mainstream in Unfamiliar Fields was highly upvoted and I myself would be happy to read more along similar lines. Are there examples that you’d point to of attempts to open discussion about these matters falling flat?
Basically, I believe that exploring the general questions about how mainstream views are generated in practice, and what the implications are for their reliability, is by far the most fruitful direction for people interested in increasing the accuracy of their beliefs across the board. Of course, if you have a particular interest in some question, you have to grapple with the concrete issues involved, and a general exploration must in any case be based on concrete case studies. But attacking particular mainstream views head-on may well be counterproductive in every sense, as I noted above.
That’s assuming you have discussion partners who are knowledgeable, open-minded, and patient enough. However, such people are the most difficult to find exactly in those areas where you’re faced with the Scylla of a deeply flawed mainstream and the Charybdis of even worse crackpot contrarians.
(Please also see my reply to Nick Tarleton, who asked a similar question as the rest of your comment.)
This is fair; you’ve made your position clear, thanks.
Agree in general. How about Less Wrong in particular?
Well, LW is great for discussing a concrete problem if you manage to elicit some interest in it, both because of people’s high general intellectual skills and because of low propensity to emotionally driven reactions that are apt to derail the discussion, even in fairly charged topics (well, except for gender-related ones, I guess). So, yes, LW is very good for this sort of reality-checking if you manage to find people interested in your topic.
What’s an example? (I mostly ask so as to have some more specific idea of what topics you’re referring to.)
You can take any topic where it's impossible to make sense of the existing academic literature (and other influential high-status sources), or where the respectable mainstream consensus seems to clash with reality. When discussions of such topics are opened on LW, the logical next step would often be to ask about the more general underlying problems that give rise to these situations, instead of just focusing on the arguments about particular problems in isolation. (And even without a concrete motivation, such questions should follow directly from LW's mission statement.) Yet I see few, if any, attempts to ask such general questions on LW, and my occasional attempts to open discussion along these lines, even when highly upvoted, don't elicit much in the way of interesting arguments and insight.
As an illustration, take a relatively innocuous mainstream topic like the health aspects of lifestyle: nutrition, exercise, etc. These topics have been discussed on LW many times, and it seems evident that the mainstream academic literature is a complete mess, with potential gems of useful insight buried under mountains of nonsense work, and authoritative statements of expert opinion given without proper justification. Yet I see no attempt to ask the straightforward follow-up question: since these areas operate under the official bureaucratic system that is supposed to guarantee valid science, what exactly went wrong? And what implications does this have for other areas where we take the official output of this same bureaucratic system as ironclad evidence?
Of course, when it comes to topics that are more dangerous and ideologically charged, the underlying problems are likely to be different and more severe. One can reasonably argue that such topics are best avoided on a forum like LW, both because they’re likely to stir up bad blood and because of the potential bad signaling and reputational consequences for the forum as an institution. But even if we take the most restrictive attitude towards such topics, there are still many others that can be used as case studies for gaining insight about the systematic underlying problems.
Your own points have struck me as on the mark; but I haven’t had much to add.
There are some interesting general comments that I could make based on my experience in the mathematical community in particular. I guess here I have some tendency toward self-preservation myself; I don't want to offend acquaintances who might be cast in a negative light by my analysis. (I'd be happy to share my views privately if you're interested, though.) My attitude here is that there's little upside to making my remarks public. The behaviors that I perceive to be dysfunctional are so deeply entrenched that whatever I said would have little expected value.
The main upside would be helping others attain intellectual enlightenment, but although I myself greatly enjoy the satisfaction of being intellectually enlightened, I'm not sure that intellectual enlightenment is very valuable from a global perspective. Being right is of little use without being influential. In general, the percentage of people who are right (or interested in being right) on a given topic where a contrarian position is correct is so small that the critical mass it would take to change things isn't there, nor would an incremental change in this percentage make a difference.
The reason the above point has so much weight in my mind is that, despite my very high interest in learning about a variety of things and in forming accurate views on a variety of subjects, I haven't achieved very much. It's not clear whether having accurate views of the world has been more helpful or harmful to me in achieving my goals. The jury is still very much out and things may change, but the very fact that it's possible for me to have this attitude is a strong indication that knowledge and accurate views on a variety of things can be useless on their own.
Regarding the nutrition/exercise example:
I made a comment that you may find relevant here; I would characterize nutrition/exercise/etc. as fields that are obviously important and which therefore attract many researchers/corporations/hobbyists/etc., with the effect of driving high-quality researchers out of the field on account of the bad associations.
Another factor may be the absence of low-hanging fruit (which you reference in your top-level post); it could be that the diversity of humans is great enough that it's difficult to make general statements about what's healthy/unhealthy.
I agree with what you said about mainstream fields being diluted, but offer an interesting corollary. Economic motives compel various gurus and nutritionists to make claims to the average Joe, and the average Joe, or even the educated Joe, cannot sort through them. However, if one looks at narrower fields, one can obtain more specific answers without so much trash. Take powerlifting, for example. This is not a huge market, nor one you can benefit from financially that much. If one is trying to sell something or get something published, one can't just say "I pretty much agree with X"; one needs to somehow distinguish oneself. But when that motive is eliminated, you get more consistency in recommendations and have a greater chance of actually hitting upon what works.
While you might not be interested in powerlifting, reading in more niche areas can help filter out profit- and status-seeking charlatans, and can allow one to see the similarities across disciplines. So while I've read about bodybuilding, powerlifting, and endurance sports, and their associated nutritional advice, I would never read a book about "being fit."
As an aside, I recently had a horrible moment of realization. Much of the fitness advice given out is just incredibly wrong, and I am able to recognize that because I have a strong background in the subject. But 90% of the stuff I read about is in areas where I don't have a great background. I could be accepting claims in other areas that are just as wrong as the nutritional "facts" I scoff at, and I would never learn of my error.
That does seem to be a useful heuristic. DOOM mongers are usually selling something. They typically make exaggerated and biased claims. The SIAI and FHI do not seem to be significant exceptions to this—though their attempts to be scientific and rational certainly help.
These types of organisation form naturally from those with the largest p(DOOM) estimates. That is not necessarily the best way to obtain an unbiased estimate. If you run into organisations that are trying to convince you that the end of the world is nigh—and that you should donate to help them save it—you should at least be aware that this pattern is an ancient one with a dubious pedigree.
I am inclined to ask for references. As far as I understand it, there is a real science, cryogenics, which goes out of its way to distance itself from its more questionable cousin (cryonics), which has a confusingly similar name. Much as psychology tries to distinguish itself from psychiatry. Is there much more than that going on here?
From what I understand, the professional learned society of cryobiologists has an official policy that bans its members from any engagement with cryonics on pain of expulsion (a penalty that would presumably have disastrous career implications). Cryobiologists are therefore officially mandated to uphold this party line and condemn cryonics, if they are to speak on the subject at all. From what I've seen, cryonics people have repeatedly challenged this position with reasonable arguments, but they haven't received anything like satisfactory rebuttals that would justify the official position. (See more details in this post, whose author has spent considerable effort searching for such a rebuttal.)
Now, for all I know, it may well be that the claims of cryonicists are complete bunk after all. The important point is that here we see a clear and unambiguous instance of the official academic mainstream upholding an official line that is impervious to rational argument, and attempts to challenge this official line elicit sneering and stonewalling rather than any valid response. One of my claims in this discussion is that this is far from being the only such example (although the official positions and the condemnations of dissenters are rarely spelled out so explicitly), and LW people familiar with this example should take it as a significant piece of evidence against trusting the academic mainstream consensus in general.
This seems to be the relevant bit:
It says they are not allowed to perform or promote freezing of "deceased persons", citing concerns over ethical and scientific standards and over the society's own reputation. They probably want to avoid having themselves and their members associated with cryonics scandals and lawsuits.
As I said, I don't have a dog in this particular fight, and for all I know, the cryobiologists' rejection of cryonics might in fact be justified, for reasons of both science and pragmatic political considerations. However, the important point is that if you ask, in a polite, reasonable, and upfront manner, for a scientific assessment of cryonics and what exactly the problems with it are, it is not possible to get a full, honest, and scientifically sound answer, as demonstrated by the article I linked above. Contrast this with what happens if you ask, say, physicists what is wrong with some crackpot theory of physics—they will spell out a detailed argument showing what exactly is wrong, and they will be able to answer any further questions you might have and fully clarify any confusion, as long as you're not being impervious to argument.
Regardless of any particular concern about cryonics, the conclusion to draw from this is that a strong mainstream academic consensus sometimes rests on a rock-solid foundation that can be readily examined if you just invest some effort, but sometimes this is not the case, at the very least because for some questions there is no way to even get a clear and detailed statement of what exactly this foundation is supposed to be. From this, it is reasonable to conclude that mainstream academic consensus should not be taken as conclusive evidence for anything (and, in turn, contrarian opinions should not be automatically discarded just because mainstream academics reject them) unless you have some reliable criteria for evaluating how solid its foundation is in a particular area. The case of cryonics is relevant for my argument only insofar as this is a question where lots of LW people have run into a strong mainstream consensus for which it's impossible to get a solid justification, thus providing one concrete example that shouldn't be too controversial here.
I think most parties involved agree that cryonic revival is a bit of a long shot. It is hard to say exactly how much of a long shot, since that depends on speculative far-future things, like whether an advanced civilization will be interested enough in us to revive us. Scientists can't say too much about that—except that there are quite a few unknown unknowns—and so one should have wide confidence intervals.
Definitely, and thank you for sharing your list :)
This comment seems to be influenced by an association fallacy.
The fact that someone has suffered doesn’t imply that they are rational or irrational.
If you acknowledge evidence that someone is being irrational it doesn’t mean you have to deny they have any positive qualities or be unsympathetic about their problems.
I made no claim that they were rational—or that they had no nice qualities.
I was trying to point out that the attitude "other people should not spend time on X because X is an unimportant topic to me" fails to understand that, to these people, X is not an unimportant topic.
“some examples would be nice”
Some examples would agitate a thundering herd of heretic hunters and witch finders who would vote down every post that Vladimir has ever made or will ever make.
This dosen’t seem overwhelmingly likley false. Its not a very nice things to say, but why should that matter?
Why so many downvotes? In any case, since I'm Charlie Sheen, I don't care if I'm downvoted: I'm WINNING no matter what they say (wow, memetic brain freeze).
It does to me. Less Wrong isn’t so popular that I’d expect a herd of people to bother downvoting each of Vladimir_M’s dozens of posts, then wait around for him to make more posts to downvote, just because of a few examples. The fact that Vladimir_M gave two or three examples but still has non-negative scores for his recent comments is more evidence that sam0345 was wrong.
Quite a few LW users see niceness as a useful norm and may have voted accordingly. At any rate, I don’t think it was a lack of niceness that provoked people to downvote that comment; I’d guess it was because they read it as an unhelpful overstatement.
I’m inclined to agree, though I suspect we’ve got different lists of what the real problems are.
If you have more to say about your list of what the real problems are, I'd be interested in reading it.
I believe that a lot of what’s wrong with the world comes from taking governments too seriously. The historical argument for atheism—the damage done by religion—applies at least as strongly to governments.
This doesn’t mean I think it necessarily makes sense for individuals to conspicuously ignore a government which is dangerous to them. To put it mildly, there are group effects.
Taking governments too seriously in what sense? Adopting values implicit in government rhetoric? Following laws? Give some examples if you'd like.
Also, are you considering the counterfactual here? Without religion there's atheism. What happens when people don't take governments too seriously? It's actually unclear to me that religion does more harm than good; I would guess that the harm apparently done by religion is largely due to general human nature, and that there are upsides to organic community, so that on balance it's a wash.
I noticed that, while there used to be religious wars, for the most part these days, what gets people to die for no good reason is nationalism.
I’m not sure what the best attitude is—I don’t think we can dispense with government these days, but on the other hand, I don’t think law-abidingness and patriotism should be put very high on the list of virtues.
Don’t both of religion and nationalism fall under the broader umbrella of tribalism? It’s plausible to me that without either one there would be some other sort of tribalism with adverse effects on global welfare. People might not die as a result but I don’t think that there’s reason to think that the aggregate negative effect would be smaller.
Now, if what replaced them were some sort of "equal consideration for all" ideology that filled the vacuum left by politics/religion, then that would be different. I have little sense of how likely this would be.
What sort of law-abidingness do you have in mind here? Obeying military drafts?
It’s actually unclear to me that religion does more harm than good
For quick-and-dirty empirical evidence, look at the latest European poll. Do the countries at the top of the table, with the least belief in God, spirit, or a life force, behave more rationally?
As someone who is into both the skeptics movement and the atheist movement, I'm not sure what skeptics "wouldn't dare mutter" about. It seems to me that skeptics and atheists simply have an interest in those things and want to stop the harm caused by them.
Also, I must be ignorant of all these other horrible delusions you are talking about.
Further, you must be talking about instrumental rationality, because I'm not sure how this is evidence against epistemic rationality.
I may have been too harsh on the skeptics, some of whom do occasionally attack nonsense in a way that riles up not just crackpots, but also some highly respectable and even academically accredited ideologues and charlatans. However, the problem I see is the main thrust of the movement, which assumes that dangerous nonsense worth attacking and debunking is practically always purveyed and followed by people outside the official, respectable, accredited mainstream, and that the latter is perhaps imperfect, but still free of any deep and horrendous flaws, so that in matters where it produces strong consensus, we have nothing much to worry about.
This is where my Titanic analogy comes in. When I read about skeptics debunking people like, say, Uri Geller or Erich von Daeniken, I clearly have no objection to the substance of their work—on the contrary. However, if such people are left unchecked, it's not as if they will tomorrow be awarded high places in government and academia, and be given the power to propagandize their views with high official authority, both in classrooms and in mass media that would cite them as authorities, to write laws and regulations based on their delusions, to promote (and aggressively impose) their views through international institutions and foreign policy, and so on, with all the disastrous consequences that may follow. Therefore, shouldn't a rational person be more concerned with the possible delusions of people who do have such power and authority? They are the ones presently in charge of steering the ship, after all, and it's not as if there aren't any icebergs around.
Of course, if you believe that the official institutions that produce academic consensus and respectable mainstream public opinion are generally OK and not causing any ongoing (or potential future) disasters, clearly these concerns are baseless. But are you really so sure that this optimism is based on a realistic appraisal of the situation?
As I mentioned elsewhere in this thread, I recommend this post by Quirinus_Quirrell. The list there is by no means comprehensive, but it should give you an idea of what people are talking about.
Edit: Two more good articles to read are this one by Paul Graham, and this post by Vladimir_M.