I occasionally ponder what LW’s objective place in the scheme of things might be. Will it ever matter as much as, say, the Vienna Circle? Or even just as much as the Futurians, who didn’t matter very much, but whose story should interest the NYC group? The Futurians were communists, but that was actually a common outlook for “rationalists” at the time, and the Futurians were definitely future-oriented.
Will LW just become a tiresome and insignificant rationalist cult? The more that people want to conduct missionary activity, “raising the sanity waterline” and so forth, the more this threatens to occur. Rationalist evangelism from LW might take two forms: boring and familiar, or eccentric and cultish. The boring and familiar form of rationalist evangelism could encompass opposition to religion, psych 101 lectures about cognitive bias, and tips on how optimism and clear thinking can lead to success in mating and moneymaking. An eccentric and cultish form of rationalist evangelism could be achieved by combining cryonics boosterism, Bayes-worship, insistence that the many-worlds interpretation is the only rational interpretation of quantum mechanics, and the supreme importance of finding the one true AI utility function.
It could be that the dominant intellectual and personality tendencies here—critical and analytical—will prevent serious evangelism of either type from ever getting underway. So let’s return for a moment to the example of the Vienna Circle, which was not much of a missionary outfit. It produced a philosophy, logical positivism, which was influential for a while, and it was a forum in which minds like Gödel and Wittgenstein (and others who are much less well known now, like Otto Neurath) got to trade views with other people who were smart and on their wavelength, though of course they had their differences.
Frankly, I think it is unlikely that LW will reach that level. The Vienna Circle was a talking shop, an intellectual salon, but it was perhaps one in ten thousand in terms of its lucidity and significance. Recorded and unrecorded history, and the Internet today, are full of occasions where people met, were intellectually simpatico, and managed to elaborate their worldview in a way they found satisfactory; and quite often, the participants in this process felt they were doing something more than just personally exciting—they thought they were finding the truth, getting it right where almost everyone else got it wrong.
I appreciate that quite a few LW contributors will be thinking: “I’m not in this out of a belief that we’re making history; it’s paying dividends for me and my peers, and that’s good enough.” But you can’t deny that there is a current here, a persistent thread of opinion, which believes that LW is extremely important or potentially so, that it is a unique source of insights, a workshop for genuine discovery, an oasis of truth in a blind or ignorant world, etc.
Some of that perception, I believe, is definitely illusory, and comes from autodidacts thinking they are polymaths: people who have developed a simple working framework for many fields or many questions of interest, and who then mistake that for genuine knowledge or expertise. When this illusion becomes a collective one, that is when you get true intellectual cultism, e.g. the followers of Lyndon LaRouche. LaRouche has an opinion on everything, and so to those who believe him on everything, he is the greatest genius of the age.
Then, there are some intellectual tendencies here which, if not entirely unique to LW, seem to be expressed with greater strength, diversity, and elaboration than elsewhere. I’m especially thinking of all the strange new views, expressed almost daily, about identity, morality, reality, arising from extreme multiverse thinking, computational platonism, the expectation of uploads… That is an area where I think LW would unquestionably be of interest to a historian of technological subcultural belief. And I think it’s very possible that some form of these ideas will give rise to mass belief systems later in this century—people who don’t worry about death because they believe in quantum immortality, popular ethical movements based on some of the more extreme or bizarre conclusions being deduced from radical utilitarianism, Singularity debates becoming an element of political life. I’m not saying LW would be the source of all this, just that it might be a bellwether of an emerging zeitgeist in which the ambient technical and cultural environment naturally gives rise to such thinking.
But is there anything happening here which will contribute to intellectual progress? That’s my main question right now. I see two ways that the answer might be yes. First, the ideas produced here might actually be intellectual progress; second, this might be a formative early experience for someone who went on to make genuine contributions. I think it’s likely that the second option will be true of someone—that at least one, and maybe several people, who are contributing to this site or just reading it, will, years from now, be making discoveries, in psychology or in some field that doesn’t yet exist, and it will be because this site warped their sensibility (or straightened it). But for now, my question is the first one: is there any intellectual progress directly occurring here, of a sort that would show up in a later history of ideas? Or is this all fundamentally, at best, just a learning experience for the participants, of purely private and local significance?
LW is nearly perfect, but it does lack self-criticism. I love self-criticism, and I find too much agreement boring. One of the reasons there is so much agreement here is not that there is nothing wrong, but that people who strongly disagree either don’t bother or are deterred by the reputation system. How do I know that? The more I read, the more I learn that a lot of the basic principles here are not as well-grounded as the commitment of the community would suggest. Recently I wrote to various experts in an effort to approach some kind of ‘peer review’ of LW. I got replies from people as diverse as Douglas Hofstadter, Greg Egan, Ben Goertzel, David Pearce, and various economists and other experts and influencers. The overall opinion so far is not much in favor of this community. Regarding the reputation system: people told me that it is one of the reasons they lurk rather than voice their opinions, but you could just read the infamous RationalWiki entry to get an idea of the general perception (although it has improved since my comment here, which they pasted into the talk page). I tried a few times to question the reputation system here myself, and to ask whether there are posts or studies showing that such systems subdue trolling without paying a price in truth and honesty, and that they do not cause unjustified conformity. Sadly, the response is often downvotes mixed with angry replies. Another problem is the obvious arrogance here, which is getting more distinct all the time. There is an LW-versus-the-rest-of-the-world attitude: there is LW, and then there are the irrational, ordinary people. That’s just sad, and I’m personally appalled by it.
Here is how some people described LW when I asked them about it:
...a lot of impressive-sounding jargon and slogans, and not everything they say is false and foolish, but in my view they’ve just sprinkled enough mathematics and logic over their fantasies to give them a veneer of respectability.
or
...they are naïve as far as the nature of human intelligence goes. I think they are mostly very bright and starry-eyed adults who never quite grew out of their science-fiction addiction as adolescents. None of them seems to have a realistic picture about the nature of thinking...
Even though I am basically the only person here who is often openly derogatory about this community, people seem to perceive it as too much already. I am apparently just talking about the same old problems over and over. Yet I’ve only been posting here since August 2010, and the problems have not been fixed. There are problems like the increasing and unjustified arrogance, the lack of criticism (let alone peer review), and a general public relations problem (Scientology also gets donations ;-). But those problems don’t matter. What is wrong, and what will probably never change, is that mere ideas are sold as ‘laws’ which are taken seriously to a dangerous degree by some individuals here. This place is basically breeding the first group of rationalists committed to doing everything in the name of expected utility. I think that is not only incredibly scary but also causes distress in people who are susceptible to such thinking.
… this might be a formative early experience for someone who went on to make genuine contributions.
LW is certainly of great value and importance, and I have loved reading a lot of what has been written so far. I would never suggest that LW is junk, but as long as it has even the slightest problem with someone coming here and proclaiming that you are all wrong, something is indeed wrong.
Is there as much of a problem with the karma system as you make it out to be? I’ve posted comments critical of cryonics, comments critical of the idea of hard takeoff being likely, comments critical of Eliezer’s writing style, and comments critical of the general LW understanding of the history of science. Almost every such comment has been voted up (and I can point to individual comments in all those categories that have been voted up).
I suspect that the quality threshold for a critical comment to be voted up is higher than for a non-critical one, and that the threshold below which a comment is likely to be voted down is similarly stricter for critical comments. But that’s a common problem, and in any event, high-quality comments aren’t often voted down. So I fail to see how anyone would be substantially discouraged from posting critical comments unless they just weren’t very familiar with the system here.
Yeah, this is my experience. I’ve posted lots of comments and even whole posts critical of Eliezer on this point or that point and have been upvoted heavily because I made my point and defended it well.
So I’m not sure the karma system makes it so you can’t voice contrarian opinions. The karma system seems to enforce the idea that you defend what you say competently.
Case in point: Mitchell’s heavily upvoted comment to which we are now responding.
It seems to me that the karma system needn’t foster any actual intolerance for dissent among voters for it to have a chilling effect on dissenting newcomers. If a skeptical newcomer encounters the site, reads a few dozen posts, and notices that posts concordant with community norms tend to get upvoted, while dissonant ones tend to get downvoted, then from that observer’s perspective the evidence indicates that voicing their skepticism would be taken poorly—even if in actuality the voting effects are caused by high-visibility concordant posts belonging to bright and well-spoken community members and high-visibility dissonant posts belonging to trolls or random crackpots (who in turn have incentives to ignore those same chilling effects).
Without getting rid of the karma system entirely, one possible defense against this sort of effect might be to encourage a community norm of devil’s advocacy. I see some possible coordination problems with that, though.
If the community norms are ones we don’t endorse, then sure, let’s overthrow those norms and replace them with norms we do endorse, in a targeted way. Which norms are we talking about, and what ought we replace them with?
Conversely, if we’re talking about all norms… that is, if we’re suggesting either that we endorse no norms at all, or that we somehow endorse a norm while at the same time avoiding discouraging contributions that violate that norm… I’m not sure that even makes sense. How is the result of that, even if we were successful, different from any other web forum?
I was trying to remain agnostic with regard to any specific norms. I’m not worried about particular values so much as the possibility of differentially discouraging sincere, well-informed dissent in newcomers relative to various forms of insincere or naive dissent: over time I’d expect that effect to isolate group opinion in ways which aren’t necessarily good for our collective sanity. This seems related to Eliezer’s evaporative cooling idea, except that it’s happening on recruitment—perhaps a semipermeable membrane would be a good analogy.
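To make the selection effect described above concrete, here is a toy simulation, a minimal sketch in Python in which every number is invented purely for illustration: voters score posts on quality alone, never on whether they dissent, yet a newcomer sampling a few dozen posts still sees dissent scoring lower, simply because the visible dissenting posts skew toward trolling.

```python
import random

random.seed(0)

def sample_post():
    # Hypothetical mix: 80% of posts are concordant with community norms
    # and skew high-quality (established, articulate members).
    if random.random() < 0.8:
        return ("concordant", random.gauss(2.0, 1.0))
    # Dissenting posts blend thoughtful skeptics (30%) with trolls and
    # crackpots (70%), so their average quality is lower.
    if random.random() < 0.3:
        return ("dissenting", random.gauss(1.5, 1.0))
    return ("dissenting", random.gauss(-1.0, 1.0))

def score(quality):
    # Unbiased voting: the score tracks quality plus noise and never
    # looks at which camp the post belongs to.
    return quality + random.gauss(0.0, 0.5)

# A newcomer reads a few dozen posts and tallies average scores by camp.
reading = [(camp, score(q)) for camp, q in (sample_post() for _ in range(50))]
for camp in ("concordant", "dissenting"):
    scores = [s for c, s in reading if c == camp]
    if scores:
        print(f"{camp}: n={len(scores)}, avg score {sum(scores) / len(scores):+.2f}")
```

On a typical run the concordant posts average well above the dissenting ones, so the newcomer reasonably infers that dissent is punished, even though no voter in this model ever penalized dissent as such.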
I tried a few times to question the reputation system here myself, and to ask whether there are posts or studies showing that such systems subdue trolling without paying a price in truth and honesty, and that they do not cause unjustified conformity. Sadly, the response is often downvotes mixed with angry replies.
It would be nice if there were more studies about reputation systems. I think the anti-spam capability is pretty obvious, though.
We will be seeing more reputation systems in the future—it is pretty good that this site is trying one out, IMHO.
Is the groupthink here worse than it was on OB or SL4? Not obviously. IMO, the groupthink (which, incidentally, I would agree is a systematic problem) is mostly down to the groupies, not the karma system.
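Regarding the anti-spam capability mentioned above, the mechanism is easy to sketch. Here is a minimal, hypothetical karma gate (the class, names, and thresholds are all invented for illustration): posting privileges depend on accumulated reputation, so throwaway spam accounts never reach the audience, and anything that slips through is hidden once it is downvoted far enough.

```python
class Account:
    def __init__(self, name):
        self.name = name
        self.karma = 0  # new accounts start with no reputation

MIN_KARMA_TO_POST = 2   # must earn some approval before posting freely
HIDE_BELOW_SCORE = -3   # heavily downvoted content gets hidden

def can_post(account):
    return account.karma >= MIN_KARMA_TO_POST

def is_hidden(post_score):
    return post_score <= HIDE_BELOW_SCORE

spammer = Account("buy-pills-now")    # fresh throwaway account
regular = Account("longtime-reader")
regular.karma += 5                    # earned through upvoted comments

for acct in (spammer, regular):
    print(f"{acct.name}: can post = {can_post(acct)}")
print("spam post at score -4 hidden:", is_hidden(-4))
```

The open question raised upthread is whether the same gate that filters spam also filters sincere dissent; that is exactly what the studies people keep asking for would need to settle.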
I would never suggest that LW is junk, but as long as it has even the slightest problem with someone coming here and proclaiming that you are all wrong, something is indeed wrong.
Not really—the internet is full of nonsense—and sometimes it just needs ignoring.
Cool! You posted some material from Ben—but it would be interesting to hear more. Ben made some critical comments recently. Douglas Hofstadter has long been a naysayer of the whole area:
If you read Ray Kurzweil’s books and Hans Moravec’s, what I find is that it’s a very bizarre mixture of ideas that are solid and good with ideas that are crazy. It’s as if you took a lot of very good food and some dog excrement and blended it all up so that you can’t possibly figure out what’s good or bad. It’s an intimate mixture of rubbish and good ideas, and it’s very hard to disentangle the two, because these are smart people; they’re not stupid.
Greg Egan wrote a book recently parodying the SIAI. David Pearce has some very different, but also pretty strange, ideas of his own. So, maybe you are picking on dissenters here.
What is wrong, and what will probably never change, is that mere ideas are sold as ‘laws’ which are taken seriously to a dangerous degree by some individuals here. This place is basically breeding the first group of rationalists committed to doing everything in the name of expected utility.
That is not obviously wrong. That’s just down to the model of a rational agent which is widely accepted around here. If you have objections in this area, I think they need references—or some more spelling out.
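For readers unfamiliar with the model being referenced, here is a minimal sketch of expected-utility choice; the actions, probabilities, and utilities below are invented purely for illustration. The agent scores each action by its probability-weighted utility over outcomes and takes the maximum.

```python
# Each action maps to a list of (probability, utility) outcomes.
# All numbers here are hypothetical.
actions = {
    "donate": [(0.10, 1000.0), (0.90, -10.0)],
    "save":   [(1.00, 5.0)],
    "gamble": [(0.01, 400.0), (0.99, -1.0)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for name, outcomes in actions.items():
    print(f"{name}: EU = {expected_utility(outcomes):+.2f}")

# The rational-agent model says: take the argmax.
print("chosen action:", max(actions, key=lambda a: expected_utility(actions[a])))
```

The worry voiced upthread is about people executing this argmax unreservedly, for instance when a tiny probability is attached to an enormous utility.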
Seriously?
The only way this statement could be true is if the question you asked is so specifically qualified that it loses all meaning.
Also, even asking that question is epistemically dangerous. Ask a specific, meaningful question like “Does the LW community improve the careers of college students in STEM majors?” Pose a query you can hug.
Cults are always wrong, and not just a little wrong but systematically wrong, with whole anti-epistemologies developing to bear the load. LW rationality activism can be eccentric, in the sense of paying attention to things that most folk don’t see as particularly interesting or important, but it’s interesting to consider where a social movement would go whose primary directive is to actually protect itself from error (instead of just keeping up appearances) and to seek better methods for doing so. This is almost unexplored territory, if you don’t count the global scientific community as a preceding example (a single data point, hard to draw generalizations from).
I believe we are significantly below the critical mass at which the community would become unlikely to fizzle out into obscurity, but given the non-arbitrary focus of the activism, it’s possible that the movement will persist, or be reincarnated elsewhere.
Rationalist evangelism from LW might take two forms: boring and familiar, or eccentric and cultish. The boring and familiar form of rationalist evangelism could encompass opposition to religion, psych 101 lectures about cognitive bias, and tips on how optimism and clear thinking can lead to success in mating and moneymaking. An eccentric and cultish form of rationalist evangelism could be achieved by combining cryonics boosterism, Bayes-worship, insistence that the many-worlds interpretation is the only rational interpretation of quantum mechanics, and the supreme importance of finding the one true AI utility function.
I disagree with the connotation that the latter constitute a “dark side” of Less Wrong and that if you take them seriously and try to persuade people of them you’re being cultish.
Nevertheless, it’s very easy to be perceived as such. I try hard not to be cultish, and I think I succeed. But I fail hard at not sounding cultish.
It doesn’t even take cryonics. Even talking about atheism with my (agnostic) mom was enough. Stating “I don’t believe in God” was acceptable, but “God doesn’t exist, and those who believe otherwise are mistaken” was cultish. Such a level of confidence simply isn’t allowed. And when I tried to talk about immortality and uploading, she was convinced I had a problem.
People will see the “dark side” in Less Wrong even though there is none.
I think it’s likely that the second option will be true of someone—that at least one, and maybe several people, who are contributing to this site or just reading it, will, years from now, be making discoveries, in psychology or in some field that doesn’t yet exist, and it will be because this site warped their sensibility (or straightened it).
But it is also likely that there is someone out there who will be affected negatively by this site. Your statement is only slightly relevant to the question of whether LessWrong is overall a positive influence. In other words, it’s rhetorical dark arts.
But it is also likely that there is someone out there who will be affected negatively by this site.
You mean, someone who would have made a positive contribution, but they were intellectually (or otherwise) sidetracked by what they read here? That hadn’t occurred to me. (Which is why my statement wasn’t “rhetorical dark arts”—I genuinely didn’t think of that possibility; I only thought in terms of “LW leads somewhere” or “LW leads nowhere”.)
I apologize; I did not mean to suggest malice on your part, merely danger for readers.
I enjoyed reading this post, in no small part due to the narcissistic pleasure of discussing a small community I am (to some degree) a part of. If there were an option to split a comment into a thread, this seems like an ideal place to use it.
At the very least, Less Wrong provides a fairly high-quality forum for discussion of topics appealing to nerdy types. Similar communities include Stack Exchange and math/science blogs, which are ‘harder’ than Less Wrong; sites like Reddit and the xkcd forums tend to be on the ‘softer’ side of the spectrum. Less Wrong, so far, is the best open forum for the discussion of ‘futurist’ issues.
What Less Wrong lacks in comparison to ‘harder’ sites is a broad base of subject specialists. The scope of discussion on Less Wrong is quite broad; however, the base of expertise is very narrow, consisting mainly of SIAI members. It would be hard to argue that Less Wrong would not benefit from the active participation of more experts from domains relevant to its interests: economics, psychology, computer science, mathematics, statistics. However, up to now, LW has attracted few such users, due perhaps to its low profile and the fact that the core members of the community do not seem to prioritize subject expertise. Yet until LW has that kind of userbase, it seems unlikely any high-impact developments will arise from activity on LW (excluding efforts from SIAI members themselves). In contrast, mathoverflow.net seems like precisely the recipe for combining expertise through the internet for the advancement of the sciences.
Perhaps what is more important is the emergence of the Less Wrong “rationalist” subculture itself. Future historians might lump this subculture in with the larger “atheism” subculture, which has much in common with the LW community in terms of demographic composition. It would be much more interesting if the LW community grew to incorporate a much more demographically diverse userbase.
I would say Less Wrong has had a moderate impact on my intellectual development since I started reading it as a college student. It was satisfying to see that others (such as Yudkowsky) were able to notice “what was wrong with philosophy”; in fact, this allowed me to divert my attention to preparing for statistics. On the whole, I probably would have spent more time arguing about various issues on the internet if Yudkowsky had not already argued those points (and probably much better than I could have). Less Wrong/OB did not alert me to concerns about artificial intelligence (I was already thinking about them before encountering this site), and so far it has not succeeded in dissuading me from intending to do research that may contribute to the eventual development of artificial general intelligence.