LW is nearly perfect but does lack self-criticism. I love self-criticism, and I find too much agreement boring. One of the reasons there is so much agreement here is not that there is nothing wrong, but that people who strongly disagree either don’t bother or are deterred by the reputation system. How do I know that? The more I read, the more I learn that a lot of the basic principles here are not as well-grounded as the commitment of the community would suggest.

Recently I wrote to various experts in an effort to approach some kind of ‘peer review’ of LW. I got replies from people as diverse as Douglas Hofstadter, Greg Egan, Ben Goertzel, David Pearce, and various economists, experts, and influencers. The overall opinion so far is not much in favor of this community. Regarding the reputation system: people told me that it is one of the reasons why they don’t bother to voice their opinion and just lurk, but you could just read the infamous RationalWiki entry to get an idea of the general perception (although it has improved since my comment here, which they pasted into the talk page). I tried a few times to question the reputation system here myself or ask if there are some posts or studies showing that such systems do subdue trolling but not at the price of truth and honesty, that reputation systems do not cause unjustified conformity. Sadly the response is often downvotes mixed with angry replies.

Another problem is the obvious arrogance here, which is becoming more pronounced all the time. There is an LW-versus-the-rest-of-the-world attitude: there is LW, and then there are the irrational, ordinary people. That’s just sad, and I’m personally appalled by it.
Here is how some people described LW when I asked them about it:
...a lot of impressive-sounding jargon and slogans, and not everything they say is false and foolish, but in my view they’ve just sprinkled enough mathematics and logic over their fantasies to give them a veneer of respectability.
or
...they are naïve as far as the nature of human intelligence goes. I think they are mostly very bright and starry-eyed adults who never quite grew out of their science-fiction addiction as adolescents. None of them seems to have a realistic picture about the nature of thinking...
Even though I am basically the only person here who is often openly derogatory about this community, people seem to perceive it as too much already. I am apparently just talking about the same old problems over and over. Yet I’ve only been posting here since August 2010, and the problems have not been fixed. There are problems like the increasing and unjustified arrogance, the lack of criticism (let alone peer review), and a general public-relations problem (Scientology also gets donations ;-). But those problems are not even the worst of it. What is wrong and what will probably never change is that mere ideas are sold as ‘laws’ which are taken seriously to a dangerous degree by some individuals here. This place is basically breeding the first group of rationalists committed to doing everything in the name of expected utility. I think that is not only incredibly scary but also causes distress in people who are susceptible to such thinking.
… this might be a formative early experience for someone who went on to make genuine contributions.
LW is certainly of great value and importance, and I loved reading a lot of what has been written so far. I would never suggest that LW is junk, but as long as it has the slightest problem with someone coming here and proclaiming that you are all wrong, then something is indeed wrong.
Is there as much of a problem with the karma system as you make it out to be? I’ve posted comments critical of cryonics, comments critical of the idea that a hard takeoff is likely, comments critical of Eliezer’s writing style, and comments critical of the general LW understanding of the history of science. Almost every such comment has been voted up (and I can point to individual comments in all of those categories that have been voted up).
I suspect that the quality threshold for critical comments to be voted up is higher than that for non-critical comments, and that, similarly, low-quality critical comments are more likely to be voted down than low-quality non-critical ones. But that’s a common problem, and in any event, high-quality comments aren’t often voted down. So I fail to see how anyone would be substantially discouraged from posting critical comments unless they just weren’t very familiar with the system here.
Yeah, this is my experience. I’ve posted lots of comments and even whole posts critical of Eliezer on this point or that point and have been upvoted heavily because I made my point and defended it well.
So I’m not sure the karma system makes it impossible to voice contrarian opinions. What it seems to enforce is the expectation that you defend what you say competently.
Case in point: Mitchell’s heavily upvoted comment to which we are now responding.
It seems to me that the karma system needn’t foster any actual intolerance for dissent among voters for it to have a chilling effect on dissenting newcomers. If a skeptical newcomer encounters the site, reads a few dozen posts, and notices that posts concordant with community norms tend to get upvoted, while dissonant ones tend to get downvoted, then from that observer’s perspective the evidence indicates that voicing their skepticism would be taken poorly—even if in actuality the voting effects are caused by high-visibility concordant posts belonging to bright and well-spoken community members and high-visibility dissonant posts belonging to trolls or random crackpots (who in turn have incentives to ignore those same chilling effects).
Without getting rid of the karma system entirely, one possible defense against this sort of effect might be to encourage a community norm of devil’s advocacy. I see some possible coordination problems with that, though.
If the community norms are ones we don’t endorse, then sure, let’s overthrow those norms and replace them with norms we do endorse, in a targeted way. Which norms are we talking about, and what ought we replace them with?
Conversely, if we’re talking about all norms… that is, if we’re suggesting either that we endorse no norms at all, or that we somehow endorse a norm while at the same time avoiding discouraging contributions that violate that norm… I’m not sure that even makes sense. How is the result of that, even if we were successful, different from any other web forum?
I was trying to remain agnostic with regard to any specific norms. I’m not worried about particular values so much as the possibility of differentially discouraging sincere, well-informed dissent in newcomers relative to various forms of insincere or naive dissent: over time I’d expect that effect to isolate group opinion in ways which aren’t necessarily good for our collective sanity. This seems related to Eliezer’s evaporative cooling idea, except that it’s happening on recruitment—perhaps a semipermeable membrane would be a good analogy.
I tried a few times to question the reputation system here myself or ask if there are some posts or studies showing that such systems do subdue trolling but not at the price of truth and honesty, that reputation systems do not cause unjustified conformity. Sadly the response is often downvotes mixed with angry replies.
It would be nice if there were more studies about reputation systems. I think the anti-spam capability is pretty obvious, though.
We will be seeing more reputation systems in the future—it is pretty good that this site is trying one out, IMHO.
Is the groupthink here worse than it was on OB or SL4? Not obviously. IMO, the groupthink (which, incidentally, I would agree is a systematic problem) is mostly down to the groupies, not the karma system.
I would never suggest that LW is junk, but as long as it has the slightest problem with someone coming here and proclaiming that you are all wrong, then something is indeed wrong.
Not really—the internet is full of nonsense—and sometimes it just needs ignoring.
Cool! You posted some material from Ben—but it would be interesting to hear more.
Ben made some critical comments recently. Douglas Hofstadter has long been a naysayer of the whole area:
If you read Ray Kurzweil’s books and Hans Moravec’s, what I find is that it’s a very bizarre mixture of ideas that are solid and good with ideas that are crazy. It’s as if you took a lot of very good food and some dog excrement and blended it all up so that you can’t possibly figure out what’s good or bad. It’s an intimate mixture of rubbish and good ideas, and it’s very hard to disentangle the two, because these are smart people; they’re not stupid.
Greg Egan wrote a book recently parodying the SIAI. David Pearce has some very different, but also pretty strange, ideas of his own. So, maybe you are just picking dissenters here.
What is wrong and what will probably never change is that mere ideas are sold as ‘laws’ which are taken seriously to a dangerous degree by some individuals here. This place is basically breeding the first group of rationalists committed to doing everything in the name of expected utility.
That is not obviously wrong. That’s just down to the model of a rational agent which is widely accepted around here. If you have objections in this area, I think they need references—or some more spelling out.
Seriously?
The only way this statement could be true is if the question you asked is so specifically qualified that it loses all meaning.
Also, even asking that question is epistemically dangerous. Ask a specific, meaningful question like “Does the LW community improve the careers of college students in STEM majors?” Pose a query you can hug.