Look, Rain, this is an ongoing Internet discussion. Nobody says everything precisely right. The point is that you would hardly be so severe on someone unless you disagreed strongly. You couldn’t be, because nobody would satisfy your accuracy demands. The kind of nitpicking you engage in would ordinarily lead you to be downvoted, and you should be, although I won’t commit the rudeness of doing so in a discussion.
The point wasn’t that you downvote when the only thing wrong with the comment is discussion of karma. It was that you treat discussion of karma as an unconditional wrong. So you exploited weaknesses in my phrasing to ignore what I think was obviously the point—that marking down for the bare mention of karma (even if it doesn’t produce a downvote in each case) is an irrational policy, when karma is at the heart of the discussion. There’s no rational basis for throwing it in as an extra negative when the facts aren’t right.
You’re looking for trivial points to pick to downvote and to ignore the main point, which was that counting any mention of karma as a negative, without regard to the subject, is an irrational policy. If our sides were reversed, your nitpicking and evasion would itself be marked down. As matters stand, you don’t even realize you’re acting in a biased fashion, and readers either don’t know or don’t care.
Is that rational? Shouldn’t a rationalist community be more concerned with criticizing irrationalities in its own process?
Having been the subject of both a relatively large upvote and a relatively large downvote in the last couple of weeks, I still think that the worst thing one can do is to complain about censorship or karma. The posts and comments on any forum aren’t judged on their “objective merits” (because there is no such thing), but on their suitability for the forum in question. If you have been downvoted, your post deserves it by definition. You can politely inquire about the reasons, but people are not required to explain themselves. As for rationality, I question whether it is rational to post on a forum if you are not having fun there. Take it easy.
The posts and comments on any forum aren’t judged on their “objective merits” (because there is no such thing), but on their suitability for the forum in question. If you have been downvoted, your post deserves it by definition.
First, you’re correct that it’s irrational to post to a forum you don’t enjoy. I’ll work on decreasing my akrasia.
But it’s hard not to comment on a non sequitur like the above. (Though probably futile, because one who isn’t really invested in a persuasion effort won’t do it well.) That posts are properly evaluated by their suitability to the forum does not imply that a downvoted post deserves the downvote by definition! That’s a maladaptive view of the sort I’m amazed is so seldom criticized on this forum. Your view precludes (by definition, no less) criticism of the evaluators’ biases, which do not advance the forum’s purpose. You would eschew not only absolute merits but also any objective consideration of the forum’s function.
A forum devoted to rationality, to be effective and honest, must assess and address the irrationalities in its own functioning. (This isn’t always “fun.”) To define a post that should be upvoted as one that is upvoted constitutes an enormous obstacle to rational function.
The point is that you would hardly be so severe on someone unless you disagreed strongly.
I disagree; a downvote is not ‘severe’.
The kind of nitpicking you engage in would ordinarily lead you to be downvoted
I disagree; meta-discussions often result in many upvotes.
It was that you treat discussion of karma as an unconditional wrong.
I do not, and have stated as much.
There’s no rational basis for throwing it in as an extra negative when the facts aren’t right.
If there is no point in downvoting incorrect facts, then I wonder what the downvote button is for.
You’re looking for trivial points to pick to downvote and to ignore the main point,
I disagree; your main point is that you are being unfairly downvoted, along with other posts critical of SI being downvoted unfairly, which I state again is untrue, afactual, incorrect, a false statement, a lie, a slander, etc.
Is that rational? Shouldn’t a rationalist community be more concerned with criticizing irrationalities in its own process?
Questioning the rationality of meta-meta-voting patterns achieves yet another downvote from me. Sorry.
I don’t follow your reasoning, here. Having read this particular thread, it does seem as though you are, in fact, going out of your way to criticize and downvote srdiamond. Yes, he has, in fact, made a few mistakes. Given, however, that the point of this post in general is about dissenting from the mainstream opinions of the LW crowd, and given the usual complaints about lack of dissent, I find your criticism of srdiamond strange, to say the least. I have, accordingly, upvoted a number of his comments.
As expected, my previous comment was downvoted almost immediately.
This would, for reference, be an example of the reason why some people believe LW is a cult that suppresses dissent. After all, it’s significantly easier to say that you disagree with something than it is to explain in detail why you disagree; just as it’s far easier to state agreement than to provide an insightful statement in agreement. Nonetheless, community norms dictate that unsubstantiated disagreements get modded down, while unsubstantiated agreements get modded up. Naturally, there’s more of the easy disagreement than the hard disagreement… that’s natural, since this is the Internet, and anyone can just post things here.
In any event, though, the end result is the same; people claim to want more dissent, but what they really mean is that they want to see more exceptionally clever and well-reasoned dissent. Any dissent that doesn’t seem at least half as clever as the argument it criticizes seems comparatively superfluous and trivial, and is marginalized at best. And, of course, any dissent that is demonstrably flawed in any way is aggressively attacked. That really is what people mean by suppression of dissent. It doesn’t really mean ‘downvoting arguments which are clever, but with which you personally disagree’… community norms here are a little better than that, and genuinely good arguments tend to get their due. In this case, it means ‘downvoting arguments which aren’t very good, and with which you personally disagree, when you would at the same time upvote arguments that also aren’t very good, but with which you agree’. Given the nature of the community norms, someone who expresses dissent regularly, but without making the effort to phrase each point in an insightful and terribly clever way, would tend to be downvoted repeatedly, and thus discouraged from making more dissent in the future… or, indeed, from posting here at all.
I don’t know if there’s a good solution to the problem. I would be inclined to suggest that, like with Reddit, people not downvote without leaving an explanation as to why. For instance, in addition to upvoting some of srdiamond’s earlier comments, I have also downvoted some of Rain’s, because a number of Rain’s comments in this thread fit the pattern of ‘poor arguments that support the community norms’, in the same sense that srdiamond’s fit the pattern of ‘poor arguments that violate the community norms’; my entire point here is that, in order to cultivate more intelligent dissent, there should be more of the latter and less of the former.
So, in other words, you automatically downvote anyone who explicitly mentions that they realize they are violating community norms by posting whatever it is they are posting, but feels that the content of their post is worth the probable downvotes? That IS fairly explicitly suppressing dissent, and I have downvoted you for doing so.
I don’t think it is suppression of dissent per se. It is more annoying behavior: it implies caring a lot about the karma system, and it is often not even the case that people who say they will get downvoted actually are. If it is worth the probable downvote, then they can, you know, just take the downvote. If they want to point out that a view is unpopular, they can just say so explicitly. It is also annoying to people like me, who are vocal about a number of issues that could be controversial here (e.g., criticizing Bayesianism, cryonics, and whether intelligence explosions would be likely) and get voted up. More often than not, when someone claims they are getting downvoted for having unpopular opinions, they are getting downvoted in practice for having bad arguments or for being uncivil.
There are of course exceptions to this rule, and it is disturbing to note that the exceptions seem to be becoming more common (see, for example, this exchange, where two comments were made with about the same quality of argument and about the same degree of incivility (“I’m starting to hate that you’ve become a fixture here.” vs. “idiot”), yet one of the comments is at +10 and the other is at −7). Even presuming that there’s a real difference in the quality or correctness of the arguments made, this suggests that uncivil remarks are tolerated more when people agree with the rest of the claim being made. That’s problematic. And this exchange was part of what prompted me to suggest earlier that we should be concerned that AGI risk might be becoming a mindkiller here. But even given that, issues like this seem not at all common.
Overall, if one feels the need to claim that one is going to be downvoted, one might even be correct, but it will often not be for the reasons one thinks.
More often than not, when someone claims they are getting downvoted for having unpopular opinions, they are getting downvoted in practice for having bad arguments or for being uncivil.
I don’t think it’s so much ‘caring a lot about the karma system’ per se, so much as the more general case of ‘caring about the approval and/or disapproval of one’s peers’. The former is fairly abstract, but the latter is a fairly deep ancestral motivation.
Like I said before, it’s clearly not much in the way of suppression. That said, given that, barring rare incidents of actual moderation, it is the only ‘suppression’ that occurs here, and since there is a view in various circles that there is, in fact, suppression of dissent, and since people on the site frequently wonder why there are not more dissenting viewpoints here, and look for ways to find more… it is important to look at the issue in great depth, since it’s clearly an issue which is more significant than it seems on the surface.
[P]eople on the site frequently wonder why there are not more dissenting viewpoints here, and look for ways to find more… it is important to look at the issue in great depth, since it’s clearly an issue which is more significant than it seems on the surface.
Exactly right. But a group that claims to be dedicated to rationality loses all credibility when participants not only abstain from considering this question but adamantly resist it. The only upvote you received for your post—which makes this vital point—is mine.
This thread examines HoldenKarnofsky’s charge that SIAI isn’t exemplarily rational. As part of that examination, the broader LW environment on which it relies is germane. That much has been granted by most posters. But when the conversation reaches the touchstone of how the community expresses its approval and disapproval, the comments are declared illegitimate and downvoted (or if the comments are polite and hyper-correct, at least not upvoted).
The group harbors taboos. The following subjects are subject to them: the very possibility of nonevolved AI; karma and the group’s own process generally (an indispensable discussion); and politics. (I’ve already posted a cite showing how the proscription on politics works, using as an example the editors’ unwillingness to promote the post despite its receiving almost 800 comments.)
These defects in the rational process of LW help sustain Karnofsky’s argument that SIAI is not to be recommended on the basis of the exemplary rationality of its staff and leadership. They are also the leadership of LW, and they have failed by refusing to lead the forum toward understanding the biases in its own process. They have fostered bias by creating the taboo on politics, as though you could rationally understand the world while dogmatically refusing even to consider a big part of it, on the grounds that it “kills” your mind.
P.S. Thank you for the upvotes where you perceived bias.
...AGI risk might be becoming a mindkiller here...
Nah. If there is a mindkiller then it is the reputation system. Some of the hostility is the result of the overblown ego and attitude of some of its proponents and their general style of discussion. They created an insurmountable fortress that shields them from any criticism:
Troll: If you are so smart and rational, why don’t you fund yourself? Why isn’t your organisation sustainable?
SI/LW: Rationality is only aimed at expected winning.
Troll: But you don’t seem to be winning yet. Have you considered the possibility that your methods are suboptimal? Have you set yourself any goals that you expect to be better at achieving than less rational folks, so as to test your rationality?
SI/LW: Rationality is a ceteris paribus predictor of success.
Troll: Okay, but given that you spend a lot of time on refining your rationality, you must believe that it is worth it somehow? What makes you think so then?
SI/LW: We are trying to create a friendly artificial intelligence, implement it, and run the AI, at which point, if all goes well, we Win. We believe that rationality is very important to achieving that goal.
Troll: I see. But there surely must be some sub-goals that you anticipate being able to achieve, and thereby test whether your rationality skills are worth the effort?
SI/LW: Many of the problems related to navigating the Singularity have not yet been stated with mathematical precision, and the need for a precise statement of the problem is part of the problem.
Troll: Has there been any success in formalizing one of the problems that you need to solve?
SI/LW: There are some unpublished results that we have had no time to put into a coherent form yet.
Troll: It seems that there is no way for me to judge if it is worth it to read up on your writings on rationality.
SI/LW: If you want to more reliably achieve life success, I recommend inheriting a billion dollars or, failing that, being born+raised to have an excellent work ethic and low akrasia.
Troll: Awesome, I’ll do that next time. But for now, why would I bet on you or even trust that you know what you are talking about?
SI/LW: We spent a lot of time on debiasing techniques and thought long and hard about the relevant issues.
Troll: That seems to be insufficient evidence given the nature of your claims and that you are asking for money.
SI/LW: We make predictions. We make statements of confidence about events that merely sound startling. You are asking for evidence we couldn’t possibly be expected to provide, even given that we are right.
Troll: But what do you anticipate to see if your ideas are right, is there any possibility to update on evidence?
SI/LW: No, once the evidence is available it will be too late.
Troll: But then why would I trust you instead of those experts who tell me that you are wrong?
SI/LW: You will soon learn that your smart friends and experts are not remotely close to the rationality standards of SI/LW, and you will no longer think it anywhere near as plausible that their differing opinion is because they know some incredible secret knowledge you don’t.
Troll: But you have never achieved anything when it comes to AI, why would I trust your reasoning on the topic?
SI/LW: That is magical thinking about prestige. Prestige is not a good indicator of quality.
Troll: You won’t convince me without providing further evidence.
SI/LW: That is a fully general counterargument you can use to discount any conclusion.
First, none of this dissent has been suppressed in any real sense. It’s still available to be read and discussed by those who desire reading and discussing such things. The current moderation policy has only kicked in when things have gotten largely out of hand, which is not the case here, yet.
Second, net karma isn’t a fine enough tool to express the amount of detail you want it to express. The net karma on your previous comment is currently −2; congrats, you’ve managed to irritate less than a tenth of one percent of LW (presuming the real tally is something like −2/+0 or −3/+1)!
Third, the solution you propose hasn’t been implemented anywhere that I know of. Reddit’s suggested community norms (which do not apply to every subreddit) recommend posting constructive criticism only when one thinks it will actually help the poster improve. That’s not really the case much of the time, at least on the subreddits I frequent, and it’s certainly not the case often here.
Fourth, the solution you propose would, if implemented, decrease the signal-to-noise ratio of LW further.
Fifth, Reddit’s suggested community norms also say “[Don’t c]omplain about downvotes on your posts”. Therefore, I wonder how much you really think Reddit is doing the community voting norm thing correctly.
First; downvoted comments are available to be read, yes; but the default settings hide comments with 2 or more net downvotes. This is enough to be reasonably considered ‘suppression’. It’s not all that much suppression, true, but it is suppression… and it is enough to discourage dissent. Actual moderation of comments is a separate issue entirely, and not one which I will address here.
Second; when I posted my reply, and as of this moment, my original comment was at −3. I agree; net karma isn’t actually a huge deal, except that it is, as has been observed, the most prevalent means by which dissent is suppressed. In my case, at least, ‘this will probably get downvoted’ feels like a reason to not post something. Not much of a reason, true, but enough of one that I can identify the feeling of reluctance.
Third; on the subreddits I follow (admittedly a shallow sampling), I have frequently seen comments explaining downvotes, sometimes in response to a request specifically for such feedback, but just as often not. I suspect that this has a lot to do with the “Down-voting? Please leave an explanation in the comments.” message that appears when mousing over the downvote icon. I am aware that this is not universal across Reddit, but on the subreddits I follow, it seems to work reasonably well.
Fourth; I agree that this is a possible result. Like I said before, I’m not sure if there is a good solution to this problem, but I do feel that it’d result in a better state than the one that currently exists if people would more explicitly explain why they downvote when they choose to do so. That said, given that downvoted comments are hidden from default view anyway, and that those who choose to can easily ignore such comments, I don’t think it’d have all that much effect on the signal/noise ratio.
Fifth; on the subreddits I follow, it seems as though there is less in the way of complaints about downvotes, and more honest inquiries as to why a comment has been downvoted; such questions seem to usually receive honest responses. This may be anomalous within Reddit as a whole; as I said before, my own experience with Reddit is a shallow sampling.
I don’t know if there’s a good solution to the problem. I would be inclined to suggest that, like with Reddit, people not downvote without leaving an explanation as to why. For instance, in addition to upvoting some of srdiamond’s earlier comments, I have also downvoted some of Rain’s, because a number of Rain’s comments in this thread fit the pattern of ‘poor arguments that support the community norms’, in the same sense that srdiamond’s fit the pattern of ‘poor arguments that violate the community norms’; my entire point here is that, in order to cultivate more intelligent dissent, there should be more of the latter and less of the former.
Perhaps the solution is not to worry so much about my bad contrarian arguments being downvoted as to ensure that bad “establishment” arguments are downvoted, which, in Rain’s case, they aren’t. Regurgitation of arguments others have repeatedly stated should also be downvoted, no matter how good the arguments.
The reason to prefer more criticism of Rain over less criticism of me is that after I err, it’s difficult to establish that my error wasn’t serious enough to warrant a downvote. But when Rain negligently or intentionally misses the entire point, there’s less question that he isn’t benefiting the discussion. It’s easier to convict of fallacy than to defend on the ground that the fallacy is relatively trivial. There’s a problem in that the two determinations are somewhat interrelated, but that doesn’t eliminate the contrast.
Increasing the number of downvotes would deflate the significance of any single downvote and would probably foster more dissent. This balance may be subject to easy institutional control. Posters are allotted downvotes based on their karma, while the karma requirements for upvotes are easily satisfied, if they exist. This amounts to encouraging upvotes relative to downvotes, with the result that many bad posts are voted up and some decent posts suffer the disproportionate wrath of extreme partisans. (Note that Rain, a donor, is a partisan of SIAI.)
The editors should experiment with increasing the downvote allowance. I favor equal availability of downvotes and upvotes as optimal (but this should be thought through more carefully).
Look, Rain, this is an Internet ongoing discussion. Nobody says everything precisely right. The point is that you would hardly be so severe on someone unless you disagreed strongly. You couldn’t be, because nobody would satisfy your accuracy demands. The kind of nitpicking you engage in your post would ordinarily lead you to be downvoted—and you should be, although I won’t commit the rudeness of so doing in a discussion.
The point wasn’t that you downvote when the only thing wrong with the comment is discussion of karma. It was that you treat discussion of karma as an unconditional wrong. So you exploited weaknesses in my phrasing to ignore what I think was obviously the point—that marking down for the bare mention of karma (even if it doesn’t produce a downvote in each case) is an irrational policy, when karma is at the heart of the discussion. There’s no rational basis for throwing it in as an extra negative when the facts aren’t right.
You’re looking for trivial points to pick to downvote and to ignore the main point, which was your counting mention of karma a negative, without regard to the subject, is an irrational policy. If we were on reversed sides, your nitpicking and evasion would itself be marked down. As matters stand, you don’t even realize you’re acting in a biased fashion, and readers either don’t know or don’t care.
Is that rational? Shouldn’t a rationalist community be more concerned with criticizing irrationalities in its own process?
Having been a subject of both a relatively large upvote and a relatively large downvote in the last couple of weeks, I still think that the worst thing one can do is to complain about censorship or karma. The posts and comments on any forum aren’t judged on their “objective merits” (because there is no such thing), but on its suitability for the forum in question. If you have been downvoted, your post deserves it by definition. You can politely inquire about the reasons, but people are not required to explain themselves. As for rationality, I question whether it is rational to post on a forum if you are not having fun there. Take it easy.
First, you’re correct that it’s irrational to post to a forum you don’t enjoy. I’ll work on decreasing my akrasia.
But it’s hard not to comment on a non sequitur like the above. (Although probably futile because one who’s really not into a persuasion effort won’t do it well.) That posts are properly evaluated by suitability to the forum does not imply that a downvoted post deserves the downvote by definition! That’s a maladaptive view of the sort I’m amazed is so seldom criticized on this forum. Your view precludes (by definition yet) criticism of the evaluators’ biases, which do not advance the forum’s purpose. You would eschew not only absolute merits but also any objective consideration of the forum’s function.
A forum devoted to rationality, to be effective and honest, must assess and address the irrationalities in its own functioning. (This isn’t always “fun.”) To define a post that should be upvoted as one that is upvoted constitutes an enormous obstacle to rational function.
I disagree; a downvote is not ‘severe’.
I disagree; meta-discussions often result in many upvotes.
I do not, and have stated as much.
If there is no point in downvoting incorrect facts, then I wonder what the downvote button is for.
I disagree; your main point is that you are being unfairly downvoted, along with other posts critical of SI being downvoted unfairly, which I state again is untrue, afactual, incorrect, a false statement, a lie, a slander, etc.
Questioning the rationality of meta-meta-voting patterns achieves yet another downvote from me. Sorry.
I don’t follow your reasoning, here. Having read this particular thread, it does seem as though you are, in fact, going out of your way to criticize and downvote srdiamond. Yes, he has, in fact, made a few mistakes. Given, however, that the point of this post in general is about dissenting from the mainstream opinions of the LW crowd, and given the usual complaints about lack of dissent, I find your criticism of srdiamond strange, to say the least. I have, accordingly, upvoted a number of his comments.
As expected, my previous comment was downvoted almost immediately.
This would, for reference, be an example of the reason why some people believe LW is a cult that suppresses dissent. After all, it’s significantly easier to say that you disagree with something than it is to explain in detail why you disagree; just as it’s far easier to state agreement than to provide an insightful statement in agreement. Nonetheless, community norms dictate that unsubstantiated disagreements get modded down, while unsubstantiated agreements get modded up. Naturally, there’s more of the easy disagreement then the hard disagreement… that’s natural, since this is the Internet, and anyone can just post things here.
In any event, though, the end result is the same; people claim to want more dissent, but what they really mean is that they want to see more exceptionally clever and well-reasoned dissent. Any dissent that doesn’t seem at least half as clever as the argument it criticizes seems comparatively superfluous and trivial, and is marginalized at best. And, of course, any dissent that is demonstrably flawed in any way is aggressively attacked. That really is what people mean by suppression of dissent. It doesn’t really mean ‘downvoting arguments which are clever, but with which you personally disagree’… community norms here are a little better then that, and genuinely good arguments tend to get their due. In this case, it means, ‘downvoting arguments which aren’t very good, and with which you personally disagree, when you would at the same time upvote arguments that also aren’t very good, but with which you agree’. Given the nature of the community norms, someone who expresses dissent regularly, but without taking the effort to make each point in an insightful and terribly clever way, would tend to be downvoted repeatedly, and thus discouraged from making more dissent in the future… or, indeed, from posting here at all.
I don’t know if there’s a good solution to the problem. I would be inclined to suggest that, like with Reddit, people not downvote without leaving an explanation as to why. For instance, in addition to upvoting some of srdiamond’s earlier comments, I have also downvoted some of Rain’s, because a number of Rain’s comments in this thread fit the pattern of ‘poor arguments that support the community norms’, in the same sense that srdiamond’s fit the pattern of ‘poor arguments that violate the community norms’; my entire point here is that, in order to cultivate more intelligent dissent, there should be more of the latter and less of the former.
I downvote any post that says “I expect I’ll get downvoted for this, but...” or “the fact that I was downvoted proves I’m right!”
I’m fond of downvoting “I dare you to downvote this!”
So, in other words, you automatically downvote anyone who explicitly mentions that they realize they are violating community norms by posting whatever it is they are posting, but feels that the content of their post is worth the probable downvotes? That IS fairly explicitly suppressing dissent, and I have downvoted you for doing so.
I don’t think it is suppression of dissent per se. It is more annoying behavior- it implies caring a lot about the karma system, and it is often not even the case when people say that they will actually get downvoted. If it is worth the probable downvote, then they can, you know, just take the downvote. If they want to point out that a view is unpopular they can just say that explicitly. It is also annoying to people like me, who are vocal about a number of issues that could be controversial here (e.g. criticizing Bayesianism, cryonics,, and whether intelligence explosions would be likely) and get voted up. More often than not, when someone claims they are getting downvoted for having unpopular opinions, they are getting downvoted in practice for having bad arguments or for being uncivil.
There are of course exceptions to this rule, and it is disturbing to note that the exceptions seem to be coming more common (see for example, this exchange where two comments are made with about the same quality of argument and about the same degree of uncivility- (“I’m starting to hate that you’ve become a fixture here.” v. “idiot”—but one of the comments is at +10 and the other is at −7.) Even presuming that there’s a real disagreement in quality or correctness of the arguments made, this suggests that uncivil remarks are tolerated more when people agree with the rest of the claim being made. That’s problematic. And this exchange was part of what prompted me to earlier suggest that we should be concerned if AGI risk might be becoming a mindkiller here. But even given that, issues like this seem not at all common.
Overall, if one needs to make a claim about one is going to be downvoted, one might even be correct, but it will often not be for the reasons one thinks it is.
Bears repeating.
I don’t think it’s so much ‘caring a lot about the karma system’ per se, so much as the more general case of ‘caring about the approval and/or disapproval of one’s peers’. The former is fairly abstract, but the latter is a fairly deep ancestral motivation.
Like I said before, it’s clearly not much in the way of suppression. That said, given that, barring rare incidents of actual moderation, it is the only ‘suppression’ that occurs here, and since there is a view among various circles that there there is, in fact, suppression of dissent, and since people on the site frequently wonder why there are not more dissenting viewpoints here, and look for ways to find more… it is important to look at the issue in great depth, since it’s clearly an issue which is more significant than it seems on the surface.
Exactly right. But a group that claims to be dedicated to rationality loses all credibility when participants not only abstain from considering this question but adamantly resist it. The only upvote you received for your post—which makes this vital point—is mine.
This thread examines HoldenKarnofsky’s charge that SIAI isn’t exemplarily rational. As part of that examination, the broader LW environment on which it relies is germane. That much has been granted by most posters. But when the conversation reaches the touchstone of how the community expresses its approval and disapproval, the comments are declared illegitimate and downvoted (or if the comments are polite and hyper-correct, at least not upvoted).
The group harbors taboos. The following subjects are subject to them: the very possibility of nonevolved AI; karma and the group’s own process generally (an indispensable discussion); and politics. (I’ve already posted a cite showing how the proscription on politics works, using as an example the editors’ unwillingness to promote a post despite its receiving almost 800 comments.)
These defects in the rational process of LW help sustain Karnofsky’s argument that SIAI is not to be recommended based on the exemplary rationality of its staff and leadership. They are also the leadership of LW, and they have failed by refusing to lead the forum toward understanding the biases in its own process. They have fostered bias by creating the taboo on politics, as though you can rationally understand the world while dogmatically refusing even to consider a big part of it—because it “kills” your mind.
P.S. Thank you for the upvotes where you perceived bias.
Nah. If there is a mindkiller then it is the reputation system. Some of the hostility is the result of the overblown ego and attitude of some of its proponents and their general style of discussion. They created an insurmountable fortress that shields them from any criticism:
Troll: If you are so smart and rational, why don’t you fund yourself? Why isn’t your organisation sustainable?
SI/LW: Rationality is only aimed at expected winning.
Troll: But you don’t seem to be winning yet. Have you considered the possibility that your methods are suboptimal? Have you set yourself any goals, that you expect to be better at than less rational folks, to test your rationality?
SI/LW: Rationality is a ceteris paribus predictor of success.
Troll: Okay, but given that you spend a lot of time on refining your rationality, you must believe that it is worth it somehow? What makes you think so then?
SI/LW: We are trying to create a friendly artificial intelligence, implement it, and run the AI, at which point, if all goes well, we Win. We believe that rationality is very important to achieving that goal.
Troll: I see. But there surely must be some sub-goals that you anticipate being able to solve, and thereby test whether your rationality skills are worth the effort?
SI/LW: Many of the problems related to navigating the Singularity have not yet been stated with mathematical precision, and the need for a precise statement of the problem is part of the problem.
Troll: Has there been any success in formalizing one of the problems that you need to solve?
SI/LW: There are some unpublished results that we have had no time to put into a coherent form yet.
Troll: It seems that there is no way for me to judge if it is worth it to read up on your writings on rationality.
SI/LW: If you want to more reliably achieve life success, I recommend inheriting a billion dollars or, failing that, being born+raised to have an excellent work ethic and low akrasia.
Troll: Awesome, I’ll do that next time. But for now, why would I bet on you or even trust that you know what you are talking about?
SI/LW: We spent a lot of time on debiasing techniques and thought long and hard about the relevant issues.
Troll: That seems to be insufficient evidence given the nature of your claims and that you are asking for money.
SI/LW: We make predictions. We make statements of confidence of events that merely sound startling. You are asking for evidence we couldn’t possibly be expected to be able to provide, even given that we are right.
Troll: But what do you anticipate to see if your ideas are right, is there any possibility to update on evidence?
SI/LW: No, once the evidence is available it will be too late.
Troll: But then why would I trust you instead of those experts who tell me that you are wrong?
SI/LW: You will soon learn that your smart friends and experts are not remotely close to the rationality standards of SI/LW, and you will no longer think it anywhere near as plausible that their differing opinion is because they know some incredible secret knowledge you don’t.
Troll: But you have never achieved anything when it comes to AI, why would I trust your reasoning on the topic?
SI/LW: That is magical thinking about prestige. Prestige is not a good indicator of quality.
Troll: You won’t convince me without providing further evidence.
SI/LW: That is a fully general counterargument you can use to discount any conclusion.
The last exchange was hilarious. This is parody, right?
Downvoted for downvoting downvoting of downvoting of downvoting.
If you do the same to this comment, we can enter a stable loop!
First, none of this dissent has been suppressed in any real sense. It’s still available to be read and discussed by those who desire reading and discussing such things. The current moderation policy has only kicked in when things have gotten largely out of hand—which is not the case here, yet.
Second, net karma isn’t a fine enough tool to express the amount of detail you want it to express. The net karma on your previous comment is currently −2; congrats, you’ve managed to irritate less than a tenth of one percent of LW (presuming the real vote split is something like −2/+0 or −3/+1)!
Third, the solution you propose hasn’t been implemented anywhere that I know of. Reddit’s suggested community norm (which does not apply to every subreddit) suggests considering posting constructive criticism only when one thinks it will actually help the poster improve. That’s not really the case much of the time, at least on the subreddits I frequent, and it’s certainly not the case often here.
Fourth, the solution you propose would, if implemented, decrease the signal-to-noise ratio of LW further.
Fifth, reddit’s suggested community norm also says “[Don’t c]omplain about downvotes on your posts”. Therefore, I wonder how much you really think reddit is doing the community voting norm thing correctly.
First; downvoted comments are available to be read, yes; but the default settings hide comments with 2 or more net downvotes. This is enough to be reasonably considered ‘suppression’. It’s not all that much suppression, true, but it is suppression… and it is enough to discourage dissent. Actual moderation of comments is a separate issue entirely, and not one which I will address here.
Second; when I posted my reply, and as of this moment, my original comment was at −3. I agree; net karma isn’t actually a huge deal, except that it is, as has been observed, the most prevalent means by which dissent is suppressed. In my case, at least, ‘this will probably get downvoted’ feels like a reason to not post something. Not much of a reason, true, but enough of one that I can identify the feeling of reluctance.
Third; on the subreddits I follow (admittedly a shallow sampling), I have frequently seen comments explaining downvotes, sometimes in response to a request specifically for such feedback, but just as often not. I suspect that this has a lot to do with the “Down-voting? Please leave an explanation in the comments.” message that appears when mousing over the downvote icon. I am aware that this is not universal across Reddit, but on the subreddits I follow, it seems to work reasonably well.
Fourth; I agree that this is a possible result. Like I said before, I’m not sure if there is a good solution to this problem, but I do feel that it’d result in a better state than the one that currently exists if people would more explicitly explain why they downvote when they choose to do so. That said, given that downvoted comments are hidden from default view anyway, and that those who choose to can easily ignore such comments, I don’t think it’d have all that much effect on the signal/noise ratio.
Fifth; on the subreddits I follow, it seems as though there is less in the way of complaints about downvotes, and more honest inquiries as to why a comment has been downvoted; such questions seem to usually receive honest responses. This may be anomalous within Reddit as a whole; as I said before, my own experience with Reddit is a shallow sampling.
Perhaps the solution is not to worry so much about my bad contrarian arguments being downvoted as to ensure that bad “establishment” arguments are downvoted—as Rain’s case shows, they aren’t. Regurgitation of arguments others have repeatedly stated should also be downvoted, no matter how good the arguments.
The reason to prefer more criticism of Rain over less criticism of me is that after I err, it’s a difficult argument to establish that my error wasn’t serious enough to deserve a downvote. But when Rain negligently or intentionally misses the entire point, there’s less question that he isn’t benefiting the discussion. It’s easier to convict of fallacy than to defend on the grounds that the fallacy is relatively trivial. There’s a problem in that the two determinations are somewhat interrelated, but that doesn’t eliminate the contrast.
Increasing the number of downvotes would deflate the significance of any single downvote and would probably foster more dissent. This balance may be subject to easy institutional control. Posters are allotted downvotes based on their karma, while the karma requirements for upvotes are easily satisfied, if they exist. This amounts to encouraging upvotes relative to downvotes, with the result that many bad posts are voted up and some decent posts suffer the disproportionate wrath of extreme partisans. (Note that Rain, a donor, is a partisan of SIAI.)
The editors should experiment with increasing the downvote allowance. I favor equal availability of downvotes and upvotes as optimal (but this should be thought through more carefully).